Manuel Kaufmann (Humitos): Django Girls: technology + python + women

   Published:

During the programming workshop for girls/women held in Cochabamba, Bolivia, we saw beautiful teamwork from different communities and friendly groups in the area (see photos).

It was the first time we organized this kind of event from the Argentina en Python project, and I, personally, was very nervous. I had studied and read a lot about how to run this event. On the Django Girls side, I can say they have a great deal of well-written, easy-to-read documentation that helps enormously. They also have a support team that, although I didn't use it, I can see how it operates, and it deserves applause.

DSC_9348_02.thumbnail.jpg

In full swing during the morning

So, since we noticed that Cochabamba had a very nice movement around freedom of expression, beautiful work bringing technology closer to people, and very strong promotion of women's activism, we believed it was more than the right place to join efforts and produce something nice.

That is how, on Saturday, August 22, the event took place, with the participation of Barrio Hacker, which provided its WiFi network; HackLab Cochabamba, which made its laboratory available to the attendees; the mARTadero project, which did a tremendous outreach job and also managed the venue; the Adelante Bolivia foundation, which gave us lodging for more than 2 weeks; and the Oficina Jurídica para la Mujer, which donated lunch for all the attendees. Incredible work by everyone!

As if that were not enough, Daniel Cotillas Ruiz, a Cochabamban Spaniard as I described him, was up to his usual tricks with his video camera and did a beautiful job collecting the experiences of several of the women who came to mARTadero that day and took part in the workshop. Here is a little gem that is worth watching from beginning to end, paying close attention to what the girls say.

It is also worth mentioning that the event was at full attendance, to the point that 10 more people showed up than the venue's maximum capacity and we did what we could to fit them in. With a total of 25 people throughout the day, I will take the liberty of saying that this Django Girls workshop was a success written in the feminine.

DSC08757-1.thumbnail.jpg

The whole group

Perhaps to some, 25 people doesn't sound like much, but I can tell you that we brought programming to 25 different fields, not just computing. I think only one woman worked in IT; the rest were graphic designers, electrical engineers, sociologists, lawyers, art students, social communicators, teachers, dentists, a chemist (besides @EllaQuimica ;)) and a number of other occupations I no longer remember. So we have planted a seed in environments different from the ones we are used to working in. We didn't just show Python; we showed programming as a powerful tool for expression and communication, and also for solving complex problems in a simple way.

Marcos Dione: using-snapshot-debian-org-for-downgrading-debian-packages

   Published:

Nice tricks I found out trying to unfuck my laptop's setup, all my fault:

  • You can use snapshot.debian.org to recover packages for any date for any release that was available at that date. I actually knew this, but somehow I forgot. I used deb http://snapshot.debian.org/archive/debian/20150720T214439Z/ testing main.

  • For that you have to disable the Packages-file-too-old check, which I had never seen before, ever. Put this in any file in your /etc/apt/apt.conf.d dir:

Acquire {
    Check-Valid-Until "false";
}
  • aptitude has a menu bar (activate with C-t), a preferences dialog, and you can set it up so any operation on a package moves the cursor down. I finally figured that out.

  • It also has a dselect theme, but I was not brave enough to try it (for the record, I love dselect; I miss the fact that it shows how dependencies are resolved the moment they're needed).

  • You can disable aptitude's resolver (-o Aptitude::ProblemResolver::StepLimit=0), but it doesn't make the UI that much more responsive (???).

  • digikam is not in testing right now. It FTBFS (fails to build from source) with gcc5 and has a licence problem.

  • Don't ride Debian sid right now, it's suffering a gcc transition and it might take a while.


debian

Marcos Dione: ayrton-0.5

   Published:

I forgot to mention: last night I finally got to release ayrton-0.5. This has a major update to the language, thanks to our new parser, craftily thieved out of pypy. Other similar changes might come soon. Meanwhile, here's the ChangeLog:

  • Much better command detection.
  • CommandNotFound exception is now a subclass of NameError.
  • Allow Command keywords to be named like -l and --long-option, so it supports options with single dashes (-long-option, à la find); see the sketch after this list.
  • This also means that long-option is no longer passed as --long-option; you have to put the dashes explicitly.
  • bash() does not return a single string by default; override with single=True.
  • Way more tests.
  • Updated docs.
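
To make the option-related changes concrete, here is a hypothetical sketch of what the new syntax might look like (made up for illustration; the exact behaviour may differ from the release):

# Hypothetical ayrton 0.5 snippet, made up for illustration.
ls(-l=True)                             # single-dash option, passed to ls as "-l"
find('.', -name='*.py')                 # single dash with a long name, à la find
rsync(--archive=True, 'src/', 'dst/')   # the dashes are now written explicitly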

python ayrton

Mariano Draghi (cHagHi): La sal de la Tierra

   Published:
Portrait of the Artist — the photography of Sebastião Salgado by Steve Jurvetson

The Salt of the Earth made me cry with sadness. And with joy. At times it filled me with deep pessimism about humanity. And at times it filled me with hope and showed me that we are capable of doing immense things with very little. It showed me the most terrible side of the human being. Also the beauty of Nature. It reflects the deep inequalities that rule the world. It made me feel that all, absolutely all of my problems are nonsense next to the suffering and the reality of millions and millions of people.

It reminded me that photography is much more than taking pictures, how beautiful it can be as art, how powerful it is as a medium of communication. It reminded me why I like it so much.

It told me a bit of Sebastião Salgado's story, and it made me discover a human being of enormous sensitivity, who travelled to practically every corner of the planet to portray countless things, some terrible, some beautiful, that we often choose to ignore.

It had been a long time since a film stirred up so many emotions in me at once.

The Salt of the Earth is one of those films everyone should see.

Joaquin Sorianello: Something more than 140 characters.

   Published:

For a while now, the only storytelling I do is on Twitter. Hundreds of short messages trying to communicate what is going on with the case.

Some days I also talk, subtly, about how I feel.

But rereading them, I find that deep down I don't manage to fully express things. Those 140 characters force us to reduce, to cut, and in a way to censor ourselves.

The truth is that the last 64 days, since the report of the vulnerability and the attack that MSA was suffering, have been very intense.

Before the raid: fear that the people responsible for the attack would hurt me or my family.

Afterwards: fear that the justice system would not act properly, that I would be convicted without evidence, that the case exists only to intimidate.

Around those things remain others: the toll on relationships, the stress, and the enormous helplessness of being considered guilty by friends.

Yes, inside MSA there are several people who, beyond whatever political and ideological differences we may have, I considered friends.

Could it be that, because I am against the electronic voting systems they propose, they think I have no professional ethics?

Or that they believe I am capable of damaging their servers just to demonstrate the risks?

They are wrong.

Because the moment I warned them, I was able to separate my own convictions from what is strictly ethical and professional.

I ask those who work at MSA, those with whom we share things in free software community spaces, to try to do the same.

Because with what is happening, we all lose.

Mariano Guerra: Forward syslog messages to flume with rsyslog

   Published:

As usual, brain dump, just instructions, not much content.

download flume from here: https://flume.apache.org/download.html

I'm using this one: http://www.apache.org/dyn/closer.cgi/flume/1.6.0/apache-flume-1.6.0-bin.tar.gz

unpack and put it somewhere.

Create a file with the following content; I will name it flume-syslog.conf and place it in ~/tmp/. You should do the same if you are lazy and don't want to change the commands:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# I'll be using TCP based Syslog source
a1.sources.r1.type = syslogtcp
# the port that Flume Syslog source will listen on
a1.sources.r1.port = 7077
# the hostname that Flume Syslog source will be running on
a1.sources.r1.host = localhost

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
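
As a side note, once the agent is running (the flume-ng command to start it appears further down), you can sanity-check this syslogtcp source without going through rsyslog at all. A minimal Python sketch (hypothetical; it only assumes the host and port configured above) would be:

# Hypothetical quick test: push one RFC3164-style syslog line straight to the
# Flume syslogtcp source configured above (localhost:7077), bypassing rsyslog.
# <13> is facility user (1) * 8 + severity notice (5).
import socket

msg = b'<13>Aug 27 18:06:20 ganesha test: hello from a raw TCP client\n'
with socket.create_connection(('localhost', 7077)) as conn:
    conn.sendall(msg)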

Install rsyslog if you don't have it and start it. I'm using Fedora 22; change for your distro:

sudo dnf install rsyslog
sudo service rsyslog start

Note

For Fedora Users

I had to disable SELinux since it was blocking some ports, YMMV

Configure rsyslog with your rule. You can do it directly in /etc/rsyslog.conf or, better, check that the following line is uncommented:

$IncludeConfig /etc/rsyslog.d/*.conf

And put your config under /etc/rsyslog.d/50-default.conf (create it if it doesn't exist).

We are going to forward only messages with a given tag, since we are interested in a subset of the logs; in this case we only want log lines with the tag "test". Add this to the rsyslog config file:

:syslogtag, isequal, "test:" @@127.0.0.1:7077

Save and restart rsyslog:

sudo service rsyslog restart

Start flume with your configuration:

./bin/flume-ng agent --conf conf --conf-file ~/tmp/flume-syslog.conf --name a1 -Dflume.root.logger=INFO,console  -Dorg.apache.flume.lifecycle.LifecycleSupervisor=INFO,console

Note

You should run the flume-ng command from the flume folder, otherwise a log4j warning will appear and you won't see the output of the sink.

Now generate a log line with our tag:

logger -t test 'Testing Flume with Syslog!'

you should see a line like this:

2015-08-27 18:06:25,096 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:94)] Event: { headers:{host=ganesha, Severity=5, Facility=1, priority=13, timestamp=1440695180000} body: 74 65 73 74 3A 20 54 65 73 74 69 6E 67 20 46 6C test: Testing Fl }

If you don't see the line check /var/log/messages to see if your message is there:

sudo vim /var/log/messages
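
If you prefer to generate the tagged messages from Python instead of the logger command, a minimal sketch using the standard library (assuming rsyslog listens on the default /dev/log unix socket) could look like this:

# Minimal sketch: send a tagged message to the local syslog daemon, so rsyslog
# forwards it to Flume just like the `logger -t test` example above.
import logging
import logging.handlers

handler = logging.handlers.SysLogHandler(address='/dev/log')
handler.ident = 'test: '    # becomes the syslog tag matched by the rsyslog rule

log = logging.getLogger('flume-demo')
log.setLevel(logging.INFO)
log.addHandler(handler)

log.info('Testing Flume with Syslog from Python!')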

Bonus track! Sending Apache logs to syslog and from there to Flume.

For this, install Apache 2; on Fedora:

sudo dnf install httpd
sudo service httpd start
sudo bash -c "echo 'welcome!' > /var/www/html/index.html"

curl localhost

The output should be:

welcome!

Now configure Apache to forward logs to syslog: open /etc/httpd/conf.d/welcome.conf and add at the bottom:

CustomLog "|/usr/bin/logger -t test" combined

Restart Apache:

sudo service httpd restart

Now open the page in a browser or use curl to fetch it again:

curl localhost

You should see a new log event in Flume.

Where to go from here?

  • Put Flume on another machine and change the IP address 127.0.0.1 to that machine's address
  • Change the tag (test) in rsyslog and in welcome.conf to something else
  • Buy me a beer

Hernán Grecco: The state of the PyVISA ecosystem (and more)

   Published:

Yesterday we released several packages of the Python Instrumentation Ecosystem. You can upgrade to PyVISA 1.8, PyVISA-py 0.2 and PyVISA-sim 0.3 by running:

pip install -U pyvisa pyvisa-py pyvisa-sim

For those of you who are new to instrumentation in Python, PyVISA is a Python frontend for the VISA specification that enables controlling all kinds of measurement equipment through GPIB, RS232, USB and Ethernet, among other interfaces. If you are familiar with VISA instruments in LabVIEW, Matlab, C or .NET you already know how it works, and you can use PyVISA as a nice Pythonic API to write your programs. If you have never done any instrumentation, Python and PyVISA are a great combination to start with.
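
As a taste of the API, a minimal session looks roughly like the sketch below (the resource name is just an example and depends on your setup):

# Minimal PyVISA sketch; the resource name below is an example and will differ
# on your setup.
import pyvisa

rm = pyvisa.ResourceManager()                 # default backend; '@py' selects pyvisa-py
print(rm.list_resources())                    # discover connected instruments

inst = rm.open_resource('GPIB0::12::INSTR')   # hypothetical instrument address
print(inst.query('*IDN?'))                    # standard identification query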

Code: https://github.com/hgrecco/pyvisa
Docs: http://pyvisa.readthedocs.org/
Tracker: https://github.com/hgrecco/pyvisa/issues

As I mentioned before, PyVISA is a frontend for the VISA specification, but what does this mean in terms of software? PyVISA can connect to multiple backends, which are the ones doing part of the hard work. We currently have 3:

ni: is a wrapper to the NI-VISA library, which is the de facto standard implementation of VISA. It is feature complete but requires that you install the proprietary library provided by National Instruments. This is the default backend and is bundled with PyVISA. (Notice that we provide the wrapper, you need to install NI-VISA yourself as explained in the PyVISA docs)

py: is an implementation of the VISA specification using popular python packages to talk over the different interfaces: PySerial, PyUSB, linux-gpib and socket (which is inside the Python standard library). It is almost feature complete for Message Based Instruments (ASRL, USB, TCPIP, GPIB). It is available through the PyVISA-py package.


Code: https://github.com/hgrecco/pyvisa-py
Docs: http://pyvisa-py.readthedocs.org/
Tracker: https://github.com/hgrecco/pyvisa-py/issues

sim: allows you to create simulated instruments using simple text files. It is great for testing and offline development of complex instrumentation applications. It is available through the PyVISA-sim package.
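
Selecting a backend is just a matter of the argument you pass to ResourceManager; a minimal sketch using the simulated backend (assuming pyvisa-sim is installed and using its bundled default device file) would be:

# Minimal sketch: open the pyvisa-sim backend with its bundled default device
# definitions and list the simulated resources. A custom definition file can be
# passed as 'my-devices.yaml@sim' (hypothetical file name).
import pyvisa

rm = pyvisa.ResourceManager('@sim')
print(rm.list_resources())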


As always, these releases would not have been possible without our awesome community that provides code, bug reports, testing and support.

Finally, I would like to mention that PyVISA is great but it is kind of middle level. For complex applications you want better abstractions that allow you to forget about how you ask a particular voltmeter for a voltage. For that purpose, I created Lantz a few years ago. It provides a very nice way to write drivers that encapsulate instrument-specific information and then use them in scripts and GUI apps. It is not a replacement for PyVISA: Lantz builds on top of it to do the low-level communication.

It turned out that I was not the only one thinking and coding in this direction. And I have always felt that dividing the community among many projects is a waste of time and energy, particularly in this type of project, where the community is not as large as in numerical programming and there is already an established standard.

That is why, triggered by his contribution to Pint, Matthieu Dartiailh and I have decided to work together. He did some awesome instrumentation-related coding in eapii and HQCMeas. Other people joined these initial conversations, and a Python instrumentation initiative that we have called LabPy was born. We have created an organization on GitHub to host Lantz and other projects with a common goal: making instrumentation better in Python.

Matthieu has been championing the refactoring of Lantz, putting the best of the different toolkits together in a cohesive package. I have no doubt that Lantz 0.4 will be even better than 0.3.

Join us at https://github.com/LabPy

Facundo Batista: The inn that hosted PyCamp

   Published:


Among the photos I took at the PyCamp a couple of weeks ago is this one, which I liked so much that I'm posting it here on its own, a bit larger...

The inn that hosted PyCamp 2015 in La Serranita

It is the inn where the event was held (where we slept and worked... the meals were somewhere else). A very, very nice multi-level building.

More PyCamp photos here.

Marcos Dione: I-got-myself-a-parser

   Published:

So, only two days later I not only have (what looks like) a full parser, which has already landed in develop; I also implemented the first big change in the grammar and semantics: keywords are allowed mixed with positional parameters. In the case of command execution they're converted to positional options; in normal function calls they're just put where they belong.

In the future there will be more restrictive checks so the Python part of the language does not change, but right now I'm interested in adding more small changes like that. For instance, as I said before, allowing the options to have the right amount of hyphens (-o, -option or --option), because right now I have code that prefixes -- to anything longer than 1 character. The alternative would be to have another _special_arg to handle that. And while I'm at it, also allow --long-options. This is only possible because there's a specific check in the code for that. Unluckily this does not mean I can do the same trick for executable names, so I still lack absolute and relative commands, and you still have to write osmpbf-outline as osmpbf_outline. Maybe I'll just depart a little more from the grammar and allow those, but I have to think deeply about it (that is, let the problem sit in the back of my head for a while). What I can also do is allow the same option to be used several times (git ('commit-tree', p='fae76fae7', p='7aa3f63', 'a6fa33428bda9832') is an example that comes to mind), because it's another check not really done by the grammar.

In any case, it's quite a leap in the language. I just need to test it a little more before doing the next release, which surely will be the 0.5. I'll keep you posted!


ayrton python

Marcos Dione: breaking-off

   Published:

Having my own version of the python parser has proven, so far, to be clumsy and chaotic. Clumsy because it means that I need a special interpreter just to run my language (which in any case uses an interpreter!), chaotic because building such an interpreter has proven not to work reliably on different machines. This means that currently it only works for me.

Because of this, and because I wanted even more control over the parser (who said anything about allowing one to write things like rsync(--help)?), I decided to check my options. A friend of mine, more used to playing with languages, suggested using pypy to create my own parser, but that just led me a little further: why not outright 'steal' pypy's parser? After all, they have their own, which is also generated from Python's Python.asdl.

In fact it took me one hour to port the parser and a couple more to port the AST builder. This included porting them to Python3 (both by running 2to3 and then applying some changes by hand, notably dict.iteritems -> dict.items) and trying to remove as much dependency as possible on the rest of pypy, especially on rpython.

The last step was to migrate from their own AST implementation to Python's, but here's where (again) I hit the last brick wall: the ast.AST class and its subclasses are very special. They're implemented in C, but the Python API does not allow creating nodes with the line and column info. For a moment I contemplated the option of creating another extension (that is, written in C) to make those calls, but then the obvious solution came to mind: a massive replacement from:

return ast.ASTClass ([params], foo.lineno, foo.column)

into:

new_node = ast.ASTClass ([params])
new_node.lineno = foo.lineno
new_node.column = foo.column
return new_node

and some other similar changes. See here if you're really interested in all the details. I can only be grateful for regular expressions, capturing groups and editors that support both.
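
Just as an illustration (this is not the actual command I used, only a Python re-based sketch of the same idea), the rewrite can be expressed with capturing groups like this:

# Sketch of the kind of regex rewrite described above (not the actual one used):
# turn the one-line constructor-with-position call into the build-then-annotate form.
import re

pattern = re.compile(r"return ast\.(\w+) \((.*), (\w+)\.lineno, \3\.column\)")
replacement = ("new_node = ast.\\1 (\\2)\n"
               "new_node.lineno = \\3.lineno\n"
               "new_node.column = \\3.column\n"
               "return new_node")

source = "return ast.ASTClass ([params], foo.lineno, foo.column)"
print(pattern.sub(replacement, source))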

The following code is able to parse and dump a simple python script:

#! /usr/bin/env python3
import ast

from pypy.interpreter.pyparser import pyparse
from pypy.interpreter.astcompiler import astbuilder

info= pyparse.CompileInfo('setup.py', 'exec')
p= pyparse.PythonParser(None)
t= p.parse_source (open ('setup.py').read(), info)
a= astbuilder.ast_from_node (None, t, info)

print (ast.dump (a))

The result is the following (formatted by hand):

Module(body=[
    ImportFrom(module='distutils.core', names=[alias(name='setup', asname=None)], level=0),
    Import(names=[alias(name='ayrton', asname=None)]),
    Expr(value=Call(func=Name(id='setup', ctx=<class '_ast.Load'>), args=None, keywords=[
        keyword(arg='name', value=Str(s='ayrton')),
        keyword(arg='version', value=Attribute(value=Name(id='ayrton', ctx=<class '_ast.Load'>), attr='__version__', ctx=<class '_ast.Load'>)),
        keyword(arg='description', value=Str(s='a shell-like scripting language based on Python3.')),
        keyword(arg='author', value=Str(s='Marcos Dione')),
        keyword(arg='author_email', value=Str(s='mdione@grulic.org.ar')),
        keyword(arg='url', value=Str(s='https://github.com/StyXman/ayrton')),
        keyword(arg='packages', value=List(elts=[Str(s='ayrton')], ctx=<class '_ast.Load'>)),
        keyword(arg='scripts', value=List(elts=[Str(s='bin/ayrton')], ctx=<class '_ast.Load'>)),
        keyword(arg='license', value=Str(s='GPLv3')),
        keyword(arg='classifiers', value=List(elts=[
            Str(s='Development Status :: 3 - Alpha'),
            Str(s='Environment :: Console'),
            Str(s='Intended Audience :: Developers'),
            Str(s='Intended Audience :: System Administrators'),
            Str(s='License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)'), Str(s='Operating System :: POSIX'),
            Str(s='Programming Language :: Python :: 3'),
            Str(s='Topic :: System'),
            Str(s='Topic :: System :: Systems Administration')
        ],
        ctx=<class '_ast.Load'>))
    ], starargs=None, kwargs=None))
])

The next steps are to continue removing references to pypy code, and to make sure it can actually parse all possible code. Then I should revisit the hardcoded limitations in the parser (in particular in this loop) and then be able to freely format program calls :).

Interesting times are arriving to ayrton!

Update: fixed last link. Thanks nueces!


python ayrton
