On the second day I attended three talks:
- Python Concurrency – this talk, given by Bob Hancock, was my favorite of the entire conference. It covered a lot of material on O/S concurrency and multi-threading. When the slides are posted I will post a link, but meanwhile here are some interesting tidbits:
- Bibliography on O/S and Concurrency
- “The most important quality in a programmer is patience” – Guy Steele
- Explanation of Python GIL
- Generators in Python can now be used to implement co-routines (very cool!).
- Read C. A. R. Hoare’s paper on Communicating Sequential Processes
- Async I/O plus co-routines can be used to build co-operative, very lightweight threading in Python (see greenlets)
- Go, the new language from Google, uses similar ideas.
A lot of this was very familiar, as in the past I used co-routines to implement concurrency in Pascal and Modula-2 programs.
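To make the generators-as-co-routines tidbit concrete, here is a minimal toy example of my own (not from the talk): a generator that suspends at each `yield` and is resumed with `send()`, which is the mechanism that makes co-routines possible in Python.

```python
# A generator used as a co-routine: it suspends at `yield` and resumes
# when the caller sends a value in with send().
def running_average():
    """Yield the running average of the values sent in so far."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average   # suspend here; resume with the sent value
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)            # "prime" the co-routine: advance it to the first yield
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
print(avg.send(30))  # 20.0
```

Libraries like greenlet build on this same suspend/resume idea to switch between many lightweight tasks co-operatively, without O/S threads.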
- Brubeck.io web framework – Brubeck is a web framework in Python that uses the greenlet libraries and co-routines to make web programming a bit easier. The interesting part of the talk was that we all actually got to install the software on our own laptops and try to run it. I only got about halfway through it.
- Machine Learning – this was an introductory talk on machine learning. Machine learning is the kind of stuff Google does to figure things out by analyzing reams and reams of data. A simple example of this kind of stuff is the spam filter in your email client. For Python programmers there are a bunch of libraries that can be used to perform these types of computations: nltk, scikits.learn, pybrain, etc. As part of the talk the speaker presented an example of a program that analyzed data from Twitter to decide whether a tweet was more Republican or Democratic leaning. Kind of fun…
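Just to give a flavor of the tweet-leaning idea, here is a toy sketch of my own (not the speaker's code, and far simpler than what the real libraries do): score a text by counting how often its words appear in hand-labeled training examples for each side, and pick the side with the higher score.

```python
from collections import Counter

def train(labeled_texts):
    """Build a per-label word-frequency Counter from (label, text) pairs."""
    counts = {}
    for label, text in labeled_texts:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Return the label whose training words best match the text."""
    words = text.lower().split()
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get)

# Tiny made-up training set, purely for illustration.
model = train([
    ("republican", "cut taxes small government strong defense"),
    ("democrat", "health care public schools green energy"),
])
print(classify(model, "we need to cut taxes"))  # republican
```

A real classifier in nltk or scikits.learn would use something like Naive Bayes with proper probabilities and smoothing, but the basic idea of learning word statistics from labeled data is the same.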