So I wanted to learn a bit of Python. Well, you know, I have been scripting in Perl since 1997. Moreover, I am lazy. So why the heck should I learn a new language? Well, let's say that the environment around me is full of these young and smart guys who love Python. So I tried it. After all, it is nice to add another knife to the drawer.
So where is the crawler? Here it is. Very compact. It uses Eventlet from Second Life, a nice framework that supports async I/O and co-routines. The resulting code is very compact and it avoids all the pitfalls of calling a cascade of callbacks. RSS/Atom feeds are parsed with feedparser. Images are handled with PIL. HTML pages are parsed with Beautiful Soup. MySQL is accessed with MySQLdb. Eventlet needs greenlet to run. The crawler downloads a bunch of RSS/Atom feeds, all the web pages referred to by the postings, and all the images contained in those web pages. A single thread performs all the network operations with a pool of co-routines.
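The single-threaded pool-of-co-routines shape the post describes can be sketched with the standard library's asyncio (the post itself uses Eventlet's green threads, but the pattern is the same: many co-routines, one thread, bounded concurrency). The `fetch` function here is a hypothetical stand-in for the real download step:

```python
import asyncio

# Hypothetical stand-in for the real HTTP/feed download step.
async def fetch(url):
    await asyncio.sleep(0)  # yield to the event loop, as real I/O would
    return f"<page from {url}>"

async def crawl(urls, concurrency=10):
    # A semaphore gives us a bounded "pool" of co-routines,
    # much like Eventlet's GreenPool does.
    sem = asyncio.Semaphore(concurrency)

    async def bounded(url):
        async with sem:
            return await fetch(url)

    # All downloads run concurrently in one single thread.
    return await asyncio.gather(*(bounded(u) for u in urls))

pages = asyncio.run(crawl(["http://example.com/a", "http://example.com/b"]))
```

Because everything runs on one thread, there are no locks to take and no callback cascades to unwind; the sequential-looking code above is scheduled cooperatively.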
Boost::serialization is the ultimate solution if you want to send your objects through a socket. I remember the old days when we used to work with CORBA and similar stuff. Thank goodness this is no longer required. This code extends the async I/O server and serializes a test object. Ping messages are objects too. Generic programming is used to send as many different objects as you want... we are moving towards a full map-reduce implementation.
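The generic-programming idea above, one template handling every message type, Ping included, can be sketched with the standard library alone (the real code uses Boost archives, where each type implements a `serialize(Archive&, unsigned)` member; the `save`/`load` pair and the `to_wire`/`from_wire` names below are illustrative stand-ins, not the Boost API):

```cpp
#include <sstream>
#include <string>

// Any message type that implements save/load can go through the
// same generic path -- Ping is just one such type.
struct Ping {
    int seq = 0;
    void save(std::ostream& os) const { os << seq; }
    void load(std::istream& is) { is >> seq; }
};

// Generic "send": in the real server this payload would be written
// to the socket; here we just build the wire representation.
template <typename T>
std::string to_wire(const T& obj) {
    std::ostringstream os;
    obj.save(os);
    return os.str();
}

// Generic "receive": rebuild the object from the wire payload.
template <typename T>
T from_wire(const std::string& payload) {
    std::istringstream is(payload);
    T obj;
    obj.load(is);
    return obj;
}
```

Adding a new message type means writing its `save`/`load` (or, with Boost, its `serialize`) and nothing else; the templated send/receive path stays untouched, which is what makes growing this into a map-reduce message layer cheap.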