Genetic algorithms in Python

Although I’m fond of numerical optimization through gradients, … there are times when global optimization is much more powerful. For instance, I have to generate two sequences/combs that are orthogonal and whose autocorrelation is almost an impulse. The two combs have a fixed number of impulses, so it’s a perfect job for genetic algorithms.
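To give an idea of the setup, here is a minimal sketch (not the code from the full post) of a genetic algorithm for this kind of problem: individuals encode the impulse positions of the two combs, and the fitness penalizes off-peak autocorrelation and cross-correlation energy. The lengths, population size and mutation scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
LENGTH, N_IMPULSES, POP, GENERATIONS = 64, 8, 100, 200

def make_comb(positions):
    comb = np.zeros(LENGTH)
    comb[positions] = 1.0
    return comb

def fitness(individual):
    # individual = (impulse positions of comb 1, impulse positions of comb 2)
    a, b = make_comb(individual[0]), make_comb(individual[1])
    auto_a = np.correlate(a, a, "full")
    auto_b = np.correlate(b, b, "full")
    cross = np.correlate(a, b, "full")
    center = LENGTH - 1
    # Penalize off-peak autocorrelation and all cross-correlation energy.
    off_a = np.sum(auto_a**2) - auto_a[center]**2
    off_b = np.sum(auto_b**2) - auto_b[center]**2
    return -(off_a + off_b + np.sum(cross**2))

def random_individual():
    return [rng.choice(LENGTH, N_IMPULSES, replace=False) for _ in range(2)]

def mutate(individual):
    # Move one impulse of one comb to a currently free position.
    new = [pos.copy() for pos in individual]
    comb = rng.integers(2)
    idx = rng.integers(N_IMPULSES)
    free = np.setdiff1d(np.arange(LENGTH), new[comb])
    new[comb][idx] = rng.choice(free)
    return new

population = [random_individual() for _ in range(POP)]
for _ in range(GENERATIONS):
    # Keep the best half, refill with mutated copies of the survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP // 2]
    population = survivors + [mutate(s) for s in survivors]

best = max(population, key=fitness)
print("best fitness:", fitness(best))
```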
Continue reading Genetic algorithms in Python

Book review: Hacking Roomba: ExtremeTech

I’ve recently bought a fourth generation Roomba, which is a vacuum cleaning robot. I bought this brand because it is well-known and has a good history of hackable robots. So the next step was to figure out how to hack it, and hence this book.
Continue reading Book review: Hacking Roomba: ExtremeTech

Dimensionality reduction: Refactoring the manifold module

It’s been a while since I last blogged about manifold learning. I don’t think I’ll add much in terms of algorithms to the scikit, but now that a clear API is being defined (http://sourceforge.net/apps/trac/scikit-learn/wiki/ApiDiscussion), it’s time for the manifold module to comply with it. Documentation will also be enhanced and some dependencies will be removed.
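For readers who haven’t followed the API discussion, the sketch below illustrates the kind of interface a manifold estimator is expected to expose: hyperparameters in the constructor, learning in fit(), and the embedding available through fit_transform(). This is only an assumed illustration of the convention, not the actual manifold module code (the placeholder simply projects onto principal directions).

```python
import numpy as np

class ManifoldEmbedding:
    def __init__(self, n_components=2, n_neighbors=8):
        # Hyperparameters are stored as-is in the constructor.
        self.n_components = n_components
        self.n_neighbors = n_neighbors

    def fit(self, X, y=None):
        # A real manifold method would build a neighborhood graph and
        # compute the low-dimensional embedding here; this placeholder
        # just projects onto the first principal directions.
        X = np.asarray(X, dtype=float)
        X_centered = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(X_centered, full_matrices=False)
        self.components_ = vt[:self.n_components]
        self.embedding_ = X_centered @ self.components_.T
        return self

    def fit_transform(self, X, y=None):
        return self.fit(X, y).embedding_

# Usage: embedding = ManifoldEmbedding(n_components=2).fit_transform(X)
```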

I’ve started a branch, available on github.com, and I will add some examples to the scikit as well. I may explain them here, but I won’t rewrite what is already published. A future post will explain the changes, and I hope that interested people will understand the modifications and apply them to my former posts. It’s just that I don’t have much time to change everything…