Now that version 0.2 of scikits.optimization is out, here is a tutorial on the gradient-free optimizer based on the simplex algorithm.
When all you have is the cost function and the number of parameters is small, a simplex algorithm is a natural first thing to try.
The only specific thing you need to do before creating the optimizer is to create the initial simplex. With the Rosenbrock cost function in 2D, this means setting up 3 vectors of 2 parameters each:
import numpy as np

startPoint = np.empty((3, 2), np.float64)
startPoint[:, 0] = 1.
startPoint[:, 1] = 0.
startPoint[1, 0] -= .1
startPoint[2, 1] += .1
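For completeness, here is a minimal sketch of the 2D Rosenbrock cost function this snippet assumes; the class name and call interface are an assumption here, since the library's tutorial provides its own version:

```python
import numpy as np

# Sketch of the 2D Rosenbrock cost function (assumed interface: a plain
# callable, since the gradient-free optimizer only needs function values).
class Rosenbrock(object):
    def __call__(self, x):
        # Global minimum at (1, 1), where the value is 0
        return (1. - x[0]) ** 2 + 100. * (x[1] - x[0] ** 2) ** 2
```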
Then, the optimizer instance can be built and called:
# optimizer and criterion are modules from the package
optimi = optimizer.PolytopeOptimizer(
    function=Rosenbrock(),
    criterion=criterion.criterion(ftol=.0001, iterations_max=50),
    x0=startPoint)
print(optimi.optimize())
As usual, you can use a recorder to display how the optimizer behaved, which is the purpose of the next section.
Interactive results check
In the tutorial folder on github, I’ve put an interactive application built with Traits and Chaco. I would have preferred the optimization to run in a background thread, with the main GUI updating itself with each new iteration state, but I couldn’t get it to work (I need to practice Traits a bit more). Still, you can explore the results with this app.
The other way to check the results is to save each state, display it with matplotlib and turn the frames into a movie:
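As an illustration of the idea (using SciPy's Nelder-Mead and a callback rather than the library's recorder, so the API below is not the one from this post), one can collect the iterates and plot them over contours of the Rosenbrock function; each saved figure would be one frame of the movie:

```python
import numpy as np
from scipy.optimize import minimize, rosen

# Collect the point visited at each iteration, mimicking what a recorder does.
path = [np.array([1., 0.])]            # starting point, as in the simplex above
result = minimize(rosen, path[0], method='Nelder-Mead',
                  callback=lambda xk: path.append(np.copy(xk)))
path = np.array(path)

import matplotlib
matplotlib.use('Agg')                  # headless backend, for scripted use
import matplotlib.pyplot as plt

# Contours of the 2D Rosenbrock function, with the optimizer's path on top
x, y = np.meshgrid(np.linspace(-0.5, 1.5, 200), np.linspace(-0.5, 1.5, 200))
z = (1. - x) ** 2 + 100. * (y - x ** 2) ** 2
plt.contour(x, y, z, levels=np.logspace(-1, 3, 10))
plt.plot(path[:, 0], path[:, 1], 'o-', markersize=3)
plt.savefig('simplex_path.png')        # one frame; loop over iterations for a movie
```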
As usual with the Rosenbrock function, the optimizer quickly finds the valley containing the minimum, but then takes many iterations to reach the actual solution. Because the simplex has to crawl along this bent valley, the iteration count grows rapidly. Still, the optimal value is found in the end.
The Simplex algorithm is a very crude one, but it can be effective when the cost function is too complex to differentiate or when the gradient is too costly to compute. It remains quite sensitive to the stopping criterion as well as to the starting polytope, but it is a good first check of your result before trying something different such as gradient-based optimization or genetic algorithms.
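To make this concrete, here is a bare-bones sketch of the moves the simplex algorithm performs (reflection, expansion, contraction, shrink); it is a toy illustration, not the library's implementation:

```python
import numpy as np

def nelder_mead(f, simplex, ftol=1e-8, max_iter=500):
    """Bare-bones Nelder-Mead on an (n+1, n) simplex of vertices."""
    simplex = np.asarray(simplex, dtype=float)
    for _ in range(max_iter):
        # Order vertices from best to worst
        simplex = simplex[np.argsort([f(x) for x in simplex])]
        fvals = [f(x) for x in simplex]
        if fvals[-1] - fvals[0] < ftol:
            break
        centroid = simplex[:-1].mean(axis=0)       # centroid of all but the worst
        worst = simplex[-1]
        reflected = centroid + (centroid - worst)  # reflect worst through centroid
        if f(reflected) < fvals[0]:
            # Reflection beat the best vertex: try going twice as far
            expanded = centroid + 2. * (centroid - worst)
            simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
        elif f(reflected) < fvals[-2]:
            simplex[-1] = reflected
        else:
            # Reflection failed: contract toward the centroid
            contracted = centroid + 0.5 * (worst - centroid)
            if f(contracted) < fvals[-1]:
                simplex[-1] = contracted
            else:
                # Last resort: shrink every vertex toward the best one
                simplex = simplex[0] + 0.5 * (simplex - simplex[0])
    return min(simplex, key=f)
```

On a simple convex function such as a quadratic bowl, this converges in a few dozen iterations; on the bent Rosenbrock valley, the contraction steps dominate and progress is much slower, which is exactly what the movie above shows.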
The next tutorial will be about quasi-Newton methods, and the one after that may compare optimizers.