I’m pleased to announce a new version of scikits.optimization. The main focus of this iteration was to finish the usual unconstrained optimization algorithms.
- Fixes to the Simplex state implementation
- Added several quasi-Newton steps (BFGS, rank 1 update…); a sketch follows below
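For reference, here is a minimal NumPy sketch of the two update rules behind these steps; the function names and signatures are illustrative, not the scikit’s actual API.

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k, y = grad_{k+1} - grad_k; assumes the
    curvature condition y.dot(s) > 0 holds.
    """
    rho = 1.0 / y.dot(s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V.dot(H).dot(V.T) + rho * np.outer(s, s)

def sr1_update(H, s, y):
    """Symmetric rank-1 (SR1) update of the inverse Hessian."""
    v = s - H.dot(y)
    denom = v.dot(y)
    # skip the update when the denominator is too small to be safe
    if abs(denom) < 1e-8 * np.linalg.norm(v) * np.linalg.norm(y):
        return H
    return H + np.outer(v, v) / denom
```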
The scikit can be installed with pip/easy_install or downloaded from PyPI.
It has been a while, too long for sure, since my last update on this scikit. I’m pleased to announce that some algorithms, as well as some tests, are finally fixed.
- Fixed the Polytope/Simplex/Nelder-Mead optimizer
- Fixed the Quadratic Hessian helper class
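As a reminder of what this helper computes, here is a minimal sketch of a quadratic cost with its analytic gradient and Hessian; the class name and interface are hypothetical, chosen only for illustration.

```python
import numpy as np

class Quadratic(object):
    """f(x) = 1/2 x'Ax + b'x + c; A is assumed symmetric."""
    def __init__(self, A, b, c=0.0):
        self.A = np.asarray(A, dtype=float)
        self.b = np.asarray(b, dtype=float)
        self.c = c

    def __call__(self, x):
        return 0.5 * x.dot(self.A).dot(x) + self.b.dot(x) + self.c

    def gradient(self, x):
        return self.A.dot(x) + self.b

    def hessian(self, x):
        return self.A   # constant for a quadratic cost
```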
Additional tutorials will be available in the coming weeks.
I’m pleased to announce the first release of one of my projects. This scikit is built on a generic framework that supports unconstrained cost function minimization. The framework follows a separation principle (sketched below) and is completely object oriented.
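To give an idea of the separation principle, here is a minimal sketch of an optimizer loop assembled from independent pieces (direction step, line search, stopping criterion); all names and signatures are illustrative, not the scikit’s actual API.

```python
import numpy as np

def optimize(function, gradient, x0, step, line_search, criterion):
    """Generic loop: every algorithmic choice lives in the arguments."""
    x, iteration = np.asarray(x0, dtype=float), 0
    while True:
        direction = step(gradient, x)                 # which way to go
        alpha = line_search(function, x, direction)   # how far to go
        new_x = x + alpha * direction
        if criterion(iteration, x, new_x):            # when to stop
            return new_x
        x, iteration = new_x, iteration + 1
```

With `step = lambda gradient, x: -gradient(x)`, this reduces to steepest descent; substituting a conjugate-gradient or quasi-Newton step changes the algorithm without touching the loop.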
Several optimizers are available:
- Nelder-Mead or simplex minimization
- Unconstrained gradient-based minimization
The usual stopping criteria can be used (a sketch follows the list):
- Iteration limit
- Parameter change (relative and absolute)
- Cost function change (relative and absolute)
- Composite criterion generation (AND/OR)
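As an illustration of composite criteria, here is a minimal sketch with hypothetical class names; each criterion is a callable that returns True when optimization should stop.

```python
import numpy as np

class IterationLimit(object):
    """Stop after a fixed number of iterations."""
    def __init__(self, limit):
        self.limit = limit
    def __call__(self, iteration, x, new_x):
        return iteration >= self.limit

class ParameterChange(object):
    """Stop when the parameters no longer move (absolute tolerance)."""
    def __init__(self, tol):
        self.tol = tol
    def __call__(self, iteration, x, new_x):
        return np.max(np.abs(new_x - x)) < self.tol

class Or(object):
    """Composite criterion: stop as soon as any sub-criterion fires."""
    def __init__(self, *criteria):
        self.criteria = criteria
    def __call__(self, iteration, x, new_x):
        return any(c(iteration, x, new_x) for c in self.criteria)

# stop after 1000 iterations OR when x moves by less than 1e-8
criterion = Or(IterationLimit(1000), ParameterChange(1e-8))
```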
Different direction searches are available:
- Several conjugate-gradient directions (Fletcher-Reeves, …); see the sketch after this list
- Decorators for selecting part of the gradient
- Marquardt step
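For instance, the Fletcher-Reeves direction can be sketched as follows; the class keeps the previous gradient and direction between calls, and the names are illustrative.

```python
import numpy as np

class FletcherReeves(object):
    """Conjugate-gradient direction with the Fletcher-Reeves beta."""
    def __init__(self):
        self.last_gradient = None
        self.last_direction = None

    def __call__(self, gradient, x):
        g = gradient(x)
        if self.last_gradient is None:
            direction = -g   # first iteration: steepest descent
        else:
            # beta = ||g_k||^2 / ||g_{k-1}||^2
            beta = g.dot(g) / self.last_gradient.dot(self.last_gradient)
            direction = -g + beta * self.last_direction
        self.last_gradient, self.last_direction = g, direction
        return direction
```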
Finally, several line searches (1D minimization) are available:
- Fibonacci and golden-section methods (exact line searches; see the sketch after this list)
- Wolfe-Powell soft and strong rules
- Goldstein line search
- Cubic interpolation
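As an example of an exact line search, here is a minimal golden-section sketch; the bracketing interval [0, alpha_max] and the names are assumptions made for the example.

```python
import math

def golden_section(function, x, direction, alpha_max=1.0, tol=1e-6):
    """Minimize alpha -> function(x + alpha * direction) on [0, alpha_max]."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0   # inverse golden ratio
    f = lambda alpha: function(x + alpha * direction)
    a, b = 0.0, alpha_max
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:
            # minimum lies in [a, d]: reuse c as the new d
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = f(c)
        else:
            # minimum lies in [c, b]: reuse d as the new c
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = f(d)
    return (a + b) / 2.0
```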
Additional helper classes can be used:
- Finite-difference differentiation (central and forward; see the sketch after this list)
- Quadratic cost (for least square estimation)
- Levenberg-Marquardt approximation for least square estimation
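For instance, central finite differences can be sketched as follows; function and parameter names are illustrative.

```python
import numpy as np

def central_difference_gradient(function, x, h=1e-6):
    """Approximate the gradient with symmetric differences (O(h^2))."""
    x = np.asarray(x, dtype=float)
    gradient = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        gradient[i] = (function(x + e) - function(x - e)) / (2.0 * h)
    return gradient
```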
Although this is only version 0.1, the code is quite stable and is already used in the learn scikit.
The package can be installed with easy_install or downloaded from PyPI.
Several tutorials are available, or will become available, at the following locations: