I’m pleased to announce the first release of one of my projects. This scikit is built on a generic framework for unconstrained cost-function minimization. The design follows a separation principle: the optimizer, the direction search, the line search and the stopping criterion are independent, composable objects, and everything is completely object-oriented.
Several optimizers are available (a composition sketch follows this list):
- Nelder-Mead or simplex minimization
- Unconstrained gradient-based minimization
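To give a feel for the separation principle, here is a minimal sketch of how such pieces might compose. All class and function names below are illustrative, not the scikit's actual API:

```python
import numpy as np

# Illustrative only: made-up names showing how an optimizer can be assembled
# from independent step, line-search and criterion objects.
class IterationLimit:
    """Stop after a fixed number of iterations."""
    def __init__(self, limit):
        self.limit = limit
    def __call__(self, iteration, old_x, new_x):
        return iteration >= self.limit

class GradientStep:
    """Search direction: steepest descent."""
    def __call__(self, gradient, x):
        return -gradient(x)

class FixedLineSearch:
    """Trivial line search: constant step length."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha
    def __call__(self, f, x, direction):
        return self.alpha

def optimize(f, gradient, x0, step, line_search, criterion):
    """Generic loop: the optimizer knows nothing about the parts it composes."""
    x, iteration = np.asarray(x0, dtype=float), 0
    while True:
        direction = step(gradient, x)
        alpha = line_search(f, x, direction)
        new_x = x + alpha * direction
        if criterion(iteration, x, new_x):
            return new_x
        x, iteration = new_x, iteration + 1

# Minimize f(x) = ||x||^2 with a gradient-based optimizer.
f = lambda x: x @ x
grad = lambda x: 2 * x
print(optimize(f, grad, [3.0, -2.0], GradientStep(), FixedLineSearch(), IterationLimit(100)))
```

Swapping in a different step, line search or criterion changes the algorithm without touching the loop.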
The usual stopping criteria can be used (a composite-criterion sketch follows this list):
- Iteration limit
- Parameter change (relative and absolute)
- Cost function change (relative and absolute)
- Composite criterion generation (AND/OR)
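As an example of composite criteria, an OR combination stops as soon as either sub-criterion fires. The class names here are hypothetical, chosen for the example:

```python
import numpy as np

class IterationLimit:
    """Stop after a fixed number of iterations."""
    def __init__(self, limit):
        self.limit = limit
    def __call__(self, iteration, old_x, new_x):
        return iteration >= self.limit

class AbsoluteParameterChange:
    """Stop when the parameters barely move between iterations."""
    def __init__(self, tol):
        self.tol = tol
    def __call__(self, iteration, old_x, new_x):
        return np.max(np.abs(np.asarray(new_x) - np.asarray(old_x))) < self.tol

class OrCriterion:
    """Composite criterion: satisfied if any sub-criterion is satisfied."""
    def __init__(self, *criteria):
        self.criteria = criteria
    def __call__(self, iteration, old_x, new_x):
        return any(c(iteration, old_x, new_x) for c in self.criteria)

# Stop after 1000 iterations OR once the step drops below 1e-6.
criterion = OrCriterion(IterationLimit(1000), AbsoluteParameterChange(1e-6))
print(criterion(0, [1.0], [1.0 + 1e-8]))  # True: the parameter change is tiny
```

An AND variant would use all() instead of any().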
Different direction searches are available (a Fletcher-Reeves sketch follows this list):
- Gradient
- Several conjugate-gradient variants (Fletcher-Reeves, …)
- Decorators for selecting part of the gradient
- Marquardt step
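For the conjugate-gradient directions, the Fletcher-Reeves update is the textbook formula d_k = -g_k + beta_k d_{k-1} with beta_k = (g_k · g_k) / (g_{k-1} · g_{k-1}). A small sketch, independent of the package's own implementation:

```python
import numpy as np

class FletcherReevesStep:
    """Conjugate-gradient direction with the Fletcher-Reeves beta."""
    def __init__(self):
        self.prev_gradient = None
        self.prev_direction = None
    def __call__(self, gradient, x):
        g = gradient(x)
        if self.prev_gradient is None:
            direction = -g  # first iteration: plain steepest descent
        else:
            beta = (g @ g) / (self.prev_gradient @ self.prev_gradient)
            direction = -g + beta * self.prev_direction
        self.prev_gradient, self.prev_direction = g, direction
        return direction

# Gradient of f(x) = x1^2 + 5 * x2^2
grad = lambda x: np.array([2.0, 10.0]) * x
step = FletcherReevesStep()
print(step(grad, np.array([1.0, 1.0])))  # first direction is -gradient
```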
Finally, several line searches (1D minimizations) were implemented (a golden-section sketch follows this list):
- Fibonacci and golden-section methods (exact line searches)
- Wolfe-Powell weak and strong rules
- Goldstein line search
- Cubic interpolation
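As an example of the exact line searches, here is the classic golden-section method, which shrinks a bracket around the minimizer by the inverse golden ratio at each step. This is the textbook version, not the package's code:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b]; the bracket shrinks by ~0.618 per step."""
    inv_phi = (math.sqrt(5) - 1) / 2  # inverse golden ratio
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):        # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                  # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# A line search minimizes phi(alpha) = f(x + alpha * direction) over alpha.
print(golden_section(lambda alpha: (alpha - 1.5) ** 2, 0.0, 4.0))  # ~1.5
```

(A real implementation would cache the function values instead of re-evaluating f(c) and f(d) each pass.)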
Additional helper classes can be used (a finite-difference sketch follows this list):
- Finite-difference differentiation (central and forward)
- Quadratic cost (for least-squares estimation)
- Levenberg-Marquardt approximation for least-squares estimation
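Finite-difference differentiation is useful when the analytic gradient is unavailable. A sketch of the forward (first-order) and central (second-order) variants, written as standalone helpers rather than the package's classes:

```python
import numpy as np

def forward_difference(f, x, h=1e-7):
    """g_i ~ (f(x + h*e_i) - f(x)) / h: one extra evaluation per coordinate."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def central_difference(f, x, h=1e-5):
    """g_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h): more accurate, twice the cost."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

f = lambda x: x[0] ** 2 + 3 * x[1]
print(central_difference(f, [2.0, 0.0]))  # ~[4., 3.]
```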
Although this is version 0.1, the code is quite stable and is already used in the learn scikit.
The package can be installed with easy_install or downloaded from PyPI.
Several tutorials are available, or will be in the future, at the following locations: