Convolution is an algorithm that is often used for reverberation. While the equation is easy to understand and to implement, the direct implementation is costly. The other way of doing it is to use a Fast Fourier Transform (FFT), but the crude FFT implementation introduces latency. Even though it is possible to optimize the basic convolution code, it is sometimes more interesting to use a different algorithm, as is the case here.
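To make the tradeoff concrete, here is a minimal NumPy sketch of both approaches (illustrative code, not the implementation discussed in the post); `signal` and `ir` are assumed to be one-dimensional float arrays holding the dry signal and the impulse response.

```python
import numpy as np

def convolve_direct(signal, ir):
    # Direct form: O(len(signal) * len(ir)) operations, but each output
    # sample depends only on past input, so there is no inherent latency.
    out = np.zeros(len(signal) + len(ir) - 1)
    for i, x in enumerate(signal):
        out[i:i + len(ir)] += x * ir
    return out

def convolve_fft(signal, ir):
    # FFT form: O(N log N), but the whole block must be available before
    # a single output sample can be produced, hence the latency of the
    # crude implementation.
    n = len(signal) + len(ir) - 1
    size = 1 << (n - 1).bit_length()  # next power of two >= n
    spectrum = np.fft.rfft(signal, size) * np.fft.rfft(ir, size)
    return np.fft.irfft(spectrum, size)[:n]
```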
When I developed my first tool based on genetic algorithms, it was to replace a local optimization algorithm (“Simulated Annealing”, as advertised by Numerical Recipes) as a global optimization algorithm. Now, a couple of years later, I found a small book on genetic algorithms that seemed on topic with what I had to implement (at the time, I relied on another book that I never reviewed; perhaps another time). Is it the good brief introduction the book claims to be?
I’d like to talk a little bit about the way a compressor and an expander can be written with the Audio Toolkit. Even if both effects use far fewer filters than the SD1 emulation, they still implement more than one filter in the pipeline, contrary to a fixed delay line (another audio plugin that I released with the compressor and the expander).
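As an illustration of what such a pipeline looks like, here is a rough Python sketch of a feed-forward compressor, with a power-detection stage, a gain-computation stage, and an application stage. This is not the Audio Toolkit API, and the parameter values are made up.

```python
import numpy as np

def compress(signal, threshold=0.1, ratio=4.0, smoothing=0.99):
    # Stage 1: envelope follower (one-pole smoothing of the squared signal).
    power = np.empty_like(signal)
    state = 0.0
    for i, x in enumerate(signal):
        state = smoothing * state + (1 - smoothing) * x * x
        power[i] = state
    # Stage 2: gain computation; above the threshold, the output level
    # grows 'ratio' times slower than the input level.
    amplitude = np.sqrt(np.maximum(power, 1e-12))
    gain = np.where(amplitude > threshold,
                    (threshold / amplitude) ** (1 - 1 / ratio),
                    1.0)
    # Stage 3: apply the computed gain to the dry signal.
    return signal * gain
```

An expander chains the same stages; only the gain-computation law changes, attenuating below the threshold instead of above it.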
I’m happy to announce the release of my third audio plugin, the first based on the Audio Toolkit. It is available on Windows and OS X in different formats.
The UI was designed by Florent Keller, many thanks to him!
I’ve explained in earlier posts how to simulate a simple overdrive circuit. I’ve also explained how I implemented this in QtVST (and yes, I should have added labels on those images!), which was more or less the predecessor of Audio TK.
The main problem with simulating non-linear circuits is that it is computationally expensive. Let’s see if I can improve the timings a little bit.
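The cost comes from the iterative solver that runs for every single sample. Here is a sketch of the idea with an illustrative diode-clipper equation and made-up component values (not the actual circuit equations from the earlier posts), solved by Newton-Raphson:

```python
import numpy as np

# Illustrative constants: diode saturation current, thermal voltage, resistor.
IS, VT, R = 1e-12, 26e-3, 2.2e3

def clip_sample(x, y0=0.0, tol=1e-9, max_iter=50):
    # Solve the implicit equation y + 2*R*IS*sinh(y/VT) = x for the output
    # sample y. This loop runs for every input sample, which is what makes
    # the simulation expensive.
    y = y0
    for _ in range(max_iter):
        f = y + 2 * R * IS * np.sinh(y / VT) - x
        df = 1 + 2 * R * IS / VT * np.cosh(y / VT)
        step = f / df
        y -= step
        if abs(step) < tol:
            break
    return y
```

Warm-starting each sample with the previous output (passing the last y as y0) already cuts the iteration count significantly, since consecutive samples are close to each other.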
I’m pleased to announce a new version of scikits.optimization. The main focus of this iteration was to finish the usual unconstrained optimization algorithms.
Changelog
- Fixes on the Simplex state implementation
- Added several Quasi-Newton steps (BFGS, rank-1 update…); see the sketch below
The scikit can be installed with pip/easy_install or downloaded from PyPI.
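For the curious, the updates behind those Quasi-Newton steps are the textbook formulas below; this is just an illustration of the math, not the scikit’s actual code. H is the current inverse-Hessian approximation, s the last parameter step, and y the corresponding gradient change.

```python
import numpy as np

def bfgs_update(H, s, y):
    # BFGS update of the inverse-Hessian approximation:
    # H+ = (I - rho*s*y') H (I - rho*y*s') + rho*s*s', with rho = 1/(y's).
    rho = 1.0 / y.dot(s)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V.dot(H).dot(V.T) + rho * np.outer(s, s)

def sr1_update(H, s, y):
    # Symmetric rank-1 update of the same approximation.
    v = s - H.dot(y)
    return H + np.outer(v, v) / v.dot(y)
```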
Old announcements:
In the next version of scikits.optimization, I’ve added some Quasi-Newton steps. Before this version is released, I thought I would compare several methods of optimizing the Rosenbrock function.
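As a baseline for such a comparison, here is the Rosenbrock function and its gradient, plus a plain gradient descent that shows why smarter steps are needed (a sketch with an arbitrary step size, not the scikit’s API):

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_gradient(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

# Plain gradient descent crawls along the curved valley of the function.
x = np.array([-1.2, 1.0])  # the classic starting point
for _ in range(10000):
    x = x - 1e-3 * rosenbrock_gradient(x)
print(x)  # still noticeably short of the minimum at (1, 1)
```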
Now that version 0.2 of scikits.optimization is out, here is a tutorial on the gradient-free optimizer based on the simplex algorithm.
When the only thing you have is the cost function and you don’t have dozens of parameters, the first thing to try is a simplex algorithm.
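To give an idea of how it works, here is a simplified, self-contained Nelder-Mead sketch (the scikit’s implementation is more modular; the coefficients are the standard ones, and the contraction step is simplified):

```python
import numpy as np

def nelder_mead(f, x0, scale=0.1, tol=1e-8, max_iter=1000):
    # Only the cost function f is needed, no gradient.
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    # Initial simplex: x0 plus n points perturbed along each axis.
    simplex = [x0] + [x0 + scale * np.eye(n)[i] for i in range(n)]
    values = [f(p) for p in simplex]
    for _ in range(max_iter):
        # Order the vertices from best to worst.
        order = np.argsort(values)
        simplex = [simplex[i] for i in order]
        values = [values[i] for i in order]
        if abs(values[-1] - values[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        # Reflect the worst vertex through the centroid of the others.
        reflected = centroid + (centroid - simplex[-1])
        fr = f(reflected)
        if values[0] <= fr < values[-2]:
            simplex[-1], values[-1] = reflected, fr
        elif fr < values[0]:
            # The reflection is the new best point: try to expand further.
            expanded = centroid + 2 * (centroid - simplex[-1])
            fe = f(expanded)
            if fe < fr:
                simplex[-1], values[-1] = expanded, fe
            else:
                simplex[-1], values[-1] = reflected, fr
        else:
            # Reflection failed: contract the worst vertex toward the centroid.
            contracted = centroid + 0.5 * (simplex[-1] - centroid)
            fc = f(contracted)
            if fc < values[-1]:
                simplex[-1], values[-1] = contracted, fc
            else:
                # Last resort: shrink the whole simplex toward the best vertex.
                for i in range(1, n + 1):
                    simplex[i] = simplex[0] + 0.5 * (simplex[i] - simplex[0])
                    values[i] = f(simplex[i])
    return simplex[int(np.argmin(values))]

# Example: minimize a simple quadratic bowl, no gradient required.
print(nelder_mead(lambda x: x.dot(x - 1), [3.0, -2.0]))
```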
It has been a while, too long for sure, since my last update on this scikit. I’m pleased to announce that some algorithms, as well as some tests, are finally fixed.
Changelog:
- Fixed Polytope/Simplex/Nelder-Mead
- Fixed the Quadratic Hessian helper class
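For context, a quadratic helper is handy for testing optimizers because the gradient and the Hessian are exact and cheap. Here is a hypothetical sketch of what such a class can look like (names and interface invented for illustration, not the scikit’s actual class):

```python
import numpy as np

class Quadratic:
    # f(x) = 1/2 x'Ax - b'x, with its minimum at the solution of Ax = b.
    def __init__(self, A, b):
        self.A = np.asarray(A, dtype=float)
        self.b = np.asarray(b, dtype=float)

    def __call__(self, x):
        return 0.5 * x.dot(self.A).dot(x) - self.b.dot(x)

    def gradient(self, x):
        return self.A.dot(x) - self.b

    def hessian(self, x):
        return self.A  # constant for a quadratic
```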
Additional tutorials will be available in the coming weeks.
Old announcements:
I had to port to C++ a simplex/Nelder-Mead optimizer that I already had in Python. As with the Python version, I tried to be as generic as possible while staying as efficient as possible, so the state is no longer a dictionary but a simple structure.
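In Python terms, the change amounts to moving from a free-form dictionary to something like the dataclass below (the field names are hypothetical, for illustration):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class State:
    # Illustrative fields, mirroring the kind of keys the dictionary held.
    iteration: int        # current iteration count
    simplex: np.ndarray   # (n+1, n) array of vertices
    values: np.ndarray    # cost at each vertex
    best: np.ndarray      # best point found so far
```

A plain struct in C++ gives the same fields with compile-time checking instead of string lookups.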
I could have used the Numerical Recipes version, but the license cost is not worth it, and the code is neither generic enough nor explained enough. There are also some questionable design decisions (one method = one responsibility is not respected).