BasinHopping -> wraps scipy.optimize.basinhopping (#177)
* `parameters` can now set `value` etc. per parameter

* Added BasinHopping object

* Overwrote the minimize method for the weird ones

* Added BasinHopping to the smart Fit-object

* Added tests for BasinHopping

* Added tests for bounds and constraints

* Kicked bounds up to the local minimizer of BasinHopping

* Made the new `parameters` interface iterable-friendly.

* Updated docstrings and comments, in accordance with review

* cached_property object added, cache decorator removed.

Tests have also been provided for this new decorator.

* Replaced @cache by @cached_property

* Fixed typo

* Use `method_name` upon .execute

* Updated docstrings.

* Improved cached_property tests

We check an internal counter to make sure the cached function is called only when no cache is present.

* Updated docs to include BasinHopping as a global minimizer

* Added BasinHopping section to the docs

* Fixed indentation error

* Processed review; should now be finished!
tBuLi authored Sep 8, 2018
1 parent ee2d6e0 commit f8a0f89
Showing 7 changed files with 532 additions and 95 deletions.
96 changes: 82 additions & 14 deletions docs/fitting_types.rst
@@ -143,6 +143,8 @@ Example::
``fit_result`` is a normal :class:`~symfit.core.fit_results.FitResults` object.
As always, bounds on parameters and even constraints are supported.
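
As a minimal sketch of what that can look like (the parameter names and the
``Ge`` constraint here are illustrative only, not taken from the example)::

    from symfit import parameters, Ge

    a, b = parameters('a, b')
    a.min, a.max = 0.0, 10.0      # box bounds on ``a``
    constraints = [Ge(b, a)]      # demand that b >= a during the fit
    # then: Fit(model, x_data, y_data, constraints=constraints)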

.. _minimize_maximize:

Minimize/Maximize
-----------------
Minimize or Maximize a model subject to bounds and/or constraints. As an example
@@ -363,11 +365,12 @@ More common examples, such as dampened harmonic oscillators also work as expected

.. _global-fitting:

Fitting multiple datasets
-------------------------
A common fitting problem is to fit to multiple datasets. This is sometimes
referred to as global fitting. In such fits, some parameters might be shared
between the fits to the different datasets. The same syntax used for ODE
fitting makes this problem very easy to solve in :mod:`symfit`.

As a simple example, suppose we have two datasets measuring exponential decay,
with the same background, but different amplitude and decay rate.
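
The full worked example is collapsed in this diff view; as a rough sketch of
the idea (variable and parameter names are chosen for illustration, not taken
from the actual example)::

    import numpy as np
    from symfit import variables, parameters, Model, Fit, exp

    x, y_1, y_2 = variables('x, y_1, y_2')
    a_1, a_2, k_1, k_2, b = parameters('a_1, a_2, k_1, k_2, b')

    # Two exponential decays sharing the same background parameter ``b``.
    model = Model({
        y_1: a_1 * exp(-k_1 * x) + b,
        y_2: a_2 * exp(-k_2 * x) + b,
    })

    # Synthetic data: same background, different amplitude and decay rate.
    xdata = np.linspace(0, 10, 50)
    ydata_1 = 2.0 * np.exp(-0.8 * xdata) + 0.5 + 0.02 * np.random.randn(50)
    ydata_2 = 4.0 * np.exp(-1.5 * xdata) + 0.5 + 0.02 * np.random.randn(50)

    # Data is passed per variable name; ``b`` is fitted jointly to both sets.
    fit = Fit(model, x=xdata, y_1=ydata_1, y_2=ydata_2)
    fit_result = fit.execute()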
@@ -397,6 +400,10 @@ normal way::
:width: 500px
:alt: ODE integration

Any ``Model`` that comes to mind is fair game. Behind the scenes :mod:`symfit`
will build a least-squares objective in which the squared residuals of all the
components are summed, ready to be minimized. Unlike in the above example, the
`x`-axis does not even have to be shared between the components.
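
Schematically, for model components :math:`f_i` evaluated against data sets
:math:`y_{ij}`, the quantity minimized is of the form (a sketch of the idea,
not the exact internal implementation):

.. math::

    \chi^2(\vec{p}\,) = \sum_{i} \sum_{j} \big( y_{ij} - f_i(x_{ij}, \vec{p}\,) \big)^2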

.. warning::
The regression coefficient is not properly defined for vector-valued models,
@@ -406,8 +413,8 @@ normal way::

Do not cite the overall :math:`R^2` given by :mod:`symfit`.

Global Minimization
-------------------
Very often, there are multiple solutions to a fitting (or minimization)
problem. These are local minima of the objective function. The best solution of
course is the global minimum, but most minimization algorithms will only find a
@@ -416,35 +423,96 @@
local minimum, depending on the initial guesses for
your parameters. This can be incredibly annoying if you have no further
knowledge about your system.

Luckily, global minimizers exist which are not influenced by the initial
guesses for your parameters. In symfit, two such algorithms from :mod:`scipy`
have been wrapped for this purpose. Firstly, the
:func:`~scipy.optimize.differential_evolution` algorithm is
wrapped as :class:`~symfit.core.minimizers.DifferentialEvolution`. Secondly,
the :func:`~scipy.optimize.basinhopping` algorithm is available as
:class:`~symfit.core.minimizers.BasinHopping`. To use these minimizers,
just tell :class:`~symfit.core.fit.Fit`::

    from symfit import Parameter, Variable, Model, Fit
    from symfit.core.minimizers import DifferentialEvolution

    x = Parameter('x')
    x.min, x.max = -100, 100
    x.value = -2.5
    y = Variable('y')

    model = Model({y: x**4 - 10 * x**2 - x})  # Skewed Mexican hat
    fit = Fit(model, minimizer=DifferentialEvolution)
    fit_result = fit.execute()

However, due to how this algorithm works, it's not great at finding the exact
minimum (but it will find it if given enough time). You can work around this by
"chaining" minimizers: first run a global minimization to (hopefully) get close
to your answer, and then polish it off using a local minimizer::

    from symfit.core.minimizers import DifferentialEvolution, BFGS

    fit = Fit(model, minimizer=[DifferentialEvolution, BFGS])

.. note::
    Global minimizers such as differential evolution and basin-hopping are
    rather sensitive to their hyperparameters. You might need to play with
    those to get appropriate results, e.g.::

        fit.execute(DifferentialEvolution={'popsize': 20, 'recombination': 0.9})

.. note::
    There is no way to guarantee that the minimum found is actually the global
    minimum. Unfortunately there is no way around this. Therefore, you should
    always critically inspect the results.

Constrained Basin-Hopping
-------------------------
Worthy of special mention is the ease with which constraints or bounds can be
added to :class:`~symfit.core.minimizers.BasinHopping` when used through the
:class:`~symfit.core.fit.Fit` interface. As a very simple example, we shall
compare to an example from the :mod:`scipy` docs::

    import numpy as np
    from scipy.optimize import basinhopping

    def func2d(x):
        # Returns both the function value and its gradient, since jac=True below.
        f = np.cos(14.5 * x[0] - 0.3) + (x[1] + 0.2) * x[1] + (x[0] + 0.2) * x[0]
        df = np.zeros(2)
        df[0] = -14.5 * np.sin(14.5 * x[0] - 0.3) + 2. * x[0] + 0.2
        df[1] = 2. * x[1] + 0.2
        return f, df

    minimizer_kwargs = {"method": "L-BFGS-B", "jac": True}
    x0 = [1.0, 1.0]
    ret = basinhopping(func2d, x0, minimizer_kwargs=minimizer_kwargs, niter=200)

Let's compare to the same functionality in :mod:`symfit`::

    from symfit.core.minimizers import BasinHopping
    from symfit import parameters, Fit, cos

    x0 = [1.0, 1.0]
    x1, x2 = parameters('x1, x2', value=x0)

    model = cos(14.5 * x1 - 0.3) + (x2 + 0.2) * x2 + (x1 + 0.2) * x1

    fit = Fit(model, minimizer=BasinHopping)
    fit_result = fit.execute(niter=200)

No ``minimizer_kwargs`` have to be provided, as :mod:`symfit` will
automatically compute and provide the Jacobian and select a minimizer. In this
case, :mod:`symfit` will choose ``BFGS``. When bounds are provided,
:mod:`symfit` will switch to using ``L-BFGS-B`` instead. Setting bounds is as
simple as::

    x1.min = 0.0
    x1.max = 100.0

However, the real strength of the :mod:`symfit` syntax lies in providing constraints::

    from symfit import Eq

    constraints = [Eq(x1, x2)]
    fit = Fit(model, minimizer=BasinHopping, constraints=constraints)

This artificial example will make sure ``x1 == x2`` after fitting. If you have
read the :ref:`minimize_maximize` section, you will know how much work this
would be in pure :mod:`scipy`.
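
For comparison, a rough sketch of what wiring the same equality constraint
into :func:`scipy.optimize.basinhopping` by hand could look like (the local
minimizer choice and constraint format here are illustrative, not necessarily
what :mod:`symfit` does internally)::

    import numpy as np
    from scipy.optimize import basinhopping

    def func2d(x):
        return np.cos(14.5 * x[0] - 0.3) + (x[1] + 0.2) * x[1] + (x[0] + 0.2) * x[0]

    minimizer_kwargs = {
        "method": "SLSQP",  # a local minimizer that accepts equality constraints
        "constraints": [{"type": "eq", "fun": lambda x: x[0] - x[1]}],
    }
    ret = basinhopping(func2d, [1.0, 1.0],
                       minimizer_kwargs=minimizer_kwargs, niter=200)

On top of that, bounds and the Jacobian would also have to be wired in by hand.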

Advanced usage
--------------
