
Global optimization in SciPy

1. Global optimization in SciPy

We have been working with optimization problems that have only one optimum. Now, let's explore global optimization.

2. Global optimization

Global optimization finds the best outcome when there are multiple good options available. For instance, consider a firm's profit where production costs alternate between being more expensive and cheaper as the firm grows. During expensive periods, growth is harder due to rising costs, while during cheaper periods, growth is easier. Given the red dash-dotted cost and the orange dashed demand, the blue solid profit has two local maxima, at q=5 and q=18, and a local minimum at q=10. These are potential options for maximizing profit, but we need to identify the best one, known as the global maximum.
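The exact profit function isn't shown in the transcript, but as a minimal sketch, a quartic with this shape, assuming hypothetical coefficients chosen so the critical points fall at q=5, 10, and 18, could look like this:

def profit(q):
    # Hypothetical quartic: local maxima at q=5 and q=18, local minimum at q=10
    return -(0.25 * q**4 - 11 * q**3 + 160 * q**2 - 900 * q)

print(profit(5), profit(10), profit(18))  # 1718.75 1500.0 2268.0, so q=18 is the global maximum

The examples below reuse this stand-in objective in place of the course's own function.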

3. Getting stuck in a local optimum

A common problem when searching for the global solution is getting stuck in a local one. Let's try to find the global maximum, which occurs where q equals 18, using minimize_scalar. We define the objective, a fourth-order polynomial, then invoke minimize_scalar and print the results. According to minimize_scalar, the optimization terminated successfully, but the maximum found is at q=5. minimize_scalar got trapped in the smaller local maximum and terminated, ignoring the global maximum.
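A minimal sketch using the hypothetical quartic from above, negated because minimize_scalar minimizes; with this stand-in objective, the default bracketing also lands in the smaller local maximum:

from scipy.optimize import minimize_scalar

def neg_profit(q):
    # Negative of the hypothetical profit, so minimizing finds a profit maximum
    return 0.25 * q**4 - 11 * q**3 + 160 * q**2 - 900 * q

res = minimize_scalar(neg_profit)
print(res.success, res.x)  # reports success, but lands near q=5, the local maximum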

4. Getting stuck in a local optimum

Let's try the minimize function. Recall there are two local maxima, at q=5 and q=18, with the latter being the global maximum, and a local minimum at q=10. If we provide an initial guess of 9, the algorithm finds the smaller of the two local maxima. Worse, if we provide a value at the local minimum as the initial guess, the algorithm is trapped there!
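A minimal sketch with the same stand-in objective; note that minimize expects an array-valued argument:

from scipy.optimize import minimize

def neg_profit(q):
    # Same hypothetical quartic, negated, written for array input
    return 0.25 * q[0]**4 - 11 * q[0]**3 + 160 * q[0]**2 - 900 * q[0]

print(minimize(neg_profit, x0=[9]).x)   # rolls downhill to the smaller maximum near q=5
print(minimize(neg_profit, x0=[10]).x)  # gradient is ~0 at the local minimum, so it stops at q=10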

5. Getting unstuck with basinhopping

In our pursuit of the global maximum, we will use basinhopping, which we import from scipy.optimize. The syntax is quite similar to minimize. basinhopping performs minimization, so for our problem we need to provide the negative of the objective along with an initial guess. The third argument of basinhopping, niter, is the number of iterations; here we use 10000. Higher values of niter result in a more thorough search but a slower execution. Feel free to play around with this parameter. This time, our result correctly found 18, the global maximum.
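A minimal sketch of the basinhopping call, again with the hypothetical objective:

from scipy.optimize import basinhopping

def neg_profit(q):
    # basinhopping minimizes, so we pass the negative of profit
    return 0.25 * q[0]**4 - 11 * q[0]**3 + 160 * q[0]**2 - 900 * q[0]

result = basinhopping(neg_profit, x0=[9], niter=10000)
print(result.x)  # ~18, the global maximum of profit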

6. basinhopping with kwargs and bounds

basinhopping also takes the same optional arguments as minimize. One is kwargs, short for keyword arguments, which we provide in dictionary format. Here we provide bounds of 0 and 30 for the quantity using the constraints argument and NonlinearConstraint. Next, kwargs is passed to basinhopping via the minimizer_kwargs argument. Printing the results yields the global maximum. A good rule is to always provide bounds and constraints when they are known, as this speeds up finding the global optimum.
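A sketch of that setup; the kwargs variable name and the objective are assumptions, and minimize automatically selects a constraint-aware method (SLSQP) when constraints are present:

from scipy.optimize import basinhopping, NonlinearConstraint

def neg_profit(q):
    return 0.25 * q[0]**4 - 11 * q[0]**3 + 160 * q[0]**2 - 900 * q[0]

# Keep the quantity between 0 and 30
kwargs = {"constraints": NonlinearConstraint(lambda q: q[0], 0, 30)}

result = basinhopping(neg_profit, x0=[9], niter=10000, minimizer_kwargs=kwargs)
print(result.x)  # ~18, the global maximum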

7. basinhopping with callback

Another useful argument of basinhopping is callback, which allows us to monitor the optimization process more closely. We set up the problem as before, then define a callback function that takes three arguments: x is the current minimum being examined, f is the value of the objective at x, and accept is a Boolean that is True if basinhopping deems x a local optimum. We define a list called maxima. Whenever basinhopping examines a point, it calls callback, which appends that point to the list maxima.
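A minimal sketch of the callback setup, with the same hypothetical objective:

from scipy.optimize import basinhopping

def neg_profit(q):
    return 0.25 * q[0]**4 - 11 * q[0]**3 + 160 * q[0]**2 - 900 * q[0]

maxima = []

def callback(x, f, accept):
    # x: the candidate minimum, f: objective value at x,
    # accept: True if basinhopping deems x a local optimum
    maxima.append((x[0], -f, accept))

result = basinhopping(neg_profit, x0=[9], niter=5, callback=callback)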

8. basinhopping - callback output

Using niter of 5, we found both local maxima. Note that the method is random, or stochastic, so if we want to replicate our results we should set the seed argument of basinhopping.
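For reproducibility, the run above can be repeated with a fixed seed; a sketch, assuming the same stand-in objective:

from scipy.optimize import basinhopping

def neg_profit(q):
    return 0.25 * q[0]**4 - 11 * q[0]**3 + 160 * q[0]**2 - 900 * q[0]

maxima = []

def callback(x, f, accept):
    maxima.append(round(x[0], 2))

# seed fixes the random step sequence, so every rerun visits the same points
basinhopping(neg_profit, x0=[9], niter=5, callback=callback, seed=42)
print(maxima)  # with this stand-in objective, the points cluster near 5 and 18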

9. Let's practice!

Time for some practice!