1. Multivariate optimization
We've learned about univariate calculus, where the objective functions we optimized were limited to one variable. The real world is typically messier, with objective functions that depend on multiple variables.
2. The biscuit factory
Say we own a biscuit factory. Let's consider a popular function in economics called the production function, which can be used to relate a firm's production inputs to its outputs.
We'll consider the production function, or number of biscuits being produced, to be dependent on two variables: L, the labor or hours worked, and K, the capital or hours of machine operation. With two input variables, maximizing output is a multivariate optimization problem.
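The video doesn't show the concrete form of the production function, so as an illustration, here is a common choice in economics, a Cobb-Douglas-style function F(L, K) = sqrt(L·K) (an assumption, not the course's actual F):

```python
# Illustrative Cobb-Douglas production function (an assumption:
# the video does not specify the exact form of F).
def production(L, K):
    """Biscuits produced from L labor hours and K machine hours."""
    return (L * K) ** 0.5

# With this form, doubling both inputs doubles output.
print(production(4, 9))    # 6.0
print(production(8, 18))   # 12.0
```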
3. Partial derivatives
In univariate problems, the derivative tells us the slope of the objective function, that is, how the output changes as the single input variable changes. When we have multiple variables, we can use partial derivatives and follow a similar optimization workflow.
Partial derivatives help us understand how the function changes with respect to each variable independently. In our biscuit factory, both the number of hours worked, L, and the number of machine hours, K, impact biscuit output.
Partial derivatives allow us to observe how the function behaves when only one variable changes, holding all other variables constant. So the partial derivative of F with respect to K describes how changing K impacts F, assuming the other variable, L, is held constant. The mathematical notation is very similar to the univariate case, but we use curly-d's instead of normal d's.
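Written out, the notation from the narration looks like this, with the curly-d ∂ replacing the ordinary d of univariate calculus:

```latex
% Partial derivatives of F: each varies one input, holding the other fixed
\frac{\partial F}{\partial K} \quad \text{and} \quad \frac{\partial F}{\partial L}
```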
Let's look at how to solve multivariate problems with SymPy.
4. Solving multivariate problems
We begin by importing symbols, diff, and solve from SymPy.
We define our variables K and L as symbols, and then use them to define the production function, F.
Here's where the multivariate case diverges from the univariate. Instead of calling diff on F only, we need to call diff on F twice, specifying K as the second argument in one call and L in the other.
Here, we compute the partial derivatives: the first with respect to K, and the second with respect to L.
To find where both partial derivatives are zero, we pass them in a list to SymPy's solve function, followed by a tuple of the symbols to solve for. As in the univariate case, solve will solve the equations and output the critical points, where the slope is zero.
Let's print the results. We got an empty list! What could have gone wrong here? Let's take a closer look at our objective function.
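Putting the steps together, the workflow might look like this; since the course's exact production function isn't shown, a square-root Cobb-Douglas form is assumed here, which reproduces the empty result described above:

```python
from sympy import symbols, diff, solve, Rational

# K and L are positive quantities in an economic setting
K, L = symbols('K L', positive=True)

# Assumed Cobb-Douglas production function (the course's exact F is not shown)
F = K**Rational(1, 2) * L**Rational(1, 2)

# Partial derivatives: dFdK holds L constant, dFdL holds K constant
dFdK = diff(F, K)
dFdL = diff(F, L)

# Find points where both partial derivatives are zero
critical_points = solve([dFdK, dFdL], (K, L))
print(critical_points)  # [] -- no critical points
```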
5. Our production function
Since our objective function has two variables, a 3D plot can be used to view its behavior.
We can see that, as K and L increase, the number of biscuits we produce increases, but there are no maxima or minima!
This is why our SymPy code returned an empty list: this function has no critical points!
6. The limits of differentiation
Derivatives are a powerful tool for optimization; however, caution is necessary.
Some functions do not have a derivative everywhere. The v-shaped function in the top figure, for example, has no derivative at its sharp corner.
If a function has discontinuities, as in the middle figure that shows a gap in the line, it is not differentiable at those points either.
Lastly, a function may have no maximum or minimum at all, as we saw with our production function: it's possible for a function to increase or decrease indefinitely.
The first two are examples of non-differentiable functions, which cannot be optimized using differentiation; the last is differentiable everywhere but has no critical points for differentiation to find.
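The v-shaped case can be checked with SymPy itself: the one-sided limits of the difference quotient of |x| at zero disagree, so no derivative exists there. A minimal sketch of that check:

```python
from sympy import symbols, Abs, limit

h = symbols('h', real=True)

# Difference quotient of f(x) = |x| at x = 0: (|0 + h| - |0|) / h
quotient = Abs(h) / h

# Slope approaching from the right vs. from the left
right = limit(quotient, h, 0, '+')   # 1
left = limit(quotient, h, 0, '-')    # -1

# The one-sided limits disagree, so |x| has no derivative at x = 0
print(right, left)
```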
7. Let's practice!
As we move through the course, we'll continue to add more tools to our optimization toolkit for tackling more complex problems. For now, time for some practice!