The name Linear Programming is a bit misleading: it is not programming in the sense of writing Python code. Instead, it is just a class of optimization models consisting of two parts:
A linear objective function. This is the part that you optimize. In economics it may be maximizing profits, or minimizing costs. But the methodology is much more general than those two applications.
A set of linear constraints. If we are optimizing production in a small business, these constraints could include the number of employees, raw materials, equipment capacity, etc.
We introduce a simple linear program involving a baker with a given amount of raw inputs: flour, sugar, and butter, and a goal of maximizing profits by selling a combination of two types of cookies. For this simple set-up we can solve the problem graphically and gain some intuition. We will use this example throughout.
Use the graphical method to solve the following LP problem
$$\max\; 3x_1 + 4x_2$$
$$\text{s.t.}$$
$$3x_1 + 2x_2 \le 6$$
$$x_1 + 4x_2 \le 4$$
$$x_1 \ge 0,\quad x_2 \ge 0$$
(EMEA 19.1.a)
We can ask a seemingly simple question: How do the profits of our baker increase if we increase the amount of flour we have by 1 kg? We can work this out easily in this example.
We can interpret the resulting increase in profits either as a marginal profit of flour, or as what economists call a Shadow Price: the value the baker puts on an extra kg of flour.
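As a sketch of that calculation, using the specific numbers that appear in the lab later in this section (profit $20x_1 + 30x_2$; flour $3x_1 + 6x_2 \le 150$, sugar $x_1 + 0.5x_2 \le 22$, butter $x_1 + x_2 \le 27.5$): at the optimum the flour and butter constraints bind, giving $x_1 = 5$, $x_2 = 22.5$ and a profit of 775. Adding one extra kg of flour shifts that intersection point:
$$3x_1 + 6x_2 = 151, \qquad x_1 + x_2 = 27.5 \quad\Rightarrow\quad x_1 \approx 4.67,\; x_2 \approx 22.83$$
$$\text{new profit} \approx 20(4.67) + 30(22.83) \approx 778.33$$
So profit rises by roughly $3.33$ (in fact exactly $10/3$) per extra kg of flour; this number is the shadow price of flour.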
As we will see more clearly in the next videos, these marginal values let us reformulate our linear program in an alternative form called the Dual problem.
Now we take our discussion of shadow prices and the dual problem one step further. We take the shadow prices that we started to derive in the previous video and use them to show that the profit we found in our solution of our LP really is the maximum amount we can earn.
Our calculations on our baker problem until now have been leading up to an alternative statement of our linear program called the dual problem. Here we write our dual formulation for our baker problem.
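Sketched here with the same ingredient amounts and profits used in the lab below (so the structure is visible, with one dual variable per ingredient), the dual of the baker problem reads:
$$\min\; 150u_1 + 22u_2 + 27.5u_3$$
$$\text{s.t.}$$
$$3u_1 + u_2 + u_3 \ge 20$$
$$6u_1 + 0.5u_2 + u_3 \ge 30$$
$$u_1, u_2, u_3 \ge 0$$
where $u_1$, $u_2$, and $u_3$ are the shadow prices of flour, sugar, and butter.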
Consider the LP problem:
$$\min\; 10u_1 + 27u_2$$
$$\text{s.t.}$$
$$u_1 + 3u_2 \ge 11$$
$$2u_1 + 5u_2 \ge 20$$
$$u_1 \ge 0,\quad u_2 \ge 0$$
Write down the dual to the program and solve.
(EMEA Exercise 19.2.3)
Duality is not limited to economic problems where we are maximizing profits or, as the dual, minimizing costs. Any problem you can pose as a linear program also has a dual formulation.
In real-life situations a problem can involve many variables and many constraints. Writing out the problem in equation form then becomes unwieldy. Luckily, linear programs, as well as their duals, can easily be formulated in matrix form.
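In this matrix notation (the standard convention, matching the sign pattern of our baker problem), a primal maximization problem and its dual look like:
$$\max_{x \ge 0}\; c^\top x \quad \text{s.t.}\quad Ax \le b$$
$$\min_{u \ge 0}\; b^\top u \quad \text{s.t.}\quad A^\top u \ge c$$
Each row of $A$ is one constraint of the primal, and each column of $A$ becomes one constraint of the dual.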
We can summarize our discussion of duality by stating, somewhat more formally, some general theorems of duality.
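For reference, the two central results, stated in the matrix notation above, are: weak duality, which says that any feasible $x$ in the primal and any feasible $u$ in the dual satisfy $c^\top x \le b^\top u$; and strong duality, which says that if either problem has an optimal solution, then so does the other, and the two optimal values are equal.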
Consider the LP problem:
$$\max\; 2x + 7y$$
$$\text{s.t.}$$
$$4x + 5y \le 20$$
$$3x + 7y \le 21$$
$$x \ge 0,\quad y \ge 0$$
a.) Solve it by a graphical argument
b.) Write down the dual and solve it by a graphical argument
c.) Are the values of the objective functions equal?
(EMEA 19.3.1)
Recall in our baker problem that the solution involved finding the intersection of the flour and butter constraints, but not the sugar constraint. The reasoning was that at the optimum, we used up all our flour and butter, but had some sugar left.
When we calculated our shadow prices (the dual problem), we ended up getting positive shadow prices for flour and butter, but the shadow price for sugar was zero. This makes sense: since we still have excess sugar left at the optimal production levels, the value of extra sugar is zero.
This pattern turns out to be generally true in linear programs and is called complementary slackness: when a constraint does not bind in the primal, the corresponding shadow price in the dual is zero. The same relationship holds in the other direction: if a constraint in the dual does not bind, then the corresponding primal variable is zero (in the baker example, none of that type of cookie gets produced).
It turns out that complementary slackness is a useful tool for finding solutions of linear programs by helping us identify which constraints bind.
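Written out in the matrix notation from before, the complementary slackness conditions at an optimal pair $x^*$, $u^*$ are:
$$u_i^*\,\bigl(b_i - (Ax^*)_i\bigr) = 0 \quad \text{for every constraint } i$$
$$x_j^*\,\bigl((A^\top u^*)_j - c_j\bigr) = 0 \quad \text{for every variable } j$$
So a slack constraint forces its shadow price to zero, and a positive shadow price forces its constraint to bind; likewise for dual constraints and primal variables.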
Consider the following problem:
$$\min\; y_1 + 2y_2$$
$$\text{s.t.}$$
$$y_1 + 6y_2 \ge 15$$
$$y_1 + y_2 \ge 5$$
$$-y_1 + y_2 \ge -5$$
$$y_1 - 2y_2 \ge -20$$
$$y_1 \ge 0,\quad y_2 \ge 0$$
a.) Solve this LP graphically
b.) Write down the dual problem and solve it
c.) If the first constraint $y_1 + 6y_2 \ge 15$ is changed to $y_1 + 6y_2 >15.1$, what happens to the optimal values of the dual variables?
(EMEA exercise 19.5.2)
Python and the scipy package have some good tools for setting up and solving linear programming problems. We will take a very short look in this lab, leaving a fuller discussion for the next lab.
import numpy as np
from scipy.optimize import linprog
import matplotlib.pyplot as plt
from matplotlib.patches import Polygon
%matplotlib inline
Here we'll start by showing how to formulate the baker problem we defined and solved graphically and by hand earlier, but this time letting scipy's simplex algorithm find the solution.
We write our objective function as a vector of coefficients. Recall the objective function:
$20x_1 + 30x_2$
c_ex1 = np.array([20, 30])
We will write the coefficients of our ingredient constraints as a matrix, while the actual constraining amounts will be written as a vector. So our initial set of conditions
$3x_1 + 6x_2 \le 150$ (flour)
$x_1 + .5x_2 \le 22$ (sugar)
$x_1 + x_2 \le 27.5$ (butter)
becomes
A_ex1 = np.array([[3, 6],
                  [1, .5],
                  [1, 1]])
b_ex1 = np.array([150, 22, 27.5])
And then we solve, with one twist: our solver always assumes we are doing minimization, but here we are doing maximization. The easy solution is to add a negative sign to our objective array (which means the reported objective value will also come out negative).
# Solve the problem
res_ex1 = linprog(-c_ex1, A_ub=A_ex1, b_ub=b_ex1, method='revised simplex')
res_ex1
     con: array([], dtype=float64)
     fun: -775.0
 message: 'Optimization terminated successfully.'
     nit: 2
   slack: array([0. , 5.75, 0. ])
  status: 0
 success: True
       x: array([ 5. , 22.5])
We can confirm that this is the same solution we found earlier: $x_1 = 5$, $x_2 = 22.5$, with a profit of 775 (remembering that fun is the negative of our maximized profit).
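As a quick cross-check (a sketch, not part of the original lab: the names c_dual, A_dual, b_dual, and res_dual are just illustrative), we can also let linprog solve the dual of the baker problem directly. Given the slack array above, where only the sugar constraint has slack, we expect a zero shadow price for sugar, positive shadow prices for flour and butter, and a dual objective equal to 775.
# Dual of the baker problem: min b^T u  subject to  A^T u >= c,  u >= 0.
# linprog only accepts <= inequality constraints, so we multiply A^T u >= c by -1.
c_dual = b_ex1                # resource amounts become the dual objective coefficients
A_dual = -A_ex1.T             # A^T u >= c_ex1 rewritten as -A^T u <= -c_ex1
b_dual = -c_ex1
res_dual = linprog(c_dual, A_ub=A_dual, b_ub=b_dual, method='revised simplex')
print(res_dual.x)             # shadow prices for flour, sugar, butter (we expect roughly [3.33, 0, 10])
print(res_dual.fun)           # by strong duality this should equal 775, the maximal profit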
Consider the LP problem:
$$\max\; x + 2y$$
$$\text{s.t.}$$
$$x + y \le 4$$
$$-x + y \le 1$$
$$2x - y \le 3$$
$$x \ge 0,\quad y \ge 0$$
a.) Find the solution
b.) Formulate and solve the dual problem
(EMEA Review Exercise 19.1)
Consider the LP problem:
$$\min\; 16y_1 + 6y_2 - 8y_3 - 15y_4$$
$$\text{s.t.}$$
$$-y_1 + y_2 - 2y_3 - 4y_4 \ge -1$$
$$2y_1 - 2y_2 - y_3 - 5y_4 \ge 1$$
$$y_i \ge 0 \text{ for } i = 1, 2, 3, 4$$
a.) Write down the dual problem and solve it
b.) Find the solution to the primal problem
c.) If the first constraint in the primal is changed to $-y_1 + y_2 - 2y_3 - 4y_4 \ge k$, for what values of $k$ will the solution of the dual occur at the same point as for $k = -1$?
(EMEA Review Exercise 19.2)
The production of three goods requires using two machines. Machine 1 can be utilized for $b_1$ hours, while machine 2 can be utilized for $b_2$ hours. The time spent for the production of one unit of each good is given by the following table:
| | Machine 1 | Machine 2 |
|---|---|---|
| Good 1 | 3 | 2 |
| Good 2 | 1 | 2 |
| Good 3 | 4 | 1 |
The profits per unit produced of the three goods are 6, 3, and 4, respectively.
a.) Write down the LP problem this leads to. Write down the dual.
b.) Solve the problem, using Python, when $b_1 = b_2 = 100$
c.) Consider a situation where a new machine is added to increase the efficiency of the process. The new machine can be utilized for $b_3$ hours. The table below gives the new time spent for the production of one unit of each good.
| | Machine 1 | Machine 2 | Machine 3 |
|---|---|---|---|
| Good 1 | 1.5 | 1 | 1 |
| Good 2 | .5 | .5 | 1 |
| Good 3 | .5 | 1 | .5 |
We allow each machine to be used for $b_1 = b_2 = b_3 = 66.7$ hours (so that the total amount of machine time is approximately the same).