Reduced cost
In linear programming, reduced cost, or opportunity cost, is the amount by which an objective function coefficient would have to improve (increase for a maximization problem, decrease for a minimization problem) before the corresponding variable could assume a positive value in the optimal solution. It is the cost of increasing a variable by a small amount, i.e. the first derivative of the objective from a certain point on the polyhedron that constrains the problem. When the point is a vertex of the polyhedron, the variable with the most extreme reduced cost, negative for minimization and positive for maximization, is sometimes referred to as the steepest edge.

Given a system minimize $\mathbf{c}^T \mathbf{x}$ subject to $A\mathbf{x} \le \mathbf{b}$, $\mathbf{x} \ge 0$, the reduced cost vector can be computed as $\mathbf{c} - A^T \mathbf{y}$, where $\mathbf{y}$ is the dual cost vector.
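
As a minimal illustration, the sketch below computes this reduced cost vector with NumPy. The problem data c, A, b and the dual vector y are made-up values, chosen so that the optimum is x = (3, 0).

    import numpy as np

    # Made-up example: minimize c^T x  subject to  A x <= b,  x >= 0.
    c = np.array([-2.0, -1.0])   # objective coefficients
    A = np.array([[1.0, 1.0]])   # single constraint: x1 + x2 <= 3
    b = np.array([3.0])
    y = np.array([-2.0])         # dual cost vector at the optimum x = (3, 0)

    # Reduced cost vector d = c - A^T y.
    d = c - A.T @ y
    print(d)   # [0. 1.]: x1 is basic (d1 = 0), x2 is non-basic (d2 = 1 > 0)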

It follows directly that for a minimization problem, any non-basic variable at its lower bound with a strictly negative reduced cost is eligible to enter the basis, while every basic variable must have a reduced cost of exactly 0. For a maximization problem, the non-basic variables at their lower bounds that are eligible to enter the basis have strictly positive reduced costs.
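
A small sketch of that eligibility test for the minimization case follows; the reduced costs and the basis are made-up illustration inputs.

    def entering_candidates(d, basic):
        # Minimization problem: non-basic variables (here, all variables not
        # in the basis, sitting at their lower bound) with strictly negative
        # reduced cost are eligible to enter the basis.
        return [j for j in range(len(d)) if j not in basic and d[j] < 0]

    # Hypothetical reduced costs and basis.
    d = [0.0, -1.5, 0.2, -0.3]
    print(entering_candidates(d, basic={0}))   # [1, 3]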

Interpretation

For the case where x and y are optimal, the reduced costs can help explain why the variables attain the values they do. For each variable $x_j$, the reduced cost $d_j = c_j - (A^T \mathbf{y})_j$ shows, term by term, which constraints force the variable up and which force it down. For a non-basic variable, the distance of its reduced cost from zero gives the minimal change in its objective coefficient needed to change the solution vector x.
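
To make that last point concrete, the sketch below solves the same made-up LP with SciPy, reads the dual values off the HiGHS solution, and then lowers the non-basic variable's objective coefficient by slightly more than its reduced cost, after which that variable becomes positive in the new optimum. All numbers are illustration values.

    import numpy as np
    from scipy.optimize import linprog

    c = np.array([-2.0, -1.0])
    A = np.array([[1.0, 1.0]])   # x1 + x2 <= 3, x >= 0 (linprog's default bounds)
    b = np.array([3.0])

    res = linprog(c, A_ub=A, b_ub=b, method="highs")
    y = res.ineqlin.marginals    # dual cost vector
    d = c - A.T @ y              # reduced costs, here [0, 1]
    print(res.x, d)              # x = [3, 0]; x2 is non-basic with d2 = 1

    # Lower c2 by slightly more than its reduced cost: x2 enters the solution.
    c2 = c - np.array([0.0, d[1] + 0.1])
    print(linprog(c2, A_ub=A, b_ub=b, method="highs").x)   # x = [0, 3]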

Reduced costs in pivot strategy

In principle, a good pivot strategy would be to select whichever variable has the greatest reduced cost. However, the steepest edge might ultimately not be the most attractive, as the edge might be very short, affording only a small improvement in the objective function value. From a computational point of view, another problem is that to compute the steepest edge, an inner product must be computed for every variable in the system, making the computational cost too high in many cases. The Devex algorithm attempts to overcome the latter problem by estimating the reduced costs rather than calculating them at every pivot step, exploiting the fact that a single pivot step rarely alters the reduced costs of all variables dramatically.
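
The contrast between the two pricing rules can be sketched as follows, for a minimization problem. This is a simplified illustration of Dantzig pricing versus an exact steepest-edge-style rule, not the Devex algorithm itself; the function names and the normalization are assumptions made for the example.

    import numpy as np

    def dantzig_entering(d, nonbasic):
        # Dantzig rule: pick the non-basic variable with the most
        # negative reduced cost.
        j = min(nonbasic, key=lambda k: d[k])
        return j if d[j] < 0 else None   # None: current basis is optimal

    def steepest_edge_entering(d, nonbasic, B_inv, A):
        # Steepest-edge-style rule: scale each reduced cost by the length of
        # its edge direction, so a large |d_j| along a very short edge does
        # not beat a smaller |d_j| along a long, productive edge. Note the
        # inner product B_inv @ A[:, j] for every candidate, which is what
        # makes exact steepest-edge pricing expensive.
        best, best_score = None, 0.0
        for j in nonbasic:
            w = B_inv @ A[:, j]          # edge direction through the basis
            score = d[j] / np.sqrt(1.0 + w @ w)
            if score < best_score:
                best, best_score = j, score
        return best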