Robustness and duality in linear programming

Abstract In this paper, we consider a linear program in which the right hand sides of the constraints are uncertain and inaccurate. This uncertainty is represented by intervals: each right hand side can take any value in its interval, independently of the other constraints. The problem is then to determine a robust solution, which is satisfactory for all possible coefficient values. Classical criteria, such as the worst case and the maximum regret, are applied to define different robust versions of the initial linear program. More recently, Bertsimas and Sim have proposed a new model that generalizes the worst case criterion. The subject of this paper is to establish the relationships between linear programs with uncertain right hand sides and linear programs with uncertain objective function coefficients using the classical duality theory. We show that the uncertainty can be transferred from the right hand sides to the objective function coefficients by establishing new duality relations. When the right hand sides are approximated by intervals, we also propose an extension of Bertsimas and Sim's model and we show that the maximum regret criterion is equivalent to the worst case criterion.


Introduction and motivations
In a mathematical model representing a real decision problem, it may be difficult to assign a single value to some parameters because of uncertainty or inaccuracy. These factors may be different in nature (computation errors, random phenomena, uncertain future and so on), and can be represented by various models: scenarios, probability laws, intervals, fuzzy numbers and so on. The difficulty comes from the fact that there may not exist a solution that is optimal for all possible parameter values. Thus, in the case of probabilistic models, the problem is often to determine the solution that optimizes the expected value criterion. Without probability laws, the goal is to determine a solution that is relatively satisfactory for all possible parameter values, such a solution being described as robust. Several definitions of robustness have been proposed in the literature (depending on the decision context), and several robustness criteria have been introduced (see e.g. Soyster, 1973; Kouvelis and Yu, 1997; Ben-Tal and Nemirovski, 1999; Vincke, 1999; Bertsimas and Sim, 2004; Roy, 2005).
In this paper, we are interested in two criteria commonly used in robust optimization: the worst case and the maximum regret criteria. We consider that the decision problem is represented by a general linear program, denoted (P), involving n variables x_j and m constraints:

(P): min cx
     s.t. Ax ≥ b
          x ≥ 0

In this model, the uncertainty can affect all or part of the exogenous parameters: the objective function coefficients c, the constraint matrix coefficients A and/or the right hand sides b. We will assume that the uncertainty can be represented by intervals, that is to say that an uncertain coefficient a is represented by an interval [ā − â, ā + â] centered on the nominal value ā. A scenario is given by a single value for each uncertain coefficient (belonging to its interval), each coefficient varying independently of the others.

* Correspondence: V Gabrel, Université Paris-Dauphine, LAMSADE, Place du Maréchal de Lattre de Tassigny, F-75016 Paris, France. E-mails: gabrel@lamsade.dauphine.fr; murat@lamsade.dauphine.fr
In this context, a large number of results have already been obtained when the objective function coefficients are interval numbers (Inuiguchi and Sakawa, 1995; Laguna, 1998, 1999; Averbakh and Lebedev, 2005). The robust version according to the worst case criterion is an easy-to-solve linear program. Averbakh and Lebedev (2005) show that the robust version according to the maximum regret criterion is NP-hard. Several methods (exact and approximate) have already been proposed in Inuiguchi and Sakawa (1995) and in Laguna (1998, 1999) to solve the robust version according to the maximum regret criterion (see Section 4.1 below for a definition of maximum regret).
The case of uncertainty on the constraint matrix coefficients has also been studied; see e.g. Soyster (1973) and Chinneck and Ramadan (2000). The problem is often to determine a solution that is feasible in all the possible scenarios, which is similar to the worst case approach. More recently, Bertsimas and Sim (2003, 2004) have proposed a new polynomial model, less conservative than the worst case criterion. The case of uncertainty only on the right hand sides of the constraints has not been specifically studied. Only Minoux (2009) has studied this case and showed that the classical duality relationships cannot be applied to the robust version.
We consider the case of uncertainty on the right hand sides only. We apply successively the worst case criterion and Bertsimas and Sim's model. Using duality theory, we transfer the uncertainty to the objective function coefficients in order to exploit the many results already obtained in that context. We therefore analyse different versions of the robust program and, for each version, we define the dual program. We find, as in Minoux (2009), that the dual of the robust linear program does not coincide with the robust version of the dual linear program in which the uncertainty is on the objective function coefficients. However, we show that, in the dual of the robust linear program, the criterion of robustness has also (and logically) been dualized, the dual criterion of the worst case being the best case. This makes it possible to transfer the uncertainty from the right hand sides towards the objective function coefficients. Considering Bertsimas and Sim's model, these new duality relationships lead us to propose a variant of the original model that is much more relevant for assessing the robustness of a solution. Lastly, we analyse the maximum regret criterion; we show that, in the case of uncertainty on the right hand sides, the robust version according to this criterion is equivalent to the robust version according to the worst case criterion.
In the next section, we present the robust version according to the worst case criterion of a linear program with uncertain objective function coefficients, then with uncertain right hand sides. In each case, we introduce the reverse criterion, called the best case criterion, which will be useful to establish the duality relationships between the different versions at the end of Section 2. In Section 3, we focus on the model of Bertsimas and Sim, which we present as a generalization of the worst case criterion. After having presented the robust version of this model, we introduce a generalization of the best case criterion to establish new duality relations. We conclude in Section 4 by analysing the maximum regret criterion. Some notes on the notation used in this paper have been inserted at the end, before the bibliography.

The robustness according to the worst case criterion
In the following, we assume that (P) has a non-empty feasible solution set and a bounded optimal value. We denote by v(P) the optimal solution value of (P).

Uncertainty on the objective function coefficients
In (P), let us suppose that the objective function coefficients are uncertain: for all j = 1, …, n, c̄_j − ĉ_j ≤ c_j ≤ c̄_j + ĉ_j, with ĉ_j ≥ 0.
2.1.1. The worst case criterion
The evaluation of a solution x on the worst case criterion is obtained by considering the worst scenario as follows:

f_wor(x) = max { cx : c̄ − ĉ ≤ c ≤ c̄ + ĉ }

This value f_wor(x) constitutes an absolute guarantee on the value of x: under any scenario, the value of x is lower than or equal to f_wor(x). This is why f_wor(x) can be considered as a robustness measure.
A solution minimizing f_wor is obtained by solving the problem denoted (P^obj_W):

(P^obj_W): min (c̄ + ĉ)x
           s.t. Ax ≥ b
                x ≥ 0

As x ≥ 0, the worst scenario corresponds to c = c̄ + ĉ. Thus, (P^obj_W) corresponds to (P) in which each objective function coefficient is set to its greatest value.
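Since (P^obj_W) is an ordinary linear program, it can be solved with any LP solver. A minimal sketch using scipy, on illustrative data that is assumed here (not taken from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (assumed for this sketch, not from the paper):
# nominal costs c_bar, deviations c_hat, constraints Ax >= b, x >= 0.
c_bar = np.array([4.0, 1.0])
c_hat = np.array([1.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]])
b = np.array([3.0, 1.0])

# Since x >= 0, the worst scenario is c = c_bar + c_hat, so (P^obj_W) is
# the nominal LP with every cost coefficient at its upper bound.
res = linprog(c_bar + c_hat, A_ub=-A, b_ub=-b)  # Ax >= b  <=>  -Ax <= -b
print(res.x, res.fun)  # x = (5/3, 2/3), value 29/3
```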
(P^obj_W) is the robust version of (P) when the worst case criterion is considered. Conversely, we can define the best case criterion, which determines the best feasible solution over all the possible scenarios.

2.1.2. The best case criterion
The evaluation of a solution x on the best case criterion is obtained by considering the most favourable scenario as follows:

f_bes(x) = min { cx : c̄ − ĉ ≤ c ≤ c̄ + ĉ } = (c̄ − ĉ)x  (since x ≥ 0)

The problem, denoted (P^obj_B), is:

(P^obj_B): min (c̄ − ĉ)x
           s.t. Ax ≥ b
                x ≥ 0

Uncertainty on right hand sides
In the problem (P), one assumes that each right hand side is approximated by an interval number: for all i = 1, …, m, b̄_i − b̂_i ≤ b_i ≤ b̄_i + b̂_i, with b̂_i ≥ 0.

2.2.1. The worst case criterion
The uncertainty now concerns the feasibility of a solution. A scenario b is adverse to a solution x when x does not belong to the set of feasible solutions defined by X_b = {x ∈ R^n : Ax ≥ b, x ≥ 0}. So, for a given solution x, two cases have to be considered:
• either x is feasible under all the scenarios, that is for all the possible values of b, and in this case f_wor(x) = cx under any b,
• or there is at least one value of b for which x is not feasible and, in this case, f_wor(x) = +∞.
As we have to minimize f_wor(x), the solutions with an evaluation equal to +∞ cannot be optimal, and the optimal solution belongs to those which are feasible under all the scenarios. Given the sense of the constraints, the set of the feasible solutions under all the scenarios is {x ≥ 0 : Ax ≥ b̄ + b̂}. Therefore, to determine a solution optimizing the worst case criterion when each right hand side belongs to an interval, we have to solve the following linear program, denoted (P^rhs_W):

(P^rhs_W): min cx
           s.t. Ax ≥ b̄ + b̂
                x ≥ 0

Example 1 Let us consider the following linear program:

(P): min 4x_1 + x_2
     s.t. x_1 + 2x_2 ≥ b_1
          x_1 − x_2 ≥ b_2
          x_1, x_2 ≥ 0

with b_1 ∈ [1, 5] (b̄_1 = 3, b̂_1 = 2) and b_2 ∈ [−2, 4] (b̄_2 = 1, b̂_2 = 3). The robust version of (P) according to the worst case criterion is:

(P^rhs_W): min 4x_1 + x_2
           s.t. x_1 + 2x_2 ≥ 5
                x_1 − x_2 ≥ 4
                x_1, x_2 ≥ 0

Its optimal solution is x^W_1 = 13/3, x^W_2 = 1/3, and v(P^rhs_W) = 53/3 ≈ 17.667. We note, in Figure 1, that this solution is feasible under any scenario (b_1, b_2). However, it has a bad value compared to the value 22/3 of the optimal solution, x^N_1 = 5/3 and x^N_2 = 2/3, of the nominal problem (in which b = b̄). The optimal solution of (P^rhs_W) is robust in the sense that it is feasible under all the possible scenarios. In return, this solution may have a really bad value compared to the best solution (feasible in at least one scenario). In order to measure this gap, we compute the optimal solution according to the best case criterion.
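Example 1 can be checked with an off-the-shelf LP solver. A sketch using scipy, with the Example 1 data (objective (4, 1), constraints x1 + 2x2 ≥ b1 and x1 − x2 ≥ b2, intervals b1 ∈ [1, 5] and b2 ∈ [−2, 4], as can also be read off Examples 5 and 6 later in the paper):

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([4.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]])
b_nom = np.array([3.0, 1.0])   # nominal right hand sides
b_hat = np.array([2.0, 3.0])   # maximal deviations

# (P^rhs_W): tighten every constraint to its largest right hand side.
worst = linprog(c, A_ub=-A, b_ub=-(b_nom + b_hat))
# Nominal problem, for comparison.
nominal = linprog(c, A_ub=-A, b_ub=-b_nom)
print(worst.x, worst.fun)      # (13/3, 1/3), 53/3
print(nominal.x, nominal.fun)  # (5/3, 2/3), 22/3
```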

2.2.2. The best case criterion
For any solution x, f_bes(x) = cx if x is feasible under at least one scenario, and f_bes(x) = +∞ otherwise. We have to determine the optimal solution over the union of all the feasible solution sets, which is {x ≥ 0 : Ax ≥ b̄ − b̂}. Consequently, the best case problem with uncertainty on right hand sides, denoted (P^rhs_B), can be written as follows:

(P^rhs_B): min cx
           s.t. Ax ≥ b̄ − b̂
                x ≥ 0

Example 2 Let us consider again the linear program of Example 1. When the best case criterion is applied, the problem to solve is:

(P^rhs_B): min 4x_1 + x_2
           s.t. x_1 + 2x_2 ≥ 1
                x_1 − x_2 ≥ −2
                x_1, x_2 ≥ 0

Its optimal solution is x^B_1 = 0 and x^B_2 = 1/2, of value v(P^rhs_B) = 1/2. This value is the best possible value over all the scenarios. Thus, any optimal solution value is included in the interval [1/2, 53/3].
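Example 2 is the same LP with every right hand side at its smallest value; a sketch (same Example 1 data as above):

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([4.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]])
b_lo = np.array([1.0, -2.0])  # b_nom - b_hat: least restrictive scenario

# (P^rhs_B): relax every constraint to its smallest right hand side.
best = linprog(c, A_ub=-A, b_ub=-b_lo)
print(best.x, best.fun)  # (0, 1/2), value 1/2
```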
When the right hand sides are uncertain, it seems natural to transfer the uncertainty to the objective function coefficients by applying duality theory. This is the subject of the next section.

Duality relationships
The dual, denoted (D), of the problem (P) is:

(D): max by
     s.t. A^t y ≤ c^t
          y ≥ 0

When (P) has uncertain right hand sides, the objective function coefficients of (D) are uncertain. Since (D) is a maximization problem and y ≥ 0, applying the worst case criterion leads to solving the following problem:

(D^obj_W): max (b̄ − b̂)y
           s.t. A^t y ≤ c^t
                y ≥ 0

In order to interpret this value on the primal problem, we write the dual of (D^obj_W):

min cx
s.t. Ax ≥ b̄ − b̂
     x ≥ 0

Now, we do not recognize (P^rhs_W) as expected, but (P^rhs_B). Therefore, applying the worst case criterion to a linear program in which the right hand sides are uncertain is not equivalent to applying the same criterion to its dual program, in which the objective function coefficients are uncertain. More precisely, the applied criterion is the dual criterion, namely the best case criterion. Thus, we infer the following relationship:

v(D^obj_W) = v(P^rhs_B)

In fact, we have to apply the best case criterion on (D) to obtain (P^rhs_W), as shown in the following:

(D^obj_B): max (b̄ + b̂)y
           s.t. A^t y ≤ c^t
                y ≥ 0

whose dual is exactly (P^rhs_W); hence v(D^obj_B) = v(P^rhs_W). In conclusion, if one wants to transfer the uncertainty from the right hand sides towards the objective function coefficients (or vice versa), one must apply the 'dual' criterion on the dual linear program. More recently, Bertsimas and Sim have proposed a model of robustness that remains polynomial while providing less conservative solutions; these solutions are not evaluated under the worst case scenario, which, the authors argue, has very little chance to occur. This new model can be seen as a generalization of the worst case criterion and, in the next section, we generalize the duality relationships established previously.
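These relationships can be checked numerically on the Example 1 data; a sketch (linprog minimizes, so the dual maximization is solved by negating its objective):

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0], [1.0, -1.0]])
c = np.array([4.0, 1.0])
b_lo = np.array([1.0, -2.0])   # b_nom - b_hat
b_hi = np.array([5.0, 4.0])    # b_nom + b_hat

# Worst case on the dual objective (a maximization): b = b_lo.
d_worst = linprog(-b_lo, A_ub=A.T, b_ub=c)       # max b_lo @ y, A^t y <= c, y >= 0
# Best case on the primal right hand sides: Ax >= b_lo.
p_best = linprog(c, A_ub=-A, b_ub=-b_lo)
assert abs(-d_worst.fun - p_best.fun) < 1e-6     # v(D^obj_W) = v(P^rhs_B) = 1/2

# Best case on the dual objective: b = b_hi; worst case primal: Ax >= b_hi.
d_best = linprog(-b_hi, A_ub=A.T, b_ub=c)
p_worst = linprog(c, A_ub=-A, b_ub=-b_hi)
assert abs(-d_best.fun - p_worst.fun) < 1e-6     # v(D^obj_B) = v(P^rhs_W) = 53/3
```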

Bertsimas and Sim approach for objective function coefficients
In (P), we suppose that for all j = 1, …, n, c̄_j − ĉ_j ≤ c_j ≤ c̄_j + ĉ_j, with ĉ_j ≥ 0. Bertsimas and Sim (2004) consider that, for all j = 1, …, n, c̄_j is a nominal value from which c_j can deviate because of uncertainty. With the worst case criterion, we suppose that all the coefficients simultaneously reach their worst value. Bertsimas and Sim suppose that only a subset of parameters can simultaneously reach their worst value. Thus, their approach can be seen as a generalized worst case criterion. They introduce a parameter Γ_0 (with 0 ≤ Γ_0 ≤ n), which defines the maximum number of coefficients that may deviate from their nominal value. In the robust version according to Bertsimas and Sim's approach, the value of a solution is its worst value when at most Γ_0 parameters deviate from their nominal value. For the problem (P), this amounts to computing the largest value of a solution knowing that at most Γ_0 coefficients of the objective function deviate from their nominal value c̄. The deviation on c_j is represented by an additional variable z_j as follows: c_j = c̄_j + z_j ĉ_j with −1 ≤ z_j ≤ 1. According to Bertsimas and Sim, the robust version of (P), denoted (P^obj_WG) for generalized worst case criterion, is:

(P^obj_WG): min_{x ∈ X} [ c̄x + max { zĈx : Σ_j |z_j| ≤ Γ_0, −1 ≤ z_j ≤ 1 } ]

where X = {x ≥ 0 : Ax ≥ b}, c̄ = (c̄_j)_{j=1,…,n} is a row vector and Ĉ is a diagonal matrix such that Ĉ_jj = ĉ_j for all j = 1, …, n. For a given x ≥ 0, a deviation which increases the objective function value is such that z_j ≥ 0. Consequently, the problem can be rewritten as follows:

(P^obj_WG): min_{x ∈ X} [ c̄x + max { zĈx : Σ_j z_j ≤ Γ_0, 0 ≤ z_j ≤ 1 } ]

For a given x, the set of feasible solutions for the z variables is non-empty and bounded. Thus, according to the duality theory, the inner maximization can be replaced by its dual (with variables θ and q_j), which gives the following equivalent linear program:

(P^obj_WG): min c̄x + Γ_0 θ + Σ_{j=1}^n q_j
            s.t. Ax ≥ b
                 θ + q_j ≥ ĉ_j x_j,  j = 1, …, n
                 x ≥ 0, q ≥ 0, θ ≥ 0

The optimal solution of (P^obj_WG) is robust according to the Bertsimas and Sim approach because, even if at most Γ_0 coefficients of the objective function deviate from their nominal value (and reach their worst value), the value of this solution will necessarily be lower than or equal to v(P^obj_WG).
That is why v(P^obj_WG) represents an absolute guarantee for a given budget Γ_0. Moreover, we get the following result: v(P) ≤ v(P^obj_WG) ≤ v(P^obj_W), with v(P^obj_WG) = v(P) when Γ_0 = 0 and v(P^obj_WG) = v(P^obj_W) when Γ_0 = n. In Bertsimas and Sim's approach, the aim is to protect a solution against a possible deviation of some parameter values. It generalizes the worst case criterion. Now, let us generalize the best case criterion in the same way. We define the problem, denoted (P^obj_BG), such that:

(P^obj_BG): min_{x ∈ X} [ c̄x − max { zĈx : Σ_j z_j ≤ Γ_0, 0 ≤ z_j ≤ 1 } ]

This problem is equivalent to the following quadratic program:

(P^obj_BG): min c̄x − Σ_{j=1}^n z_j ĉ_j x_j
            s.t. Ax ≥ b
                 Σ_{j=1}^n z_j ≤ Γ_0
                 0 ≤ z_j ≤ 1,  j = 1, …, n
                 x ≥ 0

Let us remark that if Γ_0 = n, the optimal value of each z_j equals 1 (as x ≥ 0) and we recognize the best case problem (P^obj_B), a linear program that is easy to solve. On the other hand, if Γ_0 ∈ ]0, n[, the problem becomes difficult to solve since the quadratic function to minimize is concave.
In general, we can distinguish three cases:
• Γ_0 = 0: all the coefficients are equal to their nominal value, without any possible deviation, so we get v(P^obj_BG) = v(P^obj_WG) = v(P);
• Γ_0 = n: all the coefficients may reach their extreme values, and we recover the worst case problem (P^obj_W) and the best case problem (P^obj_B);
• 0 < Γ_0 < n: at most Γ_0 coefficients may deviate from their nominal value, and the generalized criteria describe intermediate situations between the nominal problem and the extreme ones.
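The linearized program (P^obj_WG) above is an ordinary LP in the variables (x, θ, q). A sketch on illustrative data (assumed, not from the paper) with Γ_0 = 1, i.e. at most one cost coefficient deviates:

```python
import numpy as np
from scipy.optimize import linprog

# Variables: (x1, x2, theta, q1, q2), all >= 0.
c_bar = np.array([4.0, 1.0]); c_hat = np.array([1.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]]); b = np.array([3.0, 1.0])
gamma0 = 1.0  # budget of uncertainty

obj = np.concatenate([c_bar, [gamma0], np.ones(2)])  # c_bar x + gamma0*theta + sum q_j
A_ub = np.block([
    [-A,             np.zeros((2, 1)), np.zeros((2, 2))],  # Ax >= b
    [np.diag(c_hat), -np.ones((2, 1)), -np.eye(2)],        # c_hat_j x_j <= theta + q_j
])
b_ub = np.concatenate([-b, np.zeros(2)])
res = linprog(obj, A_ub=A_ub, b_ub=b_ub)
print(res.fun)  # 9: nominal cost 22/3 plus worst single deviation 5/3
```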

The Bertsimas and Sim's approach for right hand sides
We now consider that the right hand sides in (P) are uncertain: b̄ − b̂ ≤ b ≤ b̄ + b̂. For all the possible values of b, we suppose that the linear program (P) has a bounded optimal solution value.
To model the uncertainty, we introduce a variable z_i for each constraint i such that b_i = b̄_i + z_i b̂_i and −1 ≤ z_i ≤ 1. The column vector z = (z_i)_{i=1,…,m} represents all the possible deviations. For a direct application of Bertsimas and Sim's approach, we have to introduce a parameter Γ_i ∈ [0, 1] for each constraint i, and to determine the worst solution under these budgets. Given the sense of the constraints, the worst case is achieved when each right hand side increases, which corresponds to z_i = Γ_i. Therefore, the robust version of (P) corresponds to the following linear program (P_ber), where a_i denotes the ith row of A:

(P_ber): min cx
         s.t. a_i x ≥ b̄_i + Γ_i b̂_i,  i = 1, …, m
              x ≥ 0

One recognizes the worst case criterion when Γ_i = 1 for all the constraints i. The problem (P_ber) is still polynomial, like (P). However, in terms of robustness analysis, (P_ber) is not really interesting because it leads to choosing a solution on the basis of a single scenario, the one induced by the vector Γ.
As Bertsimas and Sim's approach is more relevant when the uncertainty concerns the objective function coefficients, we transfer the uncertainty from the right hand sides to the objective function coefficients using the dual program. We recall that the dual (D) is:

(D): max by
     s.t. A^t y ≤ c^t
          y ≥ 0

As proposed in the previous section, the criterion to apply on the dual program is the generalized best case criterion, in which Γ_0 is the budget of uncertainty on the objective function coefficients of (D). For a maximization problem, the dual version is:

(D^obj_BG): max_{y ≥ 0, A^t y ≤ c^t} [ b̄y + max { zB̂y : Σ_i z_i ≤ Γ_0, 0 ≤ z_i ≤ 1 } ]

with B̂ a diagonal matrix such that B̂_ii = b̂_i for all i = 1, …, m. This problem is equivalent to:

(D^obj_BG): max (b̄ + zB̂)y
            s.t. A^t y ≤ c^t
                 Σ_{i=1}^m z_i ≤ Γ_0
                 0 ≤ z_i ≤ 1,  i = 1, …, m
                 y ≥ 0

One can rewrite (D^obj_BG) as follows:

v(D^obj_BG) = max { v(D_z) : Σ_i z_i ≤ Γ_0, 0 ≤ z_i ≤ 1 }

where, for a given z, (D_z) denotes the following linear program:

(D_z): max (b̄ + zB̂)y
       s.t. A^t y ≤ c^t
            y ≥ 0

The dual of (D_z), denoted (P_z), corresponds to (P) for the scenario induced by z:

(P_z): min cx
       s.t. Ax ≥ b̄ + B̂z
            x ≥ 0

By hypothesis, (P_z) has a bounded optimal solution value and, according to the duality theory, v(D_z) can be replaced by v(P_z):

v(D^obj_BG) = max { v(P_z) : Σ_i z_i ≤ Γ_0, 0 ≤ z_i ≤ 1 }

Thus, applying the generalized best case criterion on the dual program (D) does not amount to applying Bertsimas and Sim's approach on (P): we do not recognize (P_ber), and we manage another budget of uncertainty. This budget Γ_0 covers all the right hand sides together, and the problem determines the worst optimal value within this single budget. (D^obj_BG) is much more relevant for robustness analysis: it can really be interpreted as a generalized worst case criterion. Indeed, if Γ_0 = m, the optimal value of z_i is equal to 1 for all i = 1, …, m, and the problem is exactly (P^rhs_W). However, this robust version is not linear, and the problem to solve, denoted (P^rhs_WG), is NP-hard. In (P^rhs_WG), at most Γ_0 right hand sides may deviate from their nominal value (ie become more restrictive).
So, we get the following relationship:

v(D^obj_BG) = v(P^rhs_WG)

Example 3 Let us consider Example 1 and choose three values for Γ_0:
• Γ_0 = 2: at most two constraints may be restricted, and we recognize (P^rhs_W), with value 53/3;
• Γ_0 = 0: no deviation is allowed, which amounts to solving the nominal program, with value 22/3;
• Γ_0 = 1: at most one constraint (among two) may be restricted: either one of them, or a part of both. The generalized worst case criterion leads to computing the worst value of a solution within this budget of uncertainty.
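For Example 1 with Γ_0 = 1, v(P^rhs_WG) can be computed by enumerating the extreme points of the budget polytope {z : 0 ≤ z ≤ 1, z_1 + z_2 ≤ 1}: since v(P_z) is a maximum of linear functions of z (by duality), it is convex in z, so its maximum is attained at a vertex. A sketch (the vertex list is hardcoded for this m = 2, Γ_0 = 1 instance):

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([4.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]])
b_nom = np.array([3.0, 1.0]); b_hat = np.array([2.0, 3.0])

# Vertices of {0 <= z <= 1, z1 + z2 <= 1} (specific to m = 2, Gamma0 = 1).
vertices = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def v_P_z(z):
    """Optimal value of (P_z): min cx s.t. Ax >= b_nom + b_hat*z, x >= 0."""
    res = linprog(c, A_ub=-A, b_ub=-(b_nom + b_hat * z))
    return res.fun

values = [v_P_z(z) for z in vertices]
print(values)       # [22/3, 32/3, 16]
print(max(values))  # v(P^rhs_WG) = 16 for Gamma0 = 1
```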
Moreover, (P^obj_WG) (introduced in Section 3.1) with uncertainty on the objective function coefficients is a linear program for which it is possible to define the dual program. We get:

max by
s.t. A^t y ≤ c̄^t + Ĉw
     Σ_{j=1}^n w_j ≤ Γ_0
     0 ≤ w_j ≤ 1,  j = 1, …, n
     y ≥ 0

This program amounts to applying the generalized best case criterion on the dual with uncertainty on the right hand sides (the right hand sides of (D) being the coefficients c) for a budget Γ_0. In this context, the budget corresponds to the maximal number of constraints that can be relaxed. We obtain the following relationship:

v(P^obj_WG) = v(D^rhs_BG)

Example 4 Let us consider again Example 1 and choose two values for Γ_0:
• Γ_0 = 2: all the constraints can be relaxed, and we get (P^rhs_B), with optimal value 1/2;
• Γ_0 = 1: at most one constraint may be relaxed: either one of the constraints, or a part of both. We have to determine the best solution within this budget.
(P^rhs_BG) with Γ_0 = 1 can be written:

(P^rhs_BG): min 4x_1 + x_2
            s.t. x_1 + 2x_2 ≥ 3 − 2z_1
                 x_1 − x_2 ≥ 1 − 3z_2
                 z_1 + z_2 ≤ 1
                 0 ≤ z_1, z_2 ≤ 1, x_1, x_2 ≥ 0

Its optimal solution is x^BG_1 = 0, x^BG_2 = 5/4, z^BG_1 = 1/4 and z^BG_2 = 3/4, with value v(P^rhs_BG) = 5/4. This value represents the best value for (P) when at most one constraint may be relaxed. Let us remark that, in this case, the optimal values of the z_i variables are not necessarily binary.
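Unlike its generalized worst case counterpart, this program is linear in (x, z) jointly and can be solved directly; a sketch for Example 4 with Γ_0 = 1:

```python
import numpy as np
from scipy.optimize import linprog

# Variables: (x1, x2, z1, z2); x >= 0, 0 <= z <= 1.
c = np.array([4.0, 1.0, 0.0, 0.0])
A_ub = np.array([
    [-1.0, -2.0, -2.0,  0.0],   # x1 + 2x2 + 2 z1 >= 3
    [-1.0,  1.0,  0.0, -3.0],   # x1 -  x2 + 3 z2 >= 1
    [ 0.0,  0.0,  1.0,  1.0],   # z1 + z2 <= 1  (budget Gamma0 = 1)
])
b_ub = np.array([-3.0, -1.0, 1.0])
bounds = [(0, None), (0, None), (0, 1), (0, 1)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)  # x = (0, 5/4), z = (1/4, 3/4), value 5/4
```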
In conclusion, there exist strong duality relationships between the robust versions of (P) and (D), provided that we take care to consider the dual criterion: (generalized) worst case versus (generalized) best case.
To measure the robustness of a solution, the maximum regret criterion is also commonly applied. Many results have already been obtained when the uncertainty concerns the objective function coefficients. We recall these results in the next section. Then, we analyse the specific case of uncertainty on the right hand sides.

The robustness according to the maximum regret criterion

Uncertainty on objective function coefficients
In the problem (P), we assume that the objective function coefficients are uncertain: c̄ − ĉ ≤ c ≤ c̄ + ĉ, with ĉ ≥ 0.
For a solution x and a scenario c, the regret, denoted reg(x, c), is defined as the difference between the value of the solution x under the scenario c and the value of the optimal solution for the scenario c:

reg(x, c) = cx − v(P_c)

where (P_c) denotes (P) with the objective coefficients fixed to c. The evaluation of a solution x on the maximum regret criterion is obtained by considering the largest regret as follows:

f_reg(x) = max { reg(x, c) : c̄ − ĉ ≤ c ≤ c̄ + ĉ }

Averbakh and Lebedev (2005) show that the problem of computing f_reg(x) for a given solution x is NP-hard.
According to the maximum regret criterion, the robust version of (P), denoted (P^obj_R), is:

(P^obj_R): min { f_reg(x) : Ax ≥ b, x ≥ 0 }

Inuiguchi and Sakawa (1995) show that the maximum regret of any x is obtained at the extreme values of the intervals. A scenario (c_1, …, c_n) is extreme if, for all j = 1, …, n, we have c_j = c̄_j − ĉ_j or c_j = c̄_j + ĉ_j. We denote by S the set of the 2^n extreme scenarios. So, we have:

f_reg(x) = max { cx − v(P_c) : c ∈ S }

For a given scenario c, it is easy to compute the corresponding optimal solution, denoted y_c. It is then possible to linearize (P^obj_R) by introducing a new variable r and a constraint for each extreme scenario. We obtain:

(P^obj_R): min r
           s.t. r ≥ cx − c y_c,  ∀c ∈ S
                Ax ≥ b
                x ≥ 0, r ≥ 0

Example 5 Considering the problem of Example 1, we analyse the dual program, in which the objective function coefficients are interval numbers. The robust version of the dual program according to the maximum regret criterion is:

v(D^obj_R) = min_y max_b [ max { b_1 t_1 + b_2 t_2 : t_1 + t_2 ≤ 4, 2t_1 − t_2 ≤ 1, t_1, t_2 ≥ 0 } − (b_1 y_1 + b_2 y_2) ]

where y satisfies y_1 + y_2 ≤ 4, 2y_1 − y_2 ≤ 1, y_1, y_2 ≥ 0, and b ranges over 1 ≤ b_1 ≤ 5, −2 ≤ b_2 ≤ 4. Let us consider, among the extreme scenarios, the two that will turn out to determine the maximum regret:
• For b = (1, −2), the optimal solution is (1/2, 0), of value 1/2. Thus, any solution (y_1, y_2) induces a regret equal to 1/2 − y_1 + 2y_2 for this extreme scenario.
• For b = (1, 4), the optimal solution is (0, 4), of value 16. Thus, any solution (y_1, y_2) induces a regret equal to 16 − y_1 − 4y_2 for this extreme scenario.
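The linearization can be carried out in full for Example 5: there are 2^2 = 4 extreme scenarios, each contributing one regret constraint. A sketch, where each scenario's optimal value is itself computed by an inner LP:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

A_dual = np.array([[1.0, 1.0], [2.0, -1.0]])  # constraints of (D): y1+y2 <= 4, 2y1-y2 <= 1
c_dual = np.array([4.0, 1.0])

def v_D(b):
    """Optimal value of (D) for a fixed scenario b (a maximization)."""
    res = linprog(-np.asarray(b), A_ub=A_dual, b_ub=c_dual)
    return -res.fun

scenarios = list(itertools.product([1.0, 5.0], [-2.0, 4.0]))  # extreme (b1, b2)

# Linearized regret problem: min r s.t. r >= v_D(b) - b@y for each extreme b,
# plus the feasibility constraints of (D). Variables: (y1, y2, r).
obj = np.array([0.0, 0.0, 1.0])
rows = [[-b[0], -b[1], -1.0] for b in scenarios]   # b@y + r >= v_D(b)
rows += [[1.0, 1.0, 0.0], [2.0, -1.0, 0.0]]        # feasibility of y
rhs = [-v_D(b) for b in scenarios] + [4.0, 1.0]
res = linprog(obj, A_ub=np.array(rows), b_ub=np.array(rhs))
print(res.fun)  # maximum regret 17/4 = 4.25
```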

Uncertainty on right hand sides
In the problem (P), one assumes that the right hand sides are approximated by intervals, namely b̄ − b̂ ≤ b ≤ b̄ + b̂, with b̂ ≥ 0. To determine the regret of a solution x for a scenario b, we must consider two cases:
• if x is feasible for the scenario b (x ∈ X_b), then reg(x, b) = cx − v(P_b), where (P_b) denotes (P) with right hand sides fixed to b;
• otherwise, reg(x, b) = +∞.
For a solution x which is not feasible in all the scenarios, f_reg(x) = +∞. Consequently, a solution which minimizes f_reg(x) is necessarily a solution which is feasible in all the scenarios: it belongs to X_{b̄+b̂}. In this case, for all x ∈ X_{b̄+b̂}, we have:

f_reg(x) = max_b [ cx − v(P_b) ] = cx − min_b v(P_b) = cx − v(P^rhs_B)

The problem, denoted (P^rhs_R), is to minimize f_reg(x) as follows:

(P^rhs_R): min cx − v(P^rhs_B)
           s.t. Ax ≥ b̄ + b̂
                x ≥ 0

The optimal solution according to the maximum regret criterion therefore corresponds exactly to the optimal solution according to the worst case criterion, and v(P^rhs_R) = v(P^rhs_W) − v(P^rhs_B). As (P^rhs_W) and (P^rhs_B) are linear programs, these problems are easy to solve, whereas in the case of uncertainty on the objective function coefficients, (P^obj_R) is NP-hard. Indeed, (P^obj_R) is not equivalent to (P^obj_W), and these programs give different perspectives on the robustness analysis. The optimal solutions of (P^obj_R) may be interior solutions, and the 'transfer' of the uncertainty on the objective function coefficients towards the right hand sides by the duality relationships cannot be exploited.
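On the Example 1 data, this equivalence can be checked directly: the minimizer of the maximum regret is the worst case solution, and the optimal regret is the gap between the two robust values. A sketch:

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([4.0, 1.0])
A = np.array([[1.0, 2.0], [1.0, -1.0]])
b_lo = np.array([1.0, -2.0]); b_hi = np.array([5.0, 4.0])

worst = linprog(c, A_ub=-A, b_ub=-b_hi)  # (P^rhs_W)
best = linprog(c, A_ub=-A, b_ub=-b_lo)   # (P^rhs_B)

# For x feasible under all scenarios, f_reg(x) = cx - v(P^rhs_B), so
# minimizing the regret over X_{b_hi} is (P^rhs_W) shifted by a constant.
max_regret = worst.fun - best.fun
print(max_regret)  # 53/3 - 1/2 = 103/6
```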

Example 6
In the optimal solution of (D^obj_R), the variables y_1 and y_2, the dual variables associated with the constraints of (P), are strictly positive. This implies that the optimal solution of (P) satisfies x_1 + 2x_2 = b_1 and x_1 − x_2 = b_2. Moreover, the maximum regret, of value 17/4, comes from either b = (1, −2) or b = (1, 4). Considering the first extreme scenario, b = (1, −2), we obtain the solution (−1, 1), which is not feasible for (P). Considering now the second extreme scenario, b = (1, 4), we obtain the solution (3, −1). This solution is also not feasible for (P).

Conclusion
We consider linear programs with uncertain parameters. The value of an uncertain coefficient is approximated by an interval of possible values. Several results have already been obtained when the objective function coefficients are approximated by interval numbers. The linear programs with only uncertain right hand sides have not been specifically studied. We show that the robust framework proposed by Bertsimas and Sim is not suited to this particular case.
Considering the worst case criterion, we prove in this paper that it is possible to transfer the uncertainty from the right hand sides to the objective function coefficients by establishing new duality relationships. We also propose an extension of the model of Bertsimas and Sim in the particular case of uncertainty on the right hand sides, and we establish similar duality relationships. In addition, we establish that the robust version according to the maximum regret criterion is equivalent to the robust version according to the worst case criterion.
We show that:
• optimizing the robust version, according to the (generalized) worst case criterion, of a linear program with uncertain objective function coefficients is equivalent to optimizing the dual program (with uncertain right hand sides) according to the (generalized) best case criterion;
• optimizing the robust version, according to the (generalized) worst case criterion, of a linear program with uncertain right hand sides is equivalent to optimizing the dual program (with uncertain objective function coefficients) according to the (generalized) best case criterion.
The generalized best case versions are quadratic programs, which are difficult to solve. But these programs are really interesting for robustness analysis, and exact and approximate algorithms must be proposed for solving them. This is left for future research.

Some notes on notation
In the mathematical programs considered, c and x are vectors of dimension n, b is a vector of dimension m, and A is an m by n matrix. The individual components of the vectors are x_j, c_j and b_i, where the index j runs from 1 to n and the index i runs from 1 to m. The standard linear programming problem is defined as problem P in the paper. Such a problem, under suitable conditions, has an optimum solution. Every problem P can be transformed into a related problem D, called the dual of P (P itself then being referred to as the primal problem), by following simple transformation rules but involving a new set of variables y. If this transformation is carried out correctly, then if the problem P has an optimum solution, the problem D will have an optimum solution, and the two optimal values are equal. This is the famous Duality Theorem.
The problem P is a linear programming problem if the objective is to minimise the function cx where c is a vector of constants and x is a vector of variables. However if c is also a vector of variables (as happens in two problems in the paper), then we are multiplying an unknown variable by an unknown variable and this makes it a quadratic programming problem.
To classify the various problems considered, the paper uses the following notation:
P: primal linear program
D: dual linear program
v(·): optimal solution value of the problem in the parentheses
obj: problem in which the objective function coefficients vary within defined ranges
rhs: problem in which the right hand side coefficients vary within defined ranges
f_wor: worst case criterion
f_bes: best case criterion
f_reg: maximum regret criterion
W: robust version according to the worst case criterion
B: version according to the best case criterion
WG: robust version according to the generalized worst case criterion (Bertsimas and Sim's approach)
BG: version according to the generalized best case criterion (Bertsimas and Sim's approach)