Strong duality

Primal Problem and Dual Problem

Consider the standard form linear programming problem

\[\begin{aligned} \text{minimize }~~~&\mathbf c'\mathbf x\\ \text{subject to}~~~&\mathbf A\mathbf x=\mathbf b\\&\mathbf x\ge 0 \end{aligned} \]

which we call the primal problem.

Let \(\mathbf x^{*}\) be an optimal solution, assumed to exist. We introduce a relaxed problem in which the constraint \(\mathbf A\mathbf x=\mathbf b\) is dropped and a penalty term \(\mathbf p'(\mathbf b-\mathbf A\mathbf x)\) is added to the cost instead, where \(\mathbf p\) is a price vector of the same dimension as \(\mathbf b\). We are then faced with the problem

\[\begin{aligned} \text{minimize }~~~&\mathbf c'\mathbf x+\mathbf p'(\mathbf b-\mathbf A\mathbf x)\\ \text{subject to}~~~&\mathbf x\ge 0 \end{aligned} \]

Let \(g(\mathbf p)\) be the optimal cost for the relaxed problem. Then we have

\[g(\mathbf p)=\min_{\mathbf x\ge 0}[\mathbf c'\mathbf x+\mathbf p'(\mathbf b-\mathbf A\mathbf x)]\le\mathbf c'\mathbf x^*+\mathbf p'(\mathbf b-\mathbf A\mathbf x^*)=\mathbf c'\mathbf x^* \]

The inequality holds because \(\mathbf x^*\ge 0\) is feasible for the relaxed problem, and the last equality holds because \(\mathbf A\mathbf x^*=\mathbf b\). Thus each \(\mathbf p\) leads to a lower bound \(g(\mathbf p)\) for the optimal cost \(\mathbf c'\mathbf x^*\). The problem

\[\begin{aligned} \text{maximize }~~~&g(\mathbf p)\\ \text{subject to}~~~&\text{no constraints} \end{aligned} \]

can then be interpreted as a search for the tightest possible lower bound of this type; it is known as the dual problem.

Using the definition of \(g(\mathbf p)\), we have

\[g(\mathbf p)=\mathbf p'\mathbf b+\min_{\mathbf x\ge 0}(\mathbf c'-\mathbf p'\mathbf A)\mathbf x=\mathbf p'\mathbf b+\begin{cases}0,&\text{if }\mathbf c'-\mathbf p'\mathbf A\ge \mathbf 0'\\-\infty,&\text{otherwise}\end{cases} \]

Therefore the dual problem is the same as the linear programming problem

\[\begin{aligned} \text{maximize }~~~&\mathbf p'\mathbf b\\ \text{subject to}~~~&\mathbf p'\mathbf A\le \mathbf c' \end{aligned} \]
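
To make the construction concrete, here is a minimal numerical sketch: the matrix, right-hand side, and cost vector below are made up purely for illustration. It solves a small standard-form primal and the dual just derived with `scipy.optimize.linprog` and compares the two optimal costs.

```python
# A minimal sketch with assumed data: a standard-form primal
#   min c'x  s.t.  Ax = b, x >= 0
# and the dual derived above
#   max p'b  s.t.  p'A <= c'.
import numpy as np
from scipy.optimize import linprog

# Hypothetical problem data, chosen only for illustration.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 1.0]])
b = np.array([2.0, 1.0])
c = np.array([1.0, 2.0, 3.0, 0.0])

# Primal (linprog minimizes by default).
primal = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * A.shape[1])

# Dual: maximize p'b with p free, i.e. minimize -b'p subject to A'p <= c.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)] * A.shape[0])

print("primal optimal cost c'x*:", primal.fun)   # 3 on this made-up instance
print("dual optimal cost   p'b :", -dual.fun)    # 3 as well
```

On this instance both optimal costs come out equal to 3. Weak duality below explains why the dual value can never exceed the primal one, and the strong duality theorem explains why the two coincide whenever an optimum exists.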

Weak duality

The derivation above immediately yields the following property:

If \(\mathbf x\) is a feasible solution to the primal problem and \(\mathbf p\) is a feasible solution to the dual problem, then

\[\mathbf p'\mathbf b\le \mathbf c'\mathbf x \]

There are two corollaries that are useful later.

(a) If the optimal cost in the primal is \(-\infty\), then the dual problem must be infeasible; if the optimal cost in the dual is \(+\infty\), then the primal problem must be infeasible.

(b) Let \(\mathbf x\) and \(\mathbf p\) be feasible solutions to the primal and the dual, respectively, and suppose \(\mathbf p'\mathbf b=\mathbf c'\mathbf x\). Then \(\mathbf x\) and \(\mathbf p\) are optimal solutions to the primal and the dual, respectively.
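
As an illustration on the made-up instance from the sketch above: \(\mathbf x=(1,0,1,0)'\) is primal feasible with cost \(\mathbf c'\mathbf x=4\), and \(\mathbf p=(1,0)'\) is dual feasible with \(\mathbf p'\mathbf b=2\), so \(\mathbf p'\mathbf b\le \mathbf c'\mathbf x\) as weak duality requires. The pair \(\mathbf x^*=(1,1,0,0)'\), \(\mathbf p^*=(2,-1)'\) is also feasible and attains \((\mathbf p^*)'\mathbf b=\mathbf c'\mathbf x^*=3\), so corollary (b) certifies that both are optimal.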

Strong duality

Weak duality only provides bounds; it does not say whether these bounds are ever tight. The next theorem is the central result on linear programming duality:

If a linear programming problem has an optimal solution, so does its dual, and the respective optimal costs are equal.

The rest of this article presents two proofs of this theorem.

Proof 1

Let us assume temporarily that the rows of \(\mathbf A\) are linearly independent and that there exists an optimal solution. Let us apply the simplex method to this problem.

As long as cycling is avoided, e.g., by using the lexicographic pivoting rule, the simplex method terminates with an optimal solution \(\mathbf x\) and an optimal basis \(\mathbf B\). Let \(\mathbf x_B=\mathbf B^{-1}\mathbf b\) be the corresponding vector of basic variables.

When the simplex method terminates, the reduced costs must be nonnegative and we obtain

\[\mathbf c'-\mathbf c'_B\mathbf B^{-1}\mathbf A\ge \mathbf 0' \]

where \(\mathbf c'_B\) is the vector with the costs of the basic variables. Let us define a vector \(\mathbf p\) by letting \(\mathbf p'=\mathbf c_B'\mathbf B^{-1}\). The termination condition above then gives \(\mathbf p'\mathbf A\le \mathbf c'\), which shows that \(\mathbf p\) is a feasible solution to the dual problem. In addition,

\[\mathbf p'\mathbf b=\mathbf c'_B\mathbf B^{-1}\mathbf b=\mathbf c'_B\mathbf x_B=\mathbf c'\mathbf x \]

It follows from corollary (b) that \(\mathbf p\) is an optimal solution to the dual, and that the optimal dual cost is equal to the optimal primal cost.
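
The construction in this proof is easy to check numerically. The sketch below continues with the same made-up instance as before and assumes the optimal basis consists of the first two columns; it forms \(\mathbf p'=\mathbf c_B'\mathbf B^{-1}\) and verifies dual feasibility and equality of costs.

```python
# A sketch of Proof 1's construction on the same assumed instance as above:
# take the (assumed) optimal basis, set p' = c_B' B^{-1}, and check that
# p is dual feasible and that p'b = c'x.
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 1.0]])
b = np.array([2.0, 1.0])
c = np.array([1.0, 2.0, 3.0, 0.0])

basic = [0, 1]                       # assumed optimal basis: columns 1 and 2
B = A[:, basic]
c_B = c[basic]

x = np.zeros(A.shape[1])
x[basic] = np.linalg.solve(B, b)     # x_B = B^{-1} b, nonbasic variables are 0
p = np.linalg.solve(B.T, c_B)        # solves B'p = c_B, i.e. p' = c_B' B^{-1}

print("x  =", x, "  c'x =", c @ x)                        # basic feasible solution
print("p  =", p, "  p'b =", p @ b)                        # equals c'x
print("reduced costs c' - p'A >= 0:", np.all(c - p @ A >= -1e-9))
```

On this instance \(\mathbf p=(2,-1)'\) and both costs equal 3, matching the dual optimum found earlier.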

If we are dealing with a general linear programming problem \(\Pi_1\) that has an optimal solution, we first transform it into an equivalent standard form problem \(\Pi_2\), with the same optimal cost, and in which the rows of the matrix \(\mathbf A\) are linearly independent. Let \(D_1\) and \(D_2\) be the duals of \(\Pi_1\) and \(\Pi_2\), respectively. One can check that \(D_1\) and \(D_2\) also have the same optimal cost. We have already proved that \(\Pi_2\) and \(D_2\) have the same optimal cost. It follows that \(\Pi_1\) and \(D_1\) have the same optimal cost.

Proof 2

We need a lemma called Farkas’ lemma:

Let \(\mathbf A\) be a matrix of dimensions \(m\times n\) and let \(\mathbf b\) be a vector in \(\mathbb R^m\). Then, exactly one of the following two alternatives holds:

(a) There exists some \(\mathbf x\ge 0\) such that \(\mathbf A\mathbf x=\mathbf b\).

(b) There exists some vector \(\mathbf p\) such that \(\mathbf p'\mathbf A\ge \mathbf 0'\) and \(\mathbf p'\mathbf b < 0\).
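
For a quick illustration with made-up data: take \(\mathbf A=\begin{pmatrix}1&0\\0&1\end{pmatrix}\) and \(\mathbf b=(-1,0)'\). Alternative (a) fails, since the first component of \(\mathbf A\mathbf x\) is nonnegative whenever \(\mathbf x\ge 0\), while \(\mathbf p=(1,0)'\) witnesses alternative (b): \(\mathbf p'\mathbf A=(1,0)\ge \mathbf 0'\) and \(\mathbf p'\mathbf b=-1<0\).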

Proof. One direction is easy. If there exists some \(\mathbf x\ge 0\) satisfying \(\mathbf A\mathbf x=\mathbf b\), and if \(\mathbf p'\mathbf A\ge \mathbf 0'\), then \(\mathbf p'\mathbf b=\mathbf p'\mathbf A\mathbf x\ge 0\), which shows that the second alternative cannot hold.

Let us now assume that there exists no vector \(\mathbf x\ge 0\) satisfying \(\mathbf A\mathbf x=\mathbf b\). Consider the pair of problems

\[\begin{aligned} \text{maximize }~~~&\mathbf 0'\mathbf x&&&\text{minimize }~~~&\mathbf p'\mathbf b\\ \text{subject to}~~~&\mathbf A\mathbf x=\mathbf b&&&\text{subject to}~~~&\mathbf p'\mathbf A\ge \mathbf 0'\\&\mathbf x\ge 0 \end{aligned} \]

and note that the first is the dual of the second. The maximization problem is infeasible, which implies that the minimization problem is either unbounded or infeasible: if the minimization problem had a finite optimal cost, then by the duality theorem (established in Proof 1) the maximization problem would have an optimal solution, contradicting its infeasibility. Since \(\mathbf p=\mathbf 0\) is a feasible solution to the minimization problem, it follows that the minimization problem is unbounded. Therefore, there exists some \(\mathbf p\) which is feasible, that is, \(\mathbf p'\mathbf A\ge \mathbf 0'\), and whose cost is negative, that is, \(\mathbf p'\mathbf b<0\).

If we rephrase Farkas’ Lemma, we get a corollary:

Let \(\mathbf A_1,\cdots,\mathbf A_n\) and \(\mathbf b\) be given vectors and suppose that every vector \(\mathbf p\) satisfying \(\mathbf p'\mathbf A_i\ge 0\) for all \(i\) also satisfies \(\mathbf p'\mathbf b\ge 0\). Then, \(\mathbf b\) can be expressed as a nonnegative linear combination of the vectors \(\mathbf A_1,\cdots,\mathbf A_n\).

Now return to the strong duality theorem. By corollary (b), we only need to show that there exists a feasible dual solution \(\mathbf p\) that achieves the same cost as the optimal primal solution.

For this proof it is convenient to take the primal in the inequality form: minimize \(\mathbf c'\mathbf x\) subject to \(\mathbf A\mathbf x\ge \mathbf b\), where \(\mathbf a_i'\) denotes the \(i\)-th row of \(\mathbf A\); its dual is to maximize \(\mathbf p'\mathbf b\) subject to \(\mathbf p'\mathbf A=\mathbf c'\) and \(\mathbf p\ge 0\). Let \(\mathbf x^{*}\) be an optimal solution to the primal problem, assumed to exist, and let \(I=\{i\mid \mathbf a_i'\mathbf x^*=b_i\}\) be the set of active constraints. We claim that if \(\mathbf d\) satisfies \(\mathbf a_i'\mathbf d\ge 0\) for all \(i\in I\), then \(\mathbf c'\mathbf d\ge 0\). Otherwise, moving a small amount from \(\mathbf x^{*}\) along \(\mathbf d\) keeps every constraint satisfied (the active ones because \(\mathbf a_i'\mathbf d\ge 0\), the inactive ones because they hold with slack) while decreasing the cost, so \(\mathbf d\) would be a feasible descent direction and \(\mathbf x^{*}\) would not be optimal. By the corollary of Farkas' lemma, there exist scalars \(p_i\ge 0\), \(i\in I\), such that \(\mathbf c=\sum_{i\in I} p_i\mathbf a_i\). Setting \(p_i=0\) for the inactive constraints \(i\notin I\), we obtain

\[\begin{aligned} \mathbf p'\mathbf A&=\sum_{i\in I}p_i\mathbf a_i'=\mathbf c'\\ \mathbf p'\mathbf b&=\sum_{i\in I}p_ib_i=\sum_{i\in I}p_i\mathbf a_i'\mathbf x^*=\mathbf c'\mathbf x^* \end{aligned} \]

Hence \(\mathbf p\) is a feasible dual solution whose cost equals the optimal primal cost, and by corollary (b) the proof is complete.
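
As with Proof 1, this construction can be checked numerically on a small made-up instance in the inequality form used here: solve the primal, solve the dual \(\max \mathbf p'\mathbf b\) subject to \(\mathbf p'\mathbf A=\mathbf c'\), \(\mathbf p\ge 0\), and confirm that the optimal \(\mathbf p\) is supported on the active constraints and attains the primal cost. The data below are hypothetical.

```python
# A rough numerical check of the construction in Proof 2 (hypothetical data).
# Primal in inequality form: min c'x  s.t.  Ax >= b.  Its dual is
# max p'b  s.t.  p'A = c', p >= 0.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 0.0, 0.0])
c = np.array([1.0, 2.0])

# Primal: linprog uses A_ub x <= b_ub, so flip signs to encode Ax >= b.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2)
x_star = primal.x

# Dual: max p'b s.t. A'p = c, p >= 0  (minimize -b'p).
dual = linprog(-b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 3)
p = dual.x

active = np.isclose(A @ x_star, b)            # active set I
print("x* =", x_star, "  active constraints:", np.where(active)[0])
print("p  =", p, "  support of p:", np.where(p > 1e-9)[0])
print("c'x* =", c @ x_star, "  p'b =", p @ b)  # equal, as the theorem asserts
```

On this instance the active constraints are the first and the third, the optimal dual puts weight only on those, and both optimal costs equal 2.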
