Linear and Convex Optimization. Michael H. Veatch

it is very easy to explain to a mathematics student, both for computing solutions and for justifying that the algorithm works. When working examples, technology can be used to invert and multiply matrices. After completing the picture of why the method works (degeneracy, two‐phase simplex), Section 9.4 takes up the issue of making the simplex method more efficient, including the tableau form and revised simplex method. Instructors who wish to start with the tableau can use the material found here. Chapter 10 on sensitivity analysis, which depends on Chapter 8, can be skipped without loss of continuity; however, the interpretation of dual variables as shadow prices and partial derivatives is enriching, even in an age when sensitivity analysis can be done quickly by solving modified linear programs. An interpretation of strong duality in terms of average costs is also given in Section 10.3. Chapter 11 presents three more algorithms for linear programming, all of which rely on duality: the dual simplex, transportation simplex, and a primal‐dual, or path-following, interior point method. The transportation simplex method is presented first for the minimum cost flow problem, then specialized to transportation problems.

      Supplemental material for the book will be available at the web site www.gordon.edu/michaelveatch/optimization. A full solution manual will be made available to instructors.

      The book contains the amount of material covered in a typical two‐semester sequence of undergraduate classes. A semester course focusing on linear programming could cover Chapters 1, 2, Sections 3.1–3.2, Chapters 5, 6, Sections 7.1–7.4 and 8.1–8.3, Chapters 9 and 10, plus some other topics from these chapters and Chapter 11. A course on linear and integer programming could cover Chapters 1, 2, Sections 3.1–3.2, Chapters 4, 5, 6, Sections 7.1–7.4 and 8.1–8.3, and Chapters 9, 12, and 13. A somewhat more advanced course on linear and convex programming could cover Chapters 1–3, 5, 6, Sections 7.1–7.4, Chapters 8, 9, Sections 11.1–11.3, and Chapters 14 and 15.

      Several more advanced or specialized topics have been included at the end of chapters or sections; these are optional and can be easily skipped. Section 3.3 shows that a dynamic program can be solved as a linear program, an approach that relates to machine learning. Section 5.3 on computational complexity, while not difficult, is only occasionally mentioned in the later chapters. Section 7.5 extends the optimality conditions needed to solve linear programs to general polyhedra. Section 8.4 introduces Lagrangian duality for linear programs and shows that it is equivalent to the usual dual; it is only needed if convex programming (Chapter 14) is being covered. Farkas' lemma is presented in Section 8.5, providing another approach to duality theorems. The computational strategies in Section 9.4 are important to the simplex method but are not used in the sequel. The interior point algorithm in Section 11.4 is computationally more involved. It is closely related to Section 15.5.

      Most of all, I am grateful for my wife Cindy's confidence in me and acceptance of my working odd hours on the project. Now we are both authors.

      Michael H. Veatch

      Wenham, MA

      March 2020

      About the Companion Website

       www.wiley.com/go/veatch/convexandlinearoptimization

      The website includes the instructor solutions manual.

      1.1 Who Uses Optimization?
