Stanford EE364A Convex Optimization I | Stephen Boyd | 2023 | Lecture 5

Course Transition

  • This lecture marks the shift from theoretical foundations to practical content.
  • Students are encouraged to install the CVXPY Python package (pip install cvxpy) for the hands-on portion of the course.
  • Homework 2 is the last conceptually demanding assignment; subsequent homework focuses on more interesting, applied topics.

Log-Concave Functions

  • A function is log-concave if it is positive and its logarithm is concave.
  • Properties of log-concave functions:
    • By the arithmetic–geometric mean inequality, the weighted geometric mean is less than or equal to the weighted arithmetic mean; consequently every positive concave function is log-concave. Log-concavity of f means f(θx + (1−θ)y) ≥ f(x)^θ f(y)^(1−θ) for 0 ≤ θ ≤ 1.
    • Many common probability densities, such as the normal distribution, are log-concave.
    • The cumulative distribution function of a log-concave density is also log-concave (checked numerically for the normal CDF in the sketch after this list).
    • Product of log-concave functions is log-concave, but the sum is not always log-concave.
    • Log-concave functions need not be concave (the Gaussian density, for example, bends the wrong way in its tails), but every log-concave function is quasi-concave, since its superlevel sets are convex.
    • If f(x, y) is log-concave in (x, y) jointly, then g(x) = ∫ f(x, y) dy is log-concave in x; integration over some variables preserves log-concavity.
    • Log concavity is preserved under convolution.
    • If w is a random variable with a log-concave density and C is a convex set in Rⁿ, then the probability prob(x + w ∈ C) is a log-concave function of x.
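
To make the CDF property above concrete, here is a minimal numerical sketch (not from the lecture; it assumes NumPy and SciPy are available) that checks second differences of log Φ on a grid:

```python
# Numerical check that the standard normal CDF Phi is log-concave:
# for a log-concave function, (log Phi)'' <= 0, so discrete second
# differences of log Phi should be nonpositive up to rounding error.
import numpy as np
from scipy.stats import norm

x = np.linspace(-5, 5, 1001)           # evaluation grid
log_cdf = norm.logcdf(x)               # log Phi(x), computed stably
second_diff = np.diff(log_cdf, n=2)    # discrete analogue of (log Phi)''

print(second_diff.max())               # expect <= 0 (up to float noise)
```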

Yield Function and Optimization

  • The yield function Y(x) = prob(x + w ∈ S), the probability that a product built with target parameter values x meets the specifications S despite manufacturing variation w, is log-concave when w has a log-concave density and S is convex.
  • Targeting the center of acceptable values for product parameters maximizes the yield.
  • The yield region {x : Y(x) ≥ α}, defined by a minimum yield requirement α, is convex if the yield function is log-concave (a Monte Carlo sketch of this setup follows).
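
A hedged Monte Carlo sketch of this setup (the box-shaped spec region, the noise scale, and all names are illustrative assumptions, not from the lecture): it estimates Y(x) = prob(x + w ∈ S) by sampling and checks that log Y lies above its chord along a line segment, as concavity requires.

```python
# Monte Carlo sketch of a yield function Y(x) = prob(x + w in S), with
# w ~ N(0, 0.5^2 I) (log-concave density) and S the box [-1, 1]^2 (convex).
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(scale=0.5, size=(100_000, 2))   # manufacturing variation

def yield_est(x):
    """Estimated probability that x + w lands inside the box [-1, 1]^2."""
    return np.mean(np.all(np.abs(x + w) <= 1.0, axis=1))

# log Y should be concave, so along any segment it must lie above the chord.
x0, x1 = np.array([-0.8, 0.0]), np.array([0.8, 0.0])
theta = np.linspace(0, 1, 11)
log_y = np.array([np.log(yield_est((1 - t) * x0 + t * x1)) for t in theta])
chord = (1 - theta) * log_y[0] + theta * log_y[-1]
print(np.all(log_y >= chord - 1e-3))   # True, up to Monte Carlo noise
```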

Convex Optimization

  • Convex optimization problems are tractable: they can be solved reliably and efficiently.
  • Optimization problems can be represented as objects with attributes such as the objective function and the constraints, which is how CVXPY represents them (see the sketch after this list).
  • The optimal value is the infimum of the objective over all feasible points.
  • Infeasible problems (optimal value +∞ by convention) and problems unbounded below (optimal value −∞) are pathologies that do arise in practice.
  • A feasible point is in the domain and satisfies the constraints; an optimal point is a feasible point that attains the optimal value.
  • The set of optimal points is convex, and a point is locally optimal if it minimizes the objective among feasible points within some neighborhood.
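
A minimal CVXPY sketch of this object view (the problem data is illustrative): a Problem is built from an objective and constraints, and solving it populates the status and optimal value.

```python
# A problem as an object: objective + constraints in, status/value out.
import cvxpy as cp
import numpy as np

x = cp.Variable(2)
objective = cp.Minimize(cp.sum_squares(x - np.array([1.0, 2.0])))
constraints = [x >= 0, cp.sum(x) <= 1]

prob = cp.Problem(objective, constraints)
prob.solve()

print(prob.status)   # "optimal" here; can be "infeasible" or "unbounded"
print(prob.value)    # the optimal value p*
print(x.value)       # an optimal point
```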

Functions and Optimal Values

  • Examples of scalar functions and their optimal values were discussed: f(x) = 1/x on x > 0 has optimal value 0 but no optimal point; f(x) = −log x on x > 0 is unbounded below; f(x) = x log x on x > 0 has a single minimum, −1/e, attained at x = 1/e (verified numerically below).
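
A quick numerical check of the last example (a sketch; SciPy's bounded scalar minimizer is used only for illustration):

```python
# f(x) = x*log(x) on x > 0: f'(x) = log(x) + 1 = 0 gives x = 1/e, f = -1/e.
import numpy as np
from scipy.optimize import minimize_scalar

res = minimize_scalar(lambda x: x * np.log(x), bounds=(1e-9, 10.0),
                      method="bounded")
print(res.x, 1 / np.e)     # both ~ 0.3679
print(res.fun, -1 / np.e)  # both ~ -0.3679
```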

Constraints and Problem Types

  • The distinction between implicit constraints (restrictions built into the problem's domain) and explicit constraints was discussed; one should only propose points inside the domain for evaluation.
  • Example: in maximum-likelihood estimation of a covariance matrix, the log-likelihood is defined only for symmetric positive definite matrices; a nonsymmetric matrix, whose eigenvalues can be negative or complex, cannot even be plugged into the log-likelihood, so positive definiteness is an implicit constraint.
  • A problem can be formally unconstrained yet carry implicit constraints: minimizing f(x) = −∑ᵢ log(bᵢ − aᵢᵀx) has no explicit inequalities, but the constraints aᵢᵀx < bᵢ are built into the domain of the log.
  • A feasibility problem asks for any x satisfying the constraints (hard constraints).
  • Feasibility problems can be written as a special case of the general problem with an identically-zero objective; the optimal value is then 0 if the constraints are feasible and +∞ otherwise (see the CVXPY sketch after this list).
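
A minimal CVXPY sketch of a feasibility problem (the constraint data is illustrative): the constant-zero objective makes the optimal value 0 when the constraints can be met and +∞ (status "infeasible") when they cannot.

```python
# Feasibility as a special case: minimize 0 subject to the constraints.
import cvxpy as cp
import numpy as np

x = cp.Variable(3)
A = np.array([[1.0, 1.0, 1.0],
              [1.0, -1.0, 0.0]])
b = np.array([1.0, 0.0])

prob = cp.Problem(cp.Minimize(0), [A @ x == b, x >= 0])
prob.solve()
print(prob.status, prob.value)   # "optimal", 0.0 -- this instance is feasible
print(x.value)                   # one x satisfying the hard constraints
```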

Convex Optimization Properties

  • Convex optimization problems restrict the curvature of objective and constraint functions.
  • The objective f₀ is convex, the inequality constraints have the form fᵢ(x) ≤ 0 with fᵢ convex, and the equality constraints are affine: aᵢᵀx = bᵢ (see the sketch after this list).
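
CVXPY mechanically verifies a sufficient set of such curvature rules (disciplined convex programming); a minimal sketch, with two illustrative toy problems:

```python
# is_dcp() reports whether a problem fits the convex standard form:
# convex objective, convex <= 0 inequalities, affine equalities.
import cvxpy as cp

x = cp.Variable()
convex_prob = cp.Problem(cp.Minimize(cp.square(x)),
                         [cp.abs(x) <= 1, 2 * x == 1])
nonconvex_prob = cp.Problem(cp.Minimize(cp.square(x)),
                            [cp.square(x) == 1])   # nonaffine equality

print(convex_prob.is_dcp())      # True
print(nonconvex_prob.is_dcp())   # False
```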

Equivalence and Transformations

  • Reduction: Two problems are equivalent if solving one provides a simple method to solve the other.
  • Equivalence: Non-convex problems can be equivalent to convex problems.
  • Local and Global Optima: Any locally optimal point of a convex problem is globally optimal.
  • Optimality Condition: a feasible point x is optimal if and only if ∇f₀(x)ᵀ(y − x) ≥ 0 for all feasible y; a nonzero gradient defines a supporting hyperplane to the feasible set at x.
  • No Constraints: if there are no constraints, the optimality condition reduces to the gradient vanishing: ∇f₀(x) = 0.
  • Equality Constraints: with only equality constraints Ax = b, optimality means Ax = b and ∇f₀(x) + Aᵀν = 0 for some ν, i.e., the gradient is orthogonal to the feasible subspace rather than vanishing outright.
  • Equivalent Problems: One problem can be transformed into another while preserving the optimal solution.
  • Eliminating Equality Constraints: parametrize the solutions of Ax = b as x = Fz + x₀ and optimize over the new variable z, transforming the problem into an equivalent one without equality constraints.
  • Introducing Equality Constraints: add new variables tied to affine expressions by new equality constraints (e.g., introduce yᵢ = Aᵢx + bᵢ); this is often useful and preserves convexity.
  • Introducing Slack Variables: rewrite linear inequalities aᵢᵀx ≤ bᵢ as equalities aᵢᵀx + sᵢ = bᵢ with sᵢ ≥ 0; the slack sᵢ is the amount by which the constraint is satisfied with room to spare.
  • Epigraph Form: introduce a new scalar variable t and minimize t subject to f₀(x) ≤ t plus the original constraints; the objective becomes linear, so problems with nonlinear convex objectives can be solved by algorithms designed for linear objectives (see the first sketch after this list).
  • Selective Minimization: analytically minimize the objective over a subset of the variables first; this applies when the objective is separable or otherwise specially structured, and partial minimization preserves convexity.
  • Quasi-convex Optimization: problems with a quasi-convex objective, convex inequality constraints, and affine equality constraints.
    • Quasi-convex functions can have locally optimal points that are not globally optimal.
    • To solve, represent the t-sublevel sets of the objective by a parameterized family of convex inequalities φₜ(x) ≤ 0, then bisect on t, solving a convex feasibility problem at each step (see the bisection sketch after this list).
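
A minimal sketch of the epigraph transformation mentioned above (the problem data is illustrative): the two formulations attain the same optimal value.

```python
# Epigraph form: minimize f0(x)  <=>  minimize t subject to f0(x) <= t.
import cvxpy as cp

x = cp.Variable(2)
t = cp.Variable()

original = cp.Problem(cp.Minimize(cp.norm(x, 2)), [cp.sum(x) == 1])
epigraph = cp.Problem(cp.Minimize(t),             # linear objective
                      [cp.norm(x, 2) <= t, cp.sum(x) == 1])

original.solve()
epigraph.solve()
print(original.value, epigraph.value)   # equal (~0.7071)
```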
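
And a hedged sketch of bisection for a quasi-convex problem (all data is illustrative): the linear-fractional objective f(x) = (aᵀx + b)/(cᵀx + d) is quasi-convex where cᵀx + d > 0, and its t-sublevel set is the linear inequality aᵀx + b − t(cᵀx + d) ≤ 0, so each bisection step is a convex feasibility problem.

```python
# Bisection for quasi-convex minimization of (a@x + b)/(c@x + d) on [0, 1]^2.
import cvxpy as cp
import numpy as np

a, b = np.array([1.0, 2.0]), 1.0
c, d = np.array([0.5, 0.5]), 1.0

def sublevel_feasible(t):
    """Is there x in [0,1]^2 with f(x) <= t? (a convex feasibility problem)"""
    x = cp.Variable(2)
    cons = [x >= 0, x <= 1, a @ x + b - t * (c @ x + d) <= 0]
    prob = cp.Problem(cp.Minimize(0), cons)
    prob.solve()
    return prob.status == "optimal"

lo, hi = 0.0, 10.0                # lo is infeasible, hi is feasible here
while hi - lo > 1e-6:             # bisect on the optimal value t
    mid = (lo + hi) / 2
    if sublevel_feasible(mid):
        hi = mid                  # optimal value is <= mid
    else:
        lo = mid                  # optimal value is > mid
print(hi)                         # ~1.0, attained at x = (0, 0)
```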
