Sum-of-squares optimization

A sum-of-squares optimization program is an optimization problem with a linear cost function and a particular type of constraint on the decision variables: when the decision variables are used as coefficients in certain polynomials, those polynomials must have the sum-of-squares (SOS) property. When the maximum degree of the polynomials involved is fixed, sum-of-squares optimization is also known as the Lasserre hierarchy of relaxations in semidefinite programming.

Sum-of-squares optimization techniques have been applied across a variety of areas, including control theory (in particular, for searching for polynomial Lyapunov functions for dynamical systems described by polynomial vector fields), statistics, finance and machine learning.[1][2][3][4]

Optimization problem

Given a vector $c \in \mathbb{R}^n$ and polynomials $a_{k,j}$ for $k = 1, \dots, N_s$, $j = 0, 1, \dots, n$, a sum-of-squares optimization problem is written as

$$\begin{aligned} \underset{u \in \mathbb{R}^n}{\text{maximize}} \quad & c^\top u \\ \text{subject to} \quad & a_{k,0}(x) + u_1 a_{k,1}(x) + \cdots + u_n a_{k,n}(x) \in \text{SOS} \qquad (k = 1, \dots, N_s). \end{aligned}$$

Here "SOS" represents the class of sum-of-squares (SOS) polynomials. The quantities $u \in \mathbb{R}^n$ are the decision variables. SOS programs can be converted to semidefinite programs (SDPs) using the duality of the SOS polynomial program and a relaxation of constrained polynomial optimization using positive-semidefinite matrices; see the following section.

Dual problem: constrained polynomial optimization

Suppose we have an $n$-variate polynomial $p(x)$, and suppose that we would like to minimize this polynomial over a subset $A \subseteq \mathbb{R}^n$. Suppose furthermore that the constraints on the subset $A$ can be encoded using $m$ polynomial equalities of degree at most $2d$, each of the form $q_i(x) = 0$, where $q_i$ is a polynomial of degree at most $2d$. A natural, though generally non-convex, program for this optimization problem is the following:

$$\min_{x \in \mathbb{R}^n} \; \langle C, x^{\le d} (x^{\le d})^\top \rangle$$

subject to:

$$\langle A_i, x^{\le d} (x^{\le d})^\top \rangle = 0, \qquad \forall\, i \in [m], \tag{1}$$

$$x_\emptyset = 1,$$

where $x^{\le d}$ is the $n^{O(d)}$-dimensional vector with one entry for every monomial in $x$ of degree at most $d$, so that for each multiset $S \subseteq [n]$ with $|S| \le d$, $x_S = \prod_{i \in S} x_i$; $C$ is a matrix of coefficients of the polynomial $p(x)$ that we want to minimize, and $A_i$ is a matrix of coefficients of the polynomial $q_i(x)$ encoding the $i$-th constraint on the subset $A$. The additional, fixed constant index in our search space, $x_\emptyset = 1$, is added for the convenience of writing the polynomials $p(x)$ and $q_i(x)$ in a matrix representation.
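
To make the matrix representation concrete, here is a small numerical check, assuming a hypothetical objective $p(x) = 2 + 3x_1 + x_1 x_2$ with $n = 2$ and $d = 1$ (so $x^{\le d}$ is indexed by the multisets $\emptyset, \{1\}, \{2\}$):

    import numpy as np

    # An arbitrary test point, and the monomial vector x^{<=d} = (1, x1, x2)
    # with the fixed constant entry x_emptyset = 1.
    x1, x2 = 0.7, -1.3
    v = np.array([1.0, x1, x2])

    # Coefficient matrix C for p(x) = 2 + 3*x1 + x1*x2, with the coefficient
    # of each off-diagonal monomial split symmetrically between (S,T) and (T,S).
    C = np.array([[2.0, 1.5, 0.0],
                  [1.5, 0.0, 0.5],
                  [0.0, 0.5, 0.0]])

    # The matrix inner product <C, x^{<=d} (x^{<=d})^T> recovers p(x).
    p = np.sum(C * np.outer(v, v))
    print(np.isclose(p, 2 + 3 * x1 + x1 * x2))  # True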

This program is generally non-convex, because the constraints (1) are not convex. One possible convex relaxation for this minimization problem uses semidefinite programming to replace the rank-one matrix of variables $x^{\le d} (x^{\le d})^\top$ with a positive-semidefinite matrix $X$: we index each monomial of size at most $2d$ by a multiset $S \subseteq [n]$ of at most $2d$ indices. For each such monomial, we create a variable $X_S$ in the program, and we arrange the variables $X_S$ to form the matrix $X \in \mathbb{R}^{[n]^{\le d} \times [n]^{\le d}}$, where $\mathbb{R}^{[n]^{\le d} \times [n]^{\le d}}$ is the set of real matrices whose rows and columns are identified with multisets of elements from $[n]$ of size at most $d$. We then write the following semidefinite program in the variables $X_S$:

$$\min_{X} \; \langle C, X \rangle$$

subject to:

$$\langle A_i, X \rangle = 0 \qquad \forall\, i \in [m],$$

$$X_\emptyset = 1,$$

$$X_{U \cup V} = X_{S \cup T} \qquad \forall\, U, V, S, T \subseteq [n] \text{ with } |U|, |V|, |S|, |T| \le d \text{ and } U \cup V = S \cup T,$$

$$X \succeq 0,$$

where again $C$ is the matrix of coefficients of the polynomial $p(x)$ that we want to minimize, and $A_i$ is the matrix of coefficients of the polynomial $q_i(x)$ encoding the $i$-th constraint on the subset $A$.

The third constraint ensures that the value of a monomial that appears several times within the matrix is equal throughout the matrix, and is added to make $X$ respect the symmetries present in the quadratic form $x^{\le d} (x^{\le d})^\top$.
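
As a sketch of this relaxation in code, consider minimizing $p(x) = x_1 x_2$ subject to $q(x) = x_1^2 + x_2^2 - 1 = 0$ with $d = 1$, again using CVXPY as an assumed modeling tool (for $d = 1$ the symmetry constraints are already implied by the symmetry of $X$):

    import cvxpy as cp
    import numpy as np

    # Moment matrix X indexed by the multisets {}, {1}, {2}  (n = 2, d = 1).
    X = cp.Variable((3, 3), symmetric=True)

    # C encodes the objective p(x) = x1*x2; A encodes q(x) = x1^2 + x2^2 - 1 = 0.
    C = np.array([[0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.5],
                  [0.0, 0.5, 0.0]])
    A = np.diag([-1.0, 1.0, 1.0])

    constraints = [X >> 0,                 # X positive semidefinite
                   X[0, 0] == 1,           # X_emptyset = 1
                   cp.trace(A @ X) == 0]   # <A, X> = 0
    prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
    prob.solve()
    print(prob.value)  # about -0.5, the true minimum of x1*x2 on the unit circle

In this small instance the relaxation happens to be tight: the optimal $X$ has rank one and corresponds to the minimizer $x = (1/\sqrt{2}, -1/\sqrt{2})$.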

Duality

One can take the dual of the above semidefinite program and obtain the following program:

$$\max_{y} \; y_0$$

subject to:

$$C - y_0\, e_\emptyset - \sum_{i \in [m]} y_i A_i - \sum_{U \cup V = S \cup T} y_{(U,V),(S,T)} \left( e_{U,V} - e_{S,T} \right) \succeq 0.$$

We have a variable $y_0$ corresponding to the constraint $X_\emptyset = 1$ (where $e_{U,V}$ denotes the matrix that is all zeros save for a one in the entry indexed by $(U,V)$, and $e_\emptyset = e_{\emptyset,\emptyset}$), a real variable $y_i$ for each polynomial constraint $\langle A_i, X \rangle = 0$, and for each group of multisets $U, V, S, T \subseteq [n]$ with $U \cup V = S \cup T$, a dual variable $y_{(U,V),(S,T)}$ for the symmetry constraint $X_{U \cup V} = X_{S \cup T}$. The positive-semidefiniteness constraint ensures that $p(x) - y_0$ is a sum of squares of polynomials over $A$: by a characterization of positive-semidefinite matrices, for any positive-semidefinite matrix $M$ we can write $M = \sum_i f_i f_i^\top$ for vectors $f_i$. Thus for any $x \in A$,

$$\begin{aligned} p(x) - y_0 &= p(x) - y_0 - \sum_{i \in [m]} y_i q_i(x) && \text{(since } x \in A\text{)} \\ &= (x^{\le d})^\top \Big( C - y_0\, e_\emptyset - \sum_{i \in [m]} y_i A_i - \sum_{U \cup V = S \cup T} y_{(U,V),(S,T)} ( e_{U,V} - e_{S,T} ) \Big)\, x^{\le d} \\ &= \sum_i \langle x^{\le d}, f_i \rangle^2 \;\ge\; 0, \end{aligned}$$

where we have identified each vector $f_i$ with the coefficients of a polynomial of degree at most $d$. This gives a sum-of-squares proof that $p(x) \ge y_0$ over $A$.
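
Numerically, the polynomials $f_i$ can be read off from an eigendecomposition of the PSD matrix returned by the dual SDP. A minimal sketch, assuming the certificate below arose from the earlier toy problem $p(x) = x^2 + 2x + 3$ with $y_0 = 2$:

    import numpy as np

    # Hypothetical PSD certificate for p(x) - y0 = x^2 + 2x + 1 over z = (1, x).
    M = np.array([[1.0, 1.0],
                  [1.0, 1.0]])

    # Factor M = sum_i f_i f_i^T via an eigendecomposition (valid since M >= 0).
    w, V = np.linalg.eigh(M)
    fs = [np.sqrt(lam) * V[:, i] for i, lam in enumerate(w) if lam > 1e-9]

    # Each f_i is the coefficient vector of f_i(x) = f_i[0] + f_i[1]*x, and
    # p(x) - y0 = sum_i f_i(x)^2 >= 0 at every point x.
    x = 0.4
    z = np.array([1.0, x])
    print(np.isclose(sum(float(z @ f) ** 2 for f in fs), x**2 + 2*x + 1))  # True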

The above can also be extended to regions $A \subseteq \mathbb{R}^n$ defined by polynomial inequalities.

Sum-of-squares hierarchy

The sum-of-squares hierarchy (SOS hierarchy), also known as the Lasserre hierarchy, is a hierarchy of convex relaxations of increasing power and increasing computational cost. For each natural number $d \in \mathbb{N}$, the corresponding convex relaxation is known as the $d$-th level or $d$-th round of the SOS hierarchy. The 1st round, when $d = 1$, corresponds to a basic semidefinite program, or to sum-of-squares optimization over polynomials of degree at most 2. To augment the basic convex program at the 1st level of the hierarchy to the $d$-th level, additional variables and constraints are added to have the program consider polynomials of degree at most $2d$.

The SOS hierarchy derives its name from the fact that the value of the objective function at the $d$-th level is bounded with a sum-of-squares proof using polynomials of degree at most $2d$ via the dual (see "Duality" above). Consequently, any sum-of-squares proof that uses polynomials of degree at most $2d$ can be used to bound the objective value, allowing one to prove guarantees on the tightness of the relaxation.

In conjunction with a theorem of Berg, this further implies that given sufficiently many rounds, the relaxation becomes arbitrarily tight on any fixed interval. Berg's result[5][6] states that every non-negative real polynomial within a bounded interval can be approximated within any accuracy $\varepsilon > 0$ on that interval by a sum of squares of real polynomials of sufficiently high degree. Thus, if $f(x)$ is the polynomial objective value as a function of the point $x$ and the inequality $f(x) - c \ge 0$ holds for all $x$ in the region of interest, then there must be a sum-of-squares proof of this fact. Choosing $c$ to be the minimum of the objective function over the feasible region gives the result.

Computational cost

When optimizing over a function in $n$ variables, the $d$-th level of the hierarchy can be written as a semidefinite program over $n^{O(d)}$ variables, and can be solved in time $n^{O(d)}$ using the ellipsoid method.
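
As a quick sanity check on this growth, the side length of the level-$d$ moment matrix is the number of monomials of degree at most $d$ in $n$ variables, which is $\binom{n+d}{d}$; a short computation (in Python, as an assumed choice of language):

    from math import comb

    def moment_matrix_size(n: int, d: int) -> int:
        """Number of monomials of degree <= d in n variables, i.e. the
        side length of the moment matrix at level d of the hierarchy."""
        return comb(n + d, d)

    # The matrix, and hence the SDP, grows as n^{O(d)}:
    for d in (1, 2, 3):
        print(d, moment_matrix_size(10, d))  # 11, 66, 286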

Sum-of-squares background

A polynomial $p$ is a sum of squares (SOS) if there exist polynomials $\{f_i\}_{i=1}^m$ such that $p = \sum_{i=1}^m f_i^2$. For example,

$$p = x^2 - 4xy + 7y^2$$

is a sum of squares since

$$p = f_1^2 + f_2^2,$$

where

$$f_1 = x - 2y \quad \text{and} \quad f_2 = \sqrt{3}\, y.$$

Note that if $p$ is a sum of squares then $p(x) \ge 0$ for all $x \in \mathbb{R}^n$. Detailed descriptions of polynomial SOS are available.[7][8][9]

Quadratic forms can be expressed as $p(x) = x^\top Q x$, where $Q$ is a symmetric matrix. Similarly, polynomials of degree at most $2d$ can be expressed as

$$p(x) = z(x)^\top Q\, z(x),$$

where the vector $z(x)$ contains all monomials of degree at most $d$. This is known as the Gram matrix form. An important fact is that $p$ is SOS if and only if there exists a symmetric and positive-semidefinite matrix $Q$ such that $p(x) = z(x)^\top Q z(x)$. This provides a connection between SOS polynomials and positive-semidefinite matrices.
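
For the example above, the Gram matrix test reduces to a small semidefinite feasibility problem; a minimal sketch, again assuming CVXPY as the modeling tool, with $z = (x, y)$:

    import cvxpy as cp

    # Search for a PSD Gram matrix Q with z^T Q z = x^2 - 4xy + 7y^2, z = (x, y).
    Q = cp.Variable((2, 2), symmetric=True)
    constraints = [
        Q >> 0,
        Q[0, 0] == 1,             # coefficient of x^2
        Q[0, 1] + Q[1, 0] == -4,  # coefficient of xy
        Q[1, 1] == 7,             # coefficient of y^2
    ]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print(prob.status)  # "optimal": a PSD Gram matrix exists, so p is SOS

Here the Gram matrix is in fact unique, $Q = \begin{pmatrix} 1 & -2 \\ -2 & 7 \end{pmatrix}$, and factoring it (for example by a Cholesky decomposition) recovers the decomposition $p = (x - 2y)^2 + 3y^2$.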

Software tools
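
Several freely available tools automate the translation of SOS programs into SDPs and pass the result to off-the-shelf semidefinite solvers; examples include SOSTOOLS for MATLAB and SumOfSquares.jl for Julia.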

References

  1. ^ Parrilo, Pablo A.; Thomas, Rekha R., eds. (2020). Sum of Squares: Theory and Applications. AMS Short Course, January 14–15, 2019, Baltimore, Maryland. Providence, Rhode Island: American Mathematical Society. ISBN 978-1-4704-5025-0. OCLC 1157604983.
  2. ^ Tan, W., Packard, A., 2004. "Searching for control Lyapunov functions using sums of squares programming". In: Allerton Conf. on Comm., Control and Computing. pp. 210–219.
  3. ^ Tan, W., Topcu, U., Seiler, P., Balas, G., Packard, A., 2008. Simulation-aided reachability and local gain analysis for nonlinear dynamical systems. In: Proc. of the IEEE Conference on Decision and Control. pp. 4097–4102.
  4. ^ A. Chakraborty, P. Seiler, and G. Balas, "Susceptibility of F/A-18 Flight Controllers to the Falling-Leaf Mode: Nonlinear Analysis," AIAA Journal of Guidance, Control, and Dynamics, vol. 34 no. 1 (2011), pp. 73–85.
  5. ^ Berg, Christian (1987). Landau, Henry J. (ed.). "The multidimensional moment problem and semigroups". Proceedings of Symposia in Applied Mathematics. 37: 110–124. doi:10.1090/psapm/037/921086. ISBN 9780821801147.
  6. ^ Lasserre, J. (2007-01-01). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review. 49 (4): 651–669. arXiv:math/0412398. doi:10.1137/070693709. ISSN 0036-1445.
  7. ^ Parrilo, P., (2000) Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization. Ph.D. thesis, California Institute of Technology.
  8. ^ Parrilo, P. (2003) "Semidefinite programming relaxations for semialgebraic problems". Mathematical Programming Ser. B 96 (2), 293–320.
  9. ^ Lasserre, J. (2001). "Global optimization with polynomials and the problem of moments". SIAM Journal on Optimization, 11 (3), 796–817.