Abstract
We analyze the proximal Newton method for minimizing a sum of a self-concordant function and a convex function with an inexpensive proximal operator. We present new results on the global and local convergence of the method when inexact search directions are used. The method is illustrated with an application to L1-regularized covariance selection, in which prior constraints on the sparsity pattern of the inverse covariance matrix are imposed. In the numerical experiments the proximal Newton steps are computed by an accelerated proximal gradient method, and multifrontal algorithms for positive definite matrices with chordal sparsity patterns are used to evaluate gradients and matrix-vector products with the Hessian of the smooth component of the objective.
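The proximal Newton framework sketched in the abstract can be illustrated with a small toy example (not the paper's implementation): one inexact proximal Newton direction for a composite objective f(x) + λ‖x‖₁, where the scaled quadratic subproblem is solved approximately by plain proximal gradient iterations. The paper uses an accelerated proximal gradient method and a log-det objective with chordal sparsity; the function names and the simple quadratic f below are purely illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_direction(x, grad_f, hess_f, lam, inner_iters=100):
    """One inexact proximal Newton direction for f(x) + lam * ||x||_1.

    The subproblem
        minimize_z  grad_f(x)^T (z - x) + 0.5 (z - x)^T H (z - x) + lam * ||z||_1
    is solved approximately by (non-accelerated) proximal gradient steps.
    """
    g = grad_f(x)
    H = hess_f(x)
    L = np.linalg.norm(H, 2)  # step size 1/L from the spectral norm of H
    z = x.copy()
    for _ in range(inner_iters):
        model_grad = g + H @ (z - x)          # gradient of the quadratic model at z
        z = soft_threshold(z - model_grad / L, lam / L)
    return z - x                               # inexact search direction

# Toy smooth term: f(x) = 0.5 * ||x - b||^2, so grad = x - b and H = I.
b = np.array([3.0, -0.5, 1.0])
grad_f = lambda x: x - b
hess_f = lambda x: np.eye(len(x))

x0 = np.zeros(3)
d = prox_newton_direction(x0, grad_f, hess_f, lam=1.0)
x1 = x0 + d  # with H = I this recovers soft_threshold(b, 1.0) = [2., 0., 0.]
```

Because the toy Hessian is the identity, the subproblem is solved exactly in one inner iteration and a unit step lands on the minimizer of f(x) + ‖x‖₁; in the paper's setting the Hessian of the log-det barrier is nontrivial, which is why inexact inner solves and the self-concordance-based analysis matter.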
| Original language | English |
| --- | --- |
| Journal | Mathematical Methods of Operations Research |
| Volume | 85 |
| Issue number | 1 |
| Pages (from-to) | 19–41 |
| ISSN | 1432-2994 |
| DOIs | |
| Publication status | Published - 2016 |
Keywords
- Proximal Newton method
- Self-concordance
- Convex optimization
- Chordal sparsity
- Covariance selection