Inexact proximal Newton methods for self-concordant functions

Jinchao Li, Martin Skovgaard Andersen, Lieven Vandenberghe

Research output: Contribution to journal — Journal article (peer-reviewed)



We analyze the proximal Newton method for minimizing a sum of a self-concordant function and a convex function with an inexpensive proximal operator. We present new results on the global and local convergence of the method when inexact search directions are used. The method is illustrated with an application to L1-regularized covariance selection, in which prior constraints on the sparsity pattern of the inverse covariance matrix are imposed. In the numerical experiments the proximal Newton steps are computed by an accelerated proximal gradient method, and multifrontal algorithms for positive definite matrices with chordal sparsity patterns are used to evaluate gradients and matrix-vector products with the Hessian of the smooth component of the objective.
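As a rough illustration of the scheme the abstract describes, the sketch below performs inexact proximal Newton steps in which each scaled proximal subproblem is solved approximately by an accelerated proximal gradient (FISTA) loop. This is a minimal sketch under stated assumptions: the smooth component is a simple least-squares term standing in for the paper's self-concordant log-det function, the nonsmooth term is an L1 penalty (whose proximal operator is soft-thresholding), a full outer step is taken instead of the damped or backtracking step the analysis would require, and all function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inexact_prox_newton_step(grad, hess_mv, L, x, lam, n_inner=200):
    """Approximately minimize the local model
         m(d) = grad^T d + 0.5 d^T H d + lam * ||x + d||_1
    with FISTA; L must upper-bound the largest eigenvalue of H."""
    d = np.zeros_like(x)
    y = d.copy()
    t = 1.0
    for _ in range(n_inner):
        g = grad + hess_mv(y)                   # gradient of the quadratic model at y
        # Prox step on h(d) = lam * ||x + d||_1 with step size 1/L.
        d_next = soft_threshold(x + y - g / L, lam / L) - x
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = d_next + ((t - 1.0) / t_next) * (d_next - d)  # momentum extrapolation
        d, t = d_next, t_next
    return d

# Demo on a small l1-regularized least-squares problem (a stand-in for
# the paper's self-concordant smooth component).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.5
x = np.zeros(10)

f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)
F = lambda z: f(z) + lam * np.sum(np.abs(z))    # full composite objective
H = A.T @ A
L = np.linalg.norm(H, 2)                        # spectral norm bounds H's eigenvalues

for _ in range(5):
    grad = A.T @ (A @ x - b)
    d = inexact_prox_newton_step(grad, lambda v: H @ v, L, x, lam)
    x = x + d                                   # full step for simplicity

print(F(x))
```

Since the inner FISTA loop is truncated, each outer step uses an inexact search direction, which is the regime the paper's convergence results address; in the authors' experiments the Hessian-vector products come from multifrontal algorithms exploiting chordal sparsity rather than a dense matrix as here.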
Original language: English
Journal: Mathematical Methods of Operations Research
Issue number: 1
Pages (from-to): 19–41
Publication status: Published - 2016


Keywords:
  • Proximal Newton method
  • Self-concordance
  • Convex optimization
  • Chordal sparsity
  • Covariance selection

