Correlative sparsity in primal-dual interior-point methods for LP, SDP, and SOCP

Kazuhiro Kobayashi, Sunyoung Kim, Masakazu Kojima

Research output: Contribution to journal › Article › peer-review

21 Scopus citations


Exploiting sparsity has been a key issue in solving large-scale optimization problems. The most time-consuming part of primal-dual interior-point methods for linear programs, second-order cone programs, and semidefinite programs is solving the Schur complement equation at each iteration, usually by Cholesky factorization. Computational efficiency is greatly affected by the sparsity of the coefficient matrix of this equation, which is determined by the sparsity of the optimization problem (linear program, semidefinite program, or second-order cone program). We show that if an optimization problem is correlatively sparse, then the coefficient matrix of the Schur complement equation inherits the sparsity, and a sparse Cholesky factorization applied to the matrix results in no fill-in.
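The no-fill-in property rests on a classical fact about chordal sparsity patterns: if the graph of the matrix's nonzero pattern is chordal and the rows are suitably ordered, Cholesky factorization introduces no new nonzeros. The following sketch (not from the paper; the matrices are hypothetical examples) contrasts a tridiagonal pattern, whose graph is a path and hence chordal, with a 4-cycle pattern, which is the smallest non-chordal graph and does produce fill-in.

```python
import numpy as np

def fill_in(A):
    """Count entries of the Cholesky factor L that are nonzero
    outside the lower triangle of A's own sparsity pattern."""
    L = np.linalg.cholesky(A)
    pattern_A = np.tril(A != 0)       # lower-triangular pattern of A
    return int(np.logical_and(L != 0, ~pattern_A).sum())

# Chordal example: tridiagonal (path graph) pattern, SPD by diagonal dominance.
n = 6
A_chordal = 2.0 * np.eye(n)
for i in range(n - 1):
    A_chordal[i, i + 1] = A_chordal[i + 1, i] = -1.0

# Non-chordal example: pattern of a 4-cycle 0-1-2-3-0.
A_cycle = 3.0 * np.eye(4)
for i, j in [(0, 1), (1, 2), (2, 3), (0, 3)]:
    A_cycle[i, j] = A_cycle[j, i] = -1.0

fill_chordal = fill_in(A_chordal)   # no fill-in for the chordal pattern
fill_cycle = fill_in(A_cycle)       # eliminating node 0 links nodes 1 and 3
print("chordal fill-in:", fill_chordal)
print("cycle fill-in:  ", fill_cycle)
```

In the cycle example, eliminating vertex 0 connects its neighbors 1 and 3, creating a nonzero at position (3, 1) that is absent from the original pattern; in the chordal (path) pattern no such connection arises, mirroring the abstract's claim for correlatively sparse problems.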

Original language: English
Pages (from-to): 69-88
Number of pages: 20
Journal: Applied Mathematics and Optimization
Issue number: 1
State: Published - Aug 2008

Bibliographical note

Funding Information:
S. Kim’s research was supported by Kosef R01-2005-000-10271-0 and KRF-2006-312-C00062.


Keywords

  • Chordal graph
  • Correlative sparsity
  • Linear program
  • Partial separability
  • Primal-dual interior-point method
  • Second-order cone program
  • Semidefinite program


