# Linear-Time Algorithm for Learning Large-Scale Sparse Graphical Models.

The sparse inverse covariance estimation problem is commonly solved using an $\ell_{1}$-regularized Gaussian maximum likelihood estimator known as "graphical lasso", but its computational cost becomes prohibitive for large data sets. A recent line of results showed that, under mild assumptions, the graphical lasso estimator can be retrieved by soft-thresholding the sample covariance matrix and solving a maximum determinant matrix completion (MDMC) problem. This paper proves an extension of this result, and describes a Newton-CG algorithm to efficiently solve the MDMC problem. Assuming that the thresholded sample covariance matrix is sparse with a sparse Cholesky factorization, we prove that the algorithm converges to an $\epsilon$-accurate solution in $O(n\log(1/\epsilon))$ time and $O(n)$ memory. The algorithm is highly efficient in practice: we solve the associated MDMC problems with as many as 200,000 variables to 7-9 digits of accuracy in less than an hour on a standard laptop computer running MATLAB.
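The first step of the pipeline described above, soft-thresholding the sample covariance matrix, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name, the choice to threshold only off-diagonal entries, and the tie between the threshold level and the $\ell_1$-regularization weight `lam` are assumptions for the example.

```python
import numpy as np

def soft_threshold_covariance(X, lam):
    """Soft-threshold the sample covariance of data X (illustrative sketch).

    X   : (m, n) array, m samples of n variables.
    lam : threshold level, assumed here to match the l1-regularization weight.
    """
    S = np.cov(X, rowvar=False)                         # sample covariance
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)   # elementwise soft-threshold
    np.fill_diagonal(T, np.diag(S))                     # leave the diagonal intact
    return T
```

The resulting matrix `T` is typically sparse when `lam` is large enough, which is what makes the subsequent MDMC step tractable: only the sparsity pattern of `T` enters the matrix completion problem.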