Electronic Journal of Statistics

On the existence of the weighted bridge penalized Gaussian likelihood precision matrix estimator

Adam J. Rothman and Liliana Forzani

Full-text: Open access

Abstract

We establish a necessary and sufficient condition for the existence of the precision matrix estimator obtained by minimizing the negative Gaussian log-likelihood plus a weighted bridge penalty. This condition enables us to connect the literature on Gaussian graphical models to the literature on penalized Gaussian likelihood.
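The estimator referred to in the abstract can be sketched as the minimizer of a penalized negative log-likelihood; the display below uses standard notation (sample covariance \(S\), penalty weights \(\lambda_{ij} \ge 0\), bridge exponent \(q > 0\)) as an assumption about the paper's setup, not a quotation from it:

```latex
\[
\hat{\Omega}
\;=\;
\operatorname*{arg\,min}_{\Omega \succ 0}
\left\{
  \operatorname{tr}(S\Omega) \;-\; \log\det\Omega
  \;+\; \sum_{i,j} \lambda_{ij}\,\lvert \omega_{ij} \rvert^{q}
\right\}
\]
```

Here \(q = 1\) recovers a weighted graphical-lasso penalty and \(q = 2\) a ridge-type penalty; the existence question is whether this minimizer is attained when \(S\) is singular, as in high dimensions.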

Article information

Source
Electron. J. Statist., Volume 8, Number 2 (2014), 2693-2700.

Dates
First available in Project Euclid: 22 December 2014

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1419258191

Digital Object Identifier
doi:10.1214/14-EJS973

Mathematical Reviews number (MathSciNet)
MR3292954

Zentralblatt MATH identifier
1309.62099

Subjects
Primary: 62H12: Estimation
Secondary: 62H20: Measures of association (correlation, canonical correlation, etc.)

Keywords
High-dimensional data, precision matrix, ridge penalty, sparsity

Citation

Rothman, Adam J.; Forzani, Liliana. On the existence of the weighted bridge penalized Gaussian likelihood precision matrix estimator. Electron. J. Statist. 8 (2014), no. 2, 2693--2700. doi:10.1214/14-EJS973. https://projecteuclid.org/euclid.ejs/1419258191
