Electronic Journal of Statistics

Further asymptotic properties of the generalized information criterion

ChangJiang Xu and A. Ian McLeod



Asymptotic properties of the generalized information criterion for model selection are examined, and new conditions are derived under which the criterion overfits, is consistent, or underfits.
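To fix ideas, a common formulation of the generalized information criterion for a candidate model with k parameters is GIC(λ) = n log(RSS/n) + λk, where λ = 2 recovers AIC and λ = log n recovers BIC. The sketch below is purely illustrative (the variable names, data, and best-subset search are not from the paper) and shows how the penalty λ governs the selected model size:

```python
# Hypothetical sketch of GIC-based best-subset selection for linear
# regression. Data and function names are illustrative, not from the paper.
from itertools import combinations

import numpy as np


def gic(y, X, cols, lam):
    """GIC for the OLS fit on columns `cols`: n*log(RSS/n) + lam * |cols|."""
    n = len(y)
    Xs = X[:, list(cols)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return n * np.log(rss / n) + lam * len(cols)


def select(y, X, lam):
    """Return the nonempty column subset minimizing GIC."""
    p = X.shape[1]
    subsets = [c for k in range(1, p + 1) for c in combinations(range(p), k)]
    return min(subsets, key=lambda c: gic(y, X, c, lam))


# Simulated data: only columns 0 and 1 enter the true model.
rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 4))
y = 2.0 * X[:, 0] + 3.0 * X[:, 1] + 0.5 * rng.standard_normal(n)

aic_choice = select(y, X, lam=2.0)        # lambda = 2 corresponds to AIC
bic_choice = select(y, X, lam=np.log(n))  # lambda = log n corresponds to BIC
```

A larger penalty λ can only shrink (never grow) the size of the minimizing subset, which is the mechanism behind the overfitting/consistency/underfitting trichotomy studied in the paper: λ fixed (as in AIC) leaves a nonvanishing overfitting probability, while λ → ∞ with λ/n → 0 (as in BIC) yields consistency.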

Article information

Electron. J. Statist., Volume 6 (2012), 656-663.

First available in Project Euclid: 18 April 2012

Permanent link to this document: https://projecteuclid.org/euclid.ejs/1334754009

Digital Object Identifier: doi:10.1214/12-EJS685


Primary: 62J02: General nonlinear regression
Secondary: 62J12: Generalized linear models

Keywords: variable selection; model selection; information criterion; consistency


Xu, ChangJiang; McLeod, A. Ian. Further asymptotic properties of the generalized information criterion. Electron. J. Statist. 6 (2012), 656--663. doi:10.1214/12-EJS685. https://projecteuclid.org/euclid.ejs/1334754009


