## Electronic Journal of Statistics

### A consistency property of the AIC for multivariate linear models when the dimension and the sample size are large

#### Abstract

It is common knowledge that Akaike's information criterion (AIC) is not a consistent model selection criterion, whereas the Bayesian information criterion (BIC) is. These results have been established by evaluating the asymptotic selection probability under a large-sample framework. However, when the selection probability is evaluated under a high-dimensional asymptotic framework, in which the dimension of the response variables and the sample size both approach $\infty$, there are cases in which the AIC for selecting variables in multivariate linear models is consistent but the BIC is not. The AIC and BIC belong to a family of information criteria defined by adding a penalty term, expressing the complexity of the model, to minus twice the maximum log-likelihood. By clarifying the condition on the penalty term that ensures consistency, we derive conditions for the consistency of the AIC, the BIC, and other information criteria under the high-dimensional asymptotic framework.
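The family of criteria described above can be sketched numerically. The following is an illustrative implementation, not the authors' code: for the multivariate linear model $Y = XB + E$ with i.i.d. normal error rows, the maximized log-likelihood has a closed form, and the AIC and BIC differ only in the penalty applied to the free-parameter count $d$ (here taken as $pk + p(p+1)/2$ for $p$ responses and $k$ regressors). The function name `info_criterion` and the simulated data are assumptions for the sketch.

```python
import numpy as np

def info_criterion(Y, X, penalty):
    """Information criterion for the multivariate linear model Y = X B + E,
    where the rows of E are i.i.d. N_p(0, Sigma).  `penalty` maps the
    free-parameter count d to the penalty term (2*d for AIC, log(n)*d for BIC)."""
    n, p = Y.shape
    k = X.shape[1]
    B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # ML estimate of B
    resid = Y - X @ B_hat
    Sigma_hat = resid.T @ resid / n                 # ML estimate of Sigma
    # maximized log-likelihood of the multivariate normal model
    loglik = -0.5 * n * (p * np.log(2 * np.pi)
                         + np.linalg.slogdet(Sigma_hat)[1] + p)
    d = p * k + p * (p + 1) // 2                    # regression + covariance parameters
    return -2.0 * loglik + penalty(d)

# Simulated example: n = 100 observations, p = 3 responses, k = 3 regressors.
rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
B = rng.standard_normal((X.shape[1], p))
Y = X @ B + rng.standard_normal((n, p))

aic = info_criterion(Y, X, lambda d: 2 * d)            # AIC penalty
bic = info_criterion(Y, X, lambda d: np.log(n) * d)    # BIC penalty
```

Since $\log n > 2$ for $n \ge 8$, the BIC penalizes each parameter more heavily than the AIC here; the two criteria share the same likelihood term and differ exactly by $(\log n - 2)\,d$.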

#### Article information

Source
Electron. J. Statist., Volume 9, Number 1 (2015), 869-897.

Dates
First available in Project Euclid: 21 April 2015

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1429625725

Digital Object Identifier
doi:10.1214/15-EJS1022

Mathematical Reviews number (MathSciNet)
MR3338666

Zentralblatt MATH identifier
1328.62455

Subjects
Primary: 62J05: Linear regression
Secondary: 62E20: Asymptotic distribution theory

#### Citation

Yanagihara, Hirokazu; Wakaki, Hirofumi; Fujikoshi, Yasunori. A consistency property of the AIC for multivariate linear models when the dimension and the sample size are large. Electron. J. Statist. 9 (2015), no. 1, 869--897. doi:10.1214/15-EJS1022. https://projecteuclid.org/euclid.ejs/1429625725

#### References

• [1] Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In 2nd International Symposium on Information Theory (eds. B. N. Petrov and F. Csáki) 267–281. Akadémiai Kiadó, Budapest.
• [2] Akaike, H. (1974). A new look at the statistical model identification. IEEE Trans. Automatic Control AC-19 716–723.
• [3] Bedrick, E. J. and Tsai, C.-L. (1994). Model selection for multivariate regression in small samples. Biometrics 50 226–231.
• [4] Bosq, D. (2000). Linear Processes in Function Spaces. Theory and Applications. Springer-Verlag, New York.
• [5] Bosq, D. and Blanke, D. (2007). Inference and Prediction in Large Dimensions. John Wiley & Sons, Ltd., Paris.
• [6] Bozdogan, H. (1987). Model selection and Akaike’s information criterion (AIC): The general theory and its analytical extensions. Psychometrika 52 345–370.
• [7] Christakos, G. (2000). Modern Spatiotemporal Geostatistics. Oxford University Press, New York.
• [8] Cressie, N. and Wikle, C. K. (2011). Statistics for Spatio-Temporal Data. John Wiley & Sons, Inc., Hoboken.
• [9] Davies, S. J., Neath, A. A. and Cavanaugh, J. E. (2006). Estimation optimality of corrected AIC and modified $C_p$ in linear regression model. International Statist. Review 74 161–168.
• [10] Dien, S. J. V., Iwatani, S., Usuda, Y. and Matsui, K. (2006). Theoretical analysis of amino acid-producing Escherichia coli using a stoichiometric model and multivariate linear regression. J. Biosci. Bioeng. 102 34–40.
• [11] Fujikoshi, Y. (1983). A criterion for variable selection in multiple discriminant analysis. Hiroshima Math. J. 13 203–214.
• [12] Fujikoshi, Y. (1985). Selection of variables in two-group discriminant analysis by error rate and Akaike’s information criteria. J. Multivariate Anal. 17 27–37.
• [13] Fujikoshi, Y. and Sakurai, T. (2009). High-dimensional asymptotic expansions for the distributions of canonical correlations. J. Multivariate Anal. 100 231–242.
• [14] Fujikoshi, Y. and Satoh, K. (1997). Modified AIC and $C_p$ in multivariate linear regression. Biometrika 84 707–716.
• [15] Fujikoshi, Y. and Seo, T. (1998). Asymptotic approximations for EPMC’s of the linear and the quadratic discriminant functions when the sample sizes and the dimension are large. Random Oper. Stochastic Equations 6 269–280.
• [16] Fujikoshi, Y., Shimizu, R. and Ulyanov, V. V. (2010). Multivariate Statistics: High-Dimensional and Large-Sample Approximations. John Wiley & Sons, Inc., Hoboken, New Jersey.
• [17] Fujikoshi, Y., Yanagihara, H. and Wakaki, H. (2005). Bias corrections of some criteria for selecting multivariate linear regression models in a general case. Amer. J. Math. Management Sci. 25 221–258.
• [18] Harville, D. A. (1997). Matrix Algebra from a Statistician’s Perspective. Springer-Verlag, New York.
• [19] Kim, Y., Kwon, S. and Choi, H. (2012). Consistent model selection criteria on high dimensions. J. Mach. Learn. Res. 13 1037–1057.
• [20] Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. Ann. Math. Statist. 22 79–86.
• [21] Muirhead, R. J. (1982). Aspects of Multivariate Statistical Theory. John Wiley & Sons, Inc., New York.
• [22] Nishii, R. (1984). Asymptotic properties of criteria for selection of variables in multiple regression. Ann. Statist. 12 758–765.
• [23] Ramsay, J. O. and Silverman, B. W. (2005). Functional Data Analysis (2nd ed.). Springer, New York.
• [24] Sârbu, C., Onişor, C., Posa, M., Kevresan, S. and Kuhajda, K. (2008). Modeling and prediction (correction) of partition coefficients of bile acids and their derivatives by multivariate regression methods. Talanta 75 651–657.
• [25] Saxén, R. and Sundell, J. (2006). $^{137}$Cs in freshwater fish in Finland since 1986 – a statistical analysis with multivariate linear regression models. J. Environ. Radioactiv. 87 62–76.
• [26] Schwarz, G. (1978). Estimating the dimension of a model. Ann. Statist. 6 461–464.
• [27] Shao, J. (1997). An asymptotic theory for linear model selection. Statist. Sinica 7 221–264.
• [28] Shibata, R. (1976). Selection of the order of an autoregressive model by Akaike’s information criterion. Biometrika 63 117–126.
• [29] Shibata, R. (1980). Asymptotically efficient selection of the order of the model for estimating parameters of a linear process. Ann. Statist. 8 147–164.
• [30] Siotani, M., Hayakawa, T. and Fujikoshi, Y. (1985). Modern Multivariate Statistical Analysis: A Graduate Course and Handbook. American Sciences Press, Columbus, Ohio.
• [31] Srivastava, M. S. (2002). Methods of Multivariate Statistics. John Wiley & Sons, New York.
• [32] Sugiura, N. (1978). Further analysis of the data by Akaike’s information criterion and the finite corrections. Commun. Statist. Theory Methods A7 13–26.
• [33] Timm, N. H. (2002). Applied Multivariate Analysis. Springer-Verlag, New York.
• [34] Wakaki, H. (2006). Edgeworth expansion of Wilks’ lambda statistic. J. Multivariate Anal. 97 1958–1964.
• [35] Woodroofe, M. (1982). On model selection and the arc sine laws. Ann. Statist. 10 1182–1194.
• [36] Yamamura, M., Yanagihara, H. and Srivastava, M. S. (2010). Variable selection in multivariate linear regression models with fewer observations than the dimension. Japanese J. Appl. Statist. 39 1–19.
• [37] Yanagihara, H. (2006). Corrected version of AIC for selecting multivariate normal linear regression models in a general nonnormal case. J. Multivariate Anal. 97 1070–1089.
• [38] Yanagihara, H., Kamo, K. and Tonda, T. (2011). Second-order bias-corrected AIC in multivariate normal linear models under nonnormality. Canad. J. Statist. 39 126–146.
• [39] Yang, Y. (2005). Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation. Biometrika 92 937–950.
• [40] Yoshimoto, A., Yanagihara, H. and Ninomiya, Y. (2005). Finding factors affecting a forest stand growth through multivariate linear modeling. J. Jpn. For. Soc. 87 504–512 (in Japanese).