Open Access
Profile-kernel likelihood inference with diverging number of parameters
Clifford Lam, Jianqing Fan
Ann. Statist. 36(5): 2232-2260 (October 2008). DOI: 10.1214/07-AOS544

Abstract

The generalized varying coefficient partially linear model with a growing number of predictors arises in many contemporary scientific endeavors. In this paper we address both the theoretical and practical sides of profile likelihood estimation and inference. When the number of parameters grows with the sample size, the existence and asymptotic normality of the profile likelihood estimator are established under regularity conditions. Profile likelihood ratio inference for the growing number of parameters is proposed and the Wilks phenomenon is demonstrated. A new algorithm, called the accelerated profile-kernel algorithm, for computing the profile-kernel estimator is proposed and investigated. Simulation studies show that the resulting estimates are as efficient as the fully iterative profile-kernel estimates. For moderate sample sizes, our proposed procedure saves substantial computational time over the fully iterative profile-kernel one and gives more stable estimates. A real data set is analyzed using our proposed algorithm.
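For orientation, the model class and the profile likelihood idea mentioned in the abstract can be sketched as follows; the notation here is illustrative and may differ from the symbols used in the paper itself.

```latex
% Minimal sketch (assumed notation, not taken verbatim from the paper):
% generalized varying coefficient partially linear model with a known link g,
% unknown coefficient functions alpha(.), and a parametric part beta whose
% dimension p_n grows with the sample size n.
\[
  g\bigl\{\mathbb{E}(Y \mid U, \mathbf{X}, \mathbf{Z})\bigr\}
  = \mathbf{X}^{\top}\boldsymbol{\alpha}(U) + \mathbf{Z}^{\top}\boldsymbol{\beta},
  \qquad \dim(\boldsymbol{\beta}) = p_n \to \infty \ \text{as } n \to \infty.
\]
% Profile likelihood: for each fixed beta, estimate alpha(.) by kernel
% (local likelihood) smoothing, giving \hat{alpha}_beta(.), then maximize
% the profiled likelihood over beta alone.
\[
  \hat{\boldsymbol{\beta}}
  = \arg\max_{\boldsymbol{\beta}} \sum_{i=1}^{n}
    \ell\Bigl( g^{-1}\bigl\{\mathbf{X}_i^{\top}
    \hat{\boldsymbol{\alpha}}_{\boldsymbol{\beta}}(U_i)
    + \mathbf{Z}_i^{\top}\boldsymbol{\beta}\bigr\},\, Y_i \Bigr).
\]
```

The accelerated profile-kernel algorithm studied in the paper is a computational scheme for this optimization that avoids fully re-iterating the kernel step at every update of the parametric component.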

Citation


Clifford Lam, Jianqing Fan. "Profile-kernel likelihood inference with diverging number of parameters." Ann. Statist. 36(5): 2232-2260, October 2008. https://doi.org/10.1214/07-AOS544

Information

Published: October 2008
First available in Project Euclid: 13 October 2008

zbMATH: 1274.62289
MathSciNet: MR2458186
Digital Object Identifier: 10.1214/07-AOS544

Subjects:
Primary: 62G08
Secondary: 62F12, 62J12

Keywords: asymptotic normality, generalized likelihood ratio tests, generalized linear models, high dimensionality, profile likelihood, varying coefficients

Rights: Copyright © 2008 Institute of Mathematical Statistics
