We consider a linear model where the coefficients – intercept and slopes – are random and independent of the regressors. The law of the coefficients is nonparametric. Without further restrictions, nonparametric identification requires the support of the regressors to be the whole space, which is hardly ever the case in practice. Regressors with limited variation can be handled when the coefficients have compact support, but this is incompatible with the unbounded error terms usual in regression models. In this paper, the regressors can have a support which is a proper subset, while the slopes are not allowed to have heavy tails. Lower bounds on the minimax risk for the estimation of the joint density of the random coefficients are obtained for a wide range of smoothness classes; some allow for polynomial and nearly parametric rates of convergence. We present a minimax optimal estimator and a data-driven rule for adaptive estimation. A package is available to implement this estimator.
The authors acknowledge financial support from the grants ERC POEMH 337665 and ANR-17-EURE-0010. Christophe Gaillac thanks CREST/ENSAE, where this research was partly conducted. The authors are grateful to seminar participants at Berkeley, Brown, CREST, Duke, Harvard-MIT, Rice, TSE, ULB, and the University of Tokyo, and to participants of the 2016 SFDS, ISNPS, and Recent Advances in Econometrics conferences and the 2017 IAAE conference for their comments.
"Adaptive estimation in the linear random coefficients model when regressors have limited variation." Bernoulli 28(1), 504–524, February 2022. https://doi.org/10.3150/21-BEJ1354