Open Access
June 2010 Approximation of conditional densities by smooth mixtures of regressions
Andriy Norets
Ann. Statist. 38(3): 1733-1766 (June 2010). DOI: 10.1214/09-AOS765


This paper shows that large nonparametric classes of conditional multivariate densities can be approximated in the Kullback–Leibler distance by different specifications of finite mixtures of normal regressions, in which the normal means, variances, and mixing probabilities can depend on variables in the conditioning set (covariates). These models are a special case of the models known as "mixtures of experts" in the statistics and computer science literature. Flexible specifications include models in which only the mixing probabilities, modeled by a multinomial logit, depend on the covariates and, in the univariate case, models in which only the means of the mixed normals depend flexibly on the covariates. Modeling the variance of the mixed normals by flexible functions of the covariates can weaken restrictions on the class of approximable densities. The obtained results can be generalized to mixtures of general location–scale densities. Rates of convergence and easy-to-interpret bounds are also obtained for different model specifications. These approximation results can be useful for proving consistency of Bayesian and maximum likelihood density estimators based on these models. The results also have interesting implications for applied researchers.
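To make the model class concrete, the following is a minimal sketch (not the paper's code; all parameter names and values are hypothetical) of evaluating the conditional density of a finite mixture of normal regressions with logit mixing weights, the specification described in the abstract, for a scalar covariate:

```python
import math

def normal_pdf(y, mu, sigma):
    """Density of N(mu, sigma^2) at y."""
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def softmax(zs):
    """Multinomial-logit mixing probabilities from component logits."""
    m = max(zs)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

def mixture_density(y, x, components):
    """Conditional density p(y | x) of a smooth mixture of normal regressions.

    Each component is a hypothetical tuple (g0, g1, b0, b1, sigma):
    mixing logit g0 + g1*x, mean b0 + b1*x, and standard deviation sigma.
    In richer specifications sigma could also be a function of x.
    """
    weights = softmax([g0 + g1 * x for g0, g1, _, _, _ in components])
    return sum(w * normal_pdf(y, b0 + b1 * x, s)
               for w, (_, _, b0, b1, s) in zip(weights, components))

# Two illustrative components: mixing weights and means both shift with x.
comps = [(0.0,  1.0, -1.0,  0.5, 0.3),
         (0.0, -1.0,  1.0, -0.5, 0.3)]
density_at_origin = mixture_density(0.0, 0.5, comps)
```

Since each mixed normal integrates to one and the logit weights sum to one, `mixture_density` is a proper conditional density in `y` for every fixed `x`, which is what makes such specifications convenient building blocks for Bayesian conditional density estimation.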



Published: June 2010
First available in Project Euclid: 24 March 2010

zbMATH: 1189.62060
MathSciNet: MR2662358
Digital Object Identifier: 10.1214/09-AOS765

Primary: 62G07
Secondary: 41A30

Keywords: Bayesian conditional density estimation, finite mixtures of normal distributions, mixtures of experts, smoothly mixing regressions

Rights: Copyright © 2010 Institute of Mathematical Statistics

