Bernoulli
Volume 23, Number 1 (2017), 379-404.

Posterior asymptotics of nonparametric location-scale mixtures for multivariate density estimation

Antonio Canale and Pierpaolo De Blasi


Abstract

Density estimation represents one of the most successful applications of Bayesian nonparametrics. In particular, Dirichlet process mixtures of normals are the gold standard for density estimation, and their asymptotic properties have been studied extensively, especially in the univariate case. However, a gap persists between practice and the current theoretical literature. So far, posterior asymptotic results in the multivariate case are available only for location mixtures of Gaussian kernels with an independent prior on the common covariance matrix, whereas in practice, as well as from a conceptual point of view, a location-scale mixture is often preferable. In this paper, we address posterior consistency for such general mixture models by adapting a convergence rate result which combines the usual low-entropy, high-mass sieve approach with a suitable summability condition. Specifically, we establish consistency for Dirichlet process mixtures of Gaussian kernels under various prior specifications on the covariance matrix. Posterior convergence rates are also discussed.
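
For concreteness, here is a minimal sketch, in standard notation rather than taken verbatim from the article, of the model class the abstract refers to. A Dirichlet process location-scale mixture of multivariate Gaussians models the unknown density as

$$ f_P(y) = \int \phi_d(y; \mu, \Sigma)\, \mathrm{d}P(\mu, \Sigma), \qquad P \sim \mathrm{DP}(\alpha P_0), $$

where $\phi_d(\cdot; \mu, \Sigma)$ is the $d$-dimensional normal density with mean $\mu$ and covariance $\Sigma$, $P$ is a random mixing measure on location-scale pairs $(\mu, \Sigma)$, and the base measure $P_0$ encodes the prior on $\Sigma$. By contrast, the location mixtures covered by earlier multivariate results take $f(y) = \int \phi_d(y; \mu, \Sigma)\, \mathrm{d}P(\mu)$ with an independent prior on a single common covariance matrix $\Sigma$.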

Article information

Source
Bernoulli, Volume 23, Number 1 (2017), 379-404.

Dates
Received: July 2014
Revised: June 2015
First available in Project Euclid: 27 September 2016

Permanent link to this document
https://projecteuclid.org/euclid.bj/1475001358

Digital Object Identifier
doi:10.3150/15-BEJ746

Mathematical Reviews number (MathSciNet)
MR3556776

Zentralblatt MATH identifier
06673481

Keywords
Bayesian nonparametrics; density estimation; Dirichlet mixture; factor model; posterior asymptotics; sparse random eigenmatrices

Citation

Canale, Antonio; De Blasi, Pierpaolo. Posterior asymptotics of nonparametric location-scale mixtures for multivariate density estimation. Bernoulli 23 (2017), no. 1, 379--404. doi:10.3150/15-BEJ746. https://projecteuclid.org/euclid.bj/1475001358.


