The Annals of Statistics

Nonasymptotic bounds for vector quantization in Hilbert spaces

Clément Levrard

Abstract

Recent results in quantization theory show that the mean-squared expected distortion can reach a convergence rate of $\mathcal{O}(1/n)$, where $n$ is the sample size [see, e.g., IEEE Trans. Inform. Theory 60 (2014) 7279–7292 or Electron. J. Stat. 7 (2013) 1716–1746]. This rate is attained by the empirical risk minimizer strategy, provided the source distribution satisfies some regularity conditions. However, the dependence of the average distortion on other parameters is not known, and these results are valid only for distributions over finite-dimensional Euclidean spaces.
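To fix ideas, the display below recalls the standard $k$-point quantization framework underlying the works cited above; the notation is a generic sketch, not the paper's exact formulation. Here $X, X_1, \dots, X_n$ are i.i.d. with source distribution $P$, and $\mathbf{c} = (c_1, \dots, c_k)$ denotes a codebook of $k$ code points:

\[
R(\mathbf{c}) = \mathbb{E}\,\min_{1 \le j \le k} \|X - c_j\|^2
\qquad \text{(distortion of the codebook } \mathbf{c}\text{)},
\]
\[
\hat{\mathbf{c}}_n \in \operatorname*{arg\,min}_{\mathbf{c}}\; \frac{1}{n}\sum_{i=1}^{n} \min_{1 \le j \le k} \|X_i - c_j\|^2
\qquad \text{(empirical risk minimizer)}.
\]

The $\mathcal{O}(1/n)$ rate refers to the excess distortion $\mathbb{E}\,R(\hat{\mathbf{c}}_n) - \inf_{\mathbf{c}} R(\mathbf{c})$, that is, the price paid for designing the quantizer on a sample of size $n$ rather than on $P$ itself.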

This paper deals with the general case of distributions over separable, possibly infinite-dimensional, Hilbert spaces. A condition is proposed, which may be thought of as a margin condition [see, e.g., Ann. Statist. 27 (1999) 1808–1829], under which a nonasymptotic upper bound on the expected distortion rate of the empirically optimal quantizer is derived. The dependence of the distortion on other parameters of the distribution is then discussed, in particular through a minimax lower bound.
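For comparison, one common formulation of the margin condition in binary classification [see, e.g., Ann. Statist. 27 (1999) 1808–1829] is recalled below; this is background from the cited reference, not the condition introduced in the paper, which adapts the same idea to the quantization setting:

\[
\forall t \ge 0, \qquad P\bigl(0 < |\eta(X) - 1/2| \le t\bigr) \le C\,t^{\alpha},
\]

where $\eta(x) = P(Y = 1 \mid X = x)$ is the regression function and $\alpha > 0$ is the margin exponent; larger $\alpha$ means less mass near the decision boundary and yields faster rates of convergence.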

Article information

Source
Ann. Statist., Volume 43, Number 2 (2015), 592–619.

Dates
First available in Project Euclid: 24 February 2015

Permanent link to this document
https://projecteuclid.org/euclid.aos/1424787429

Digital Object Identifier
doi:10.1214/14-AOS1293

Mathematical Reviews number (MathSciNet)
MR3316191

Zentralblatt MATH identifier
1314.62143

Subjects
Primary: 62H30: Classification and discrimination; cluster analysis [See also 68T10, 91C20]

Keywords
Quantization; localization; fast rates; margin conditions

Citation

Levrard, Clément. Nonasymptotic bounds for vector quantization in Hilbert spaces. Ann. Statist. 43 (2015), no. 2, 592–619. doi:10.1214/14-AOS1293. https://projecteuclid.org/euclid.aos/1424787429

References

  • [1] Antos, A. (2005). Improved minimax bounds on the test and training distortion of empirically designed vector quantizers. IEEE Trans. Inform. Theory 51 4022–4032.
  • [2] Antos, A., Györfi, L. and György, A. (2005). Individual convergence rates in empirical vector quantizer design. IEEE Trans. Inform. Theory 51 4013–4022.
  • [3] Auder, B. and Fischer, A. (2012). Projection-based curve clustering. J. Stat. Comput. Simul. 82 1145–1168.
  • [4] Bartlett, P. L., Linder, T. and Lugosi, G. (1998). The minimax distortion redundancy in empirical quantizer design. IEEE Trans. Inform. Theory 44 1802–1813.
  • [5] Bartlett, P. L. and Mendelson, S. (2002). Rademacher and Gaussian complexities: Risk bounds and structural results. J. Mach. Learn. Res. 3 463–482.
  • [6] Biau, G., Devroye, L. and Lugosi, G. (2008). On the performance of clustering in Hilbert spaces. IEEE Trans. Inform. Theory 54 781–790.
  • [7] Blanchard, G., Bousquet, O. and Massart, P. (2008). Statistical performance of support vector machines. Ann. Statist. 36 489–531.
  • [8] Brezis, H. (2011). Functional Analysis, Sobolev Spaces and Partial Differential Equations. Springer, New York.
  • [9] Cañas, G. D., Poggio, T. and Rosasco, L. (2012). Learning manifolds with $k$-means and $k$-flats. CoRR abs/1209.1121.
  • [10] Chichignoud, M. and Loustau, S. (2014). Adaptive noisy clustering. IEEE Trans. Inform. Theory 60 7279–7292.
  • [11] Chou, P. A. (1994). The distortion of vector quantizers trained on $n$ vectors decreases to the optimum as $\mathcal{O}_p(1/n)$. In Proc. IEEE Int. Symp. Inf. Theory. IEEE, Trondheim.
  • [12] Fischer, A. (2010). Quantization and clustering with Bregman divergences. J. Multivariate Anal. 101 2207–2221.
  • [13] Gersho, A. and Gray, R. M. (1991). Vector Quantization and Signal Compression. Kluwer Academic, Norwell, MA.
  • [14] Graf, S. and Luschgy, H. (2000). Foundations of Quantization for Probability Distributions. Lecture Notes in Math. 1730. Springer, Berlin.
  • [15] Graf, S., Luschgy, H. and Pagès, G. (2007). Optimal quantizers for Radon random vectors in a Banach space. J. Approx. Theory 144 27–53.
  • [16] Koltchinskii, V. (2006). Local Rademacher complexities and oracle inequalities in risk minimization. Ann. Statist. 34 2593–2656.
  • [17] Ledoux, M. and Talagrand, M. (1991). Probability in Banach Spaces: Isoperimetry and Processes. Ergebnisse der Mathematik und Ihrer Grenzgebiete (3) [Results in Mathematics and Related Areas (3)] 23. Springer, Berlin.
  • [18] Levrard, C. (2013). Fast rates for empirical vector quantization. Electron. J. Stat. 7 1716–1746.
  • [19] Levrard, C. (2015). Supplement to “Nonasymptotic bounds for vector quantization in Hilbert spaces.” DOI:10.1214/14-AOS1293SUPP.
  • [20] Linder, T. (2002). Learning-theoretic methods in vector quantization. In Principles of Nonparametric Learning (Udine, 2001). CISM Courses and Lectures 434 163–210. Springer, Vienna.
  • [21] Linder, T., Lugosi, G. and Zeger, K. (1994). Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding. IEEE Trans. Inform. Theory 40 1728–1740.
  • [22] Mammen, E. and Tsybakov, A. B. (1999). Smooth discrimination analysis. Ann. Statist. 27 1808–1829.
  • [23] Massart, P. (2007). Concentration Inequalities and Model Selection. Lecture Notes in Math. 1896. Springer, Berlin.
  • [24] Massart, P. and Nédélec, É. (2006). Risk bounds for statistical learning. Ann. Statist. 34 2326–2366.
  • [25] Pollard, D. (1982). A central limit theorem for $k$-means clustering. Ann. Probab. 10 919–926.
  • [26] Tsybakov, A. B. (2009). Introduction to Nonparametric Estimation. Springer, New York. Revised and extended from the 2004 French original. Translated by Vladimir Zaiats.

Supplemental materials

Supplement to “Nonasymptotic bounds for vector quantization in Hilbert spaces,” DOI:10.1214/14-AOS1293SUPP; see reference [19].