Electronic Communications in Probability

Concentration inequalities for order statistics

Stéphane Boucheron and Maud Thomas


This note describes non-asymptotic variance and tail bounds for order statistics of samples of independent, identically distributed random variables. When the sampling distribution belongs to a maximum domain of attraction, these bounds are shown to be asymptotically tight. When the sampling distribution has a non-decreasing hazard rate, we derive an exponential Efron-Stein inequality for order statistics, that is, an inequality connecting the logarithmic moment generating function of order statistics with exponential moments of Efron-Stein (jackknife) estimates of variance. This connection is used to derive variance and tail bounds for order statistics of Gaussian samples that are not within the scope of the Gaussian concentration inequality. The proofs are elementary and combine Rényi's representation of order statistics with the entropy approach to concentration of measure popularized by M. Ledoux.
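The Efron-Stein (jackknife) variance estimate mentioned above can be illustrated numerically. The sketch below is a hedged illustration, not code from the paper: the function name and the choice of an exponential test distribution are my own. For the sample maximum, deleting any observation other than the largest leaves the maximum unchanged, so the jackknife estimate collapses to the squared spacing between the two largest order statistics, and by the Efron-Stein inequality its expectation bounds the variance of the maximum.

```python
import random
import statistics

def jackknife_variance_estimate(sample):
    # Efron-Stein (jackknife) variance estimate for the sample maximum.
    # Removing any observation other than the largest leaves the maximum
    # unchanged, so the estimate reduces to the squared spacing between
    # the two largest order statistics.
    s = sorted(sample)
    return (s[-1] - s[-2]) ** 2

random.seed(2012)
n, trials = 100, 2000
maxima, estimates = [], []
for _ in range(trials):
    sample = [random.expovariate(1.0) for _ in range(n)]
    maxima.append(max(sample))
    estimates.append(jackknife_variance_estimate(sample))

# Efron-Stein: Var(max) is bounded by the expected jackknife estimate.
print(round(statistics.variance(maxima), 2))   # roughly sum_{i<=n} 1/i^2
print(round(statistics.mean(estimates), 2))    # roughly E[Exp(1)^2] = 2
```

For standard exponential samples the bound is easy to check by hand: the top spacing is itself a standard exponential, so the expected jackknife estimate is 2, while the variance of the maximum is the smaller quantity \(\sum_{i=1}^{n} 1/i^2\).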

Article information

Electron. Commun. Probab., Volume 17 (2012), paper no. 51, 12 pp.

Accepted: 1 November 2012
First available in Project Euclid: 7 June 2016

Primary: 60E15: Inequalities; stochastic orderings.
Secondary: 60F10: Large deviations. 60G70: Extreme value theory; extremal processes. 62G30: Order statistics; empirical distribution functions. 62G32: Statistics of extreme values; tail inference.

Keywords: concentration inequalities; entropy method; order statistics.

This work is licensed under a Creative Commons Attribution 3.0 License.


Boucheron, Stéphane; Thomas, Maud. Concentration inequalities for order statistics. Electron. Commun. Probab. 17 (2012), paper no. 51, 12 pp. doi:10.1214/ECP.v17-2210. https://projecteuclid.org/euclid.ecp/1465263184


  • Boucheron, Stéphane; Bousquet, Olivier; Lugosi, Gábor. Concentration inequalities. Machine Learning Summer School 2003, 169–207, Lecture Notes in Artificial Intelligence, 3176. Springer, Berlin, 2004.
  • Boucheron, Stéphane; Lugosi, Gábor; Massart, Pascal. Concentration inequalities using the entropy method. Ann. Probab. 31 (2003), no. 3, 1583–1614.
  • de Haan, Laurens; Ferreira, Ana. Extreme value theory. An introduction. Springer Series in Operations Research and Financial Engineering. Springer, New York, 2006. xviii+417 pp. ISBN: 978-0-387-23946-0; 0-387-23946-4
  • Efron, B.; Stein, C. The jackknife estimate of variance. Ann. Statist. 9 (1981), no. 3, 586–596.
  • Ledoux, Michel. The concentration of measure phenomenon. Mathematical Surveys and Monographs, 89. American Mathematical Society, Providence, RI, 2001. x+181 pp. ISBN: 0-8218-2864-9
  • Ledoux, Michel. A remark on hypercontractivity and tail inequalities for the largest eigenvalues of random matrices. Séminaire de Probabilités XXXVII, 360–369, Lecture Notes in Math., 1832, Springer, Berlin, 2003.
  • Massart, Pascal. About the constants in Talagrand's concentration inequalities for empirical processes. Ann. Probab. 28 (2000), no. 2, 863–884.
  • Massart, Pascal. Concentration inequalities and model selection. Lectures from the 33rd Summer School on Probability Theory held in Saint-Flour, July 6–23, 2003. With a foreword by Jean Picard. Lecture Notes in Mathematics, 1896. Springer, Berlin, 2007. xiv+337 pp. ISBN: 978-3-540-48497-4; 3-540-48497-3
  • Miller, Rupert G. The jackknife—a review. Biometrika 61 (1974), 1–15.
  • Shao, Jun; Wu, C. F. J. A general theory for jackknife variance estimation. Ann. Statist. 17 (1989), no. 3, 1176–1197.
  • Tillich, Jean-Pierre; Zémor, Gilles. Discrete isoperimetric inequalities and the probability of a decoding error. Combin. Probab. Comput. 9 (2000), no. 5, 465–479.
  • van der Vaart, A. W. Asymptotic statistics. Cambridge Series in Statistical and Probabilistic Mathematics, 3. Cambridge University Press, Cambridge, 1998. xvi+443 pp. ISBN: 0-521-49603-9; 0-521-78450-6