Open Access
An improved global risk bound in concave regression
Sabyasachi Chatterjee
Electron. J. Statist. 10(1): 1608-1629 (2016). DOI: 10.1214/16-EJS1151

Abstract

A new risk bound is presented for the problem of convex/concave function estimation using the least squares estimator. The best known risk bound, due to Guntuboyina and Sen [8], scaled like $\log(en)\:n^{-4/5}$ under the mean squared error loss, up to a constant factor. The authors of [8] conjectured that the logarithmic term may be an artifact of their proof. We show that the logarithmic term is indeed unnecessary and prove a risk bound that scales like $n^{-4/5}$ up to constant factors. Our proof technique involves one extra peeling step beyond a usual chaining-type argument. Our risk bound holds in expectation as well as with high probability, and it extends to the case of model misspecification, where the true function may not be concave.
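For intuition on the object studied here, the concave least squares estimator at design points $x_1 < \dots < x_n$ is the Euclidean projection of the data vector onto the cone of concave sequences, i.e. sequences whose second differences are nonpositive. The following is a minimal sketch of that projection using SciPy's general-purpose SLSQP solver (not the paper's method, and not how one would compute the LSE efficiently in practice; `concave_lse` is a hypothetical helper name):

```python
import numpy as np
from scipy.optimize import minimize

def concave_lse(y):
    """Least squares projection of y onto the cone of concave sequences:
    minimize ||y - theta||^2 subject to
    theta[i+1] - 2*theta[i] + theta[i-1] <= 0 for all interior i."""
    n = len(y)
    # SLSQP expects inequality constraints of the form fun(t) >= 0,
    # so we negate each second difference.
    cons = [{"type": "ineq",
             "fun": lambda t, i=i: -(t[i + 1] - 2 * t[i] + t[i - 1])}
            for i in range(1, n - 1)]
    res = minimize(lambda t: np.sum((y - t) ** 2), x0=y, constraints=cons)
    return res.x

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
f = -(x - 0.5) ** 2                      # a true concave function
y = f + 0.1 * rng.standard_normal(50)    # noisy observations
theta = concave_lse(y)

# The fit satisfies the concavity constraints up to solver tolerance,
# and, since f lies in the cone, projection cannot increase the
# distance to f: mean((theta - f)^2) <= mean((y - f)^2).
d2 = theta[2:] - 2 * theta[1:-1] + theta[:-2]
```

On equally spaced designs the $n^{-4/5}$ rate from the abstract describes how the mean squared error of `theta` against `f` shrinks as `n` grows; the sketch above only illustrates the estimator itself, not the rate.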

Citation


Sabyasachi Chatterjee. "An improved global risk bound in concave regression." Electron. J. Statist. 10 (1) 1608 - 1629, 2016. https://doi.org/10.1214/16-EJS1151

Information

Received: 1 May 2016; Published: 2016
First available in Project Euclid: 18 July 2016

zbMATH: 1349.62126
MathSciNet: MR3522655
Digital Object Identifier: 10.1214/16-EJS1151

Subjects:
Primary: 60K35
Secondary: 60K35

Rights: Copyright © 2016 The Institute of Mathematical Statistics and the Bernoulli Society
