Open Access
Errors-in-variables models with dependent measurements
Mark Rudelson, Shuheng Zhou
Electron. J. Statist. 11(1): 1699-1797 (2017). DOI: 10.1214/17-EJS1234

Abstract

Suppose that we observe $y\in\mathbb{R}^{n}$ and $X\in\mathbb{R}^{n\times m}$ in the following errors-in-variables model: \begin{eqnarray*}y&=&X_{0}\beta^{*}+\epsilon\\X&=&X_{0}+W\end{eqnarray*} where $X_{0}$ is an $n\times m$ design matrix with independent subgaussian row vectors, $\epsilon\in\mathbb{R}^{n}$ is a noise vector and $W$ is a mean zero $n\times m$ random noise matrix with independent subgaussian column vectors, independent of $X_{0}$ and $\epsilon$. This model is significantly different from those analyzed in the literature in the sense that we allow the measurement error for each covariate to be a dependent vector across its $n$ observations. Such error structures appear in the science literature when modeling the trial-to-trial fluctuations in response strength shared across a set of neurons.
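To make the model concrete, the following minimal Python sketch (our illustration, not part of the paper) simulates one draw from this errors-in-variables model. It assumes Gaussian rows for $X_{0}$ and an AR(1)-type row covariance for $W$, so that the measurement error of each covariate is a dependent vector across the $n$ observations; all dimensions and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 200, 500, 5              # samples, covariates, sparsity (illustrative)

# Sparse regression vector beta*
beta_star = np.zeros(m)
beta_star[:s] = 1.0

# Clean design X0: independent subgaussian rows (Gaussian here for simplicity)
X0 = rng.standard_normal((n, m))

# Noise matrix W: independent columns, but each column is a dependent vector
# across the n observations (AR(1)-type row covariance, purely illustrative)
rho, tau = 0.5, 0.3
B = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # n x n row covariance
L = np.linalg.cholesky(B)
W = tau * L @ rng.standard_normal((n, m))

# Observed response and corrupted design
eps = 0.5 * rng.standard_normal(n)
X = X0 + W
y = X0 @ beta_star + eps
```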

Under sparsity and restricted eigenvalue-type conditions, we show that one is able to recover a sparse vector $\beta^{*}\in\mathbb{R}^{m}$ from the model given a single observation matrix $X$ and the response vector $y$. We establish consistency in estimating $\beta^{*}$ and obtain the rates of convergence in the $\ell_{q}$ norm, where $q=1,2$ for the Lasso-type estimator, and $q\in [1,2]$ for a Dantzig-type conic programming estimator. We show error bounds which approach those of the regular Lasso and the Dantzig selector as the errors in $W$ tend to 0. We analyze the convergence rates of the gradient descent methods for solving the nonconvex programs and show that the composite gradient descent algorithm is guaranteed to converge at a geometric rate to a neighborhood of the global minimizers: the size of the neighborhood is bounded by the statistical error in the $\ell_{2}$ norm. Our analysis reveals interesting connections between computational and statistical efficiency and the concentration of measure phenomenon in random matrix theory. We provide simulation evidence illuminating the theoretical predictions.
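To connect the statistical and computational statements above, here is a hedged sketch of a corrected-Lasso-type estimator solved by composite gradient descent. It is our illustration, not the authors' code: the surrogate Gram matrix $\hat{\Gamma}=X^{\top}X/n-\hat{\sigma}_{W}^{2}I$ assumes the averaged measurement-error covariance is a known multiple of the identity, and the function name `corrected_lasso_cgd`, the step size, the penalty level `lam`, the $\ell_{1}$-ball radius, and the iteration count are all illustrative choices.

```python
import numpy as np

def corrected_lasso_cgd(X, y, sigma_w2, lam, radius, n_iter=500):
    """Composite gradient descent for a corrected-Lasso-type objective
    (illustrative sketch, not the authors' code):

        min_b  0.5 * b' Gamma_hat b - gamma_hat' b + lam * ||b||_1
        subject to ||b||_1 <= radius,

    with Gamma_hat = X'X/n - sigma_w2 * I, a surrogate Gram matrix that
    subtracts the bias induced by additive measurement error (assumed here
    to have a known, isotropic averaged covariance sigma_w2 * I).  Since
    Gamma_hat may be indefinite, an l1-ball constraint keeps the iterates
    bounded; the projection below is a simple rescaling stand-in for the
    exact projection step.
    """
    n, m = X.shape
    Gamma = X.T @ X / n - sigma_w2 * np.eye(m)        # bias-corrected Gram matrix
    gamma = X.T @ y / n
    step = 1.0 / (np.linalg.norm(Gamma, 2) + 1e-12)   # conservative step size
    b = np.zeros(m)
    for _ in range(n_iter):
        grad = Gamma @ b - gamma                      # gradient of the smooth part
        z = b - step * grad
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
        l1 = np.abs(b).sum()
        if l1 > radius:                               # heuristic l1-ball projection
            b *= radius / l1
    return b
```

With `X`, `y`, `tau`, and `beta_star` as in the simulation sketch above, a call such as `corrected_lasso_cgd(X, y, sigma_w2=tau**2, lam=0.1, radius=2 * np.abs(beta_star).sum())` produces an estimate whose support and $\ell_{2}$ error can be compared against `beta_star`; here `tau**2` plays the role of the averaged per-covariate noise variance, again an illustrative assumption.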

Citation


Mark Rudelson, Shuheng Zhou. "Errors-in-variables models with dependent measurements." Electron. J. Statist. 11(1): 1699-1797, 2017. https://doi.org/10.1214/17-EJS1234

Information

Received: 1 December 2015; Published: 2017
First available in Project Euclid: 25 April 2017

zbMATH: 1364.62179
MathSciNet: MR3639561
Digital Object Identifier: 10.1214/17-EJS1234

Subjects:
Primary: 60K35

Keywords: Errors-in-variables models, matrix variate distributions, measurement error data, nonconvexity, subgaussian concentration
