## The Annals of Statistics

### On almost Linearity of Low Dimensional Projections from High Dimensional Data

#### Abstract

This paper studies the shapes of low-dimensional projections from high-dimensional data. After standardization, let $\mathbf{x}$ be a $p$-dimensional random variable with mean zero and identity covariance. For a projection $\beta'\mathbf{x}$, $\|\beta\| = 1$, one seeks another direction $b$ so that the regression curve of $b'\mathbf{x}$ against $\beta'\mathbf{x}$ is as nonlinear as possible. We show that when the dimension of $\mathbf{x}$ is large, for most directions $\beta$, even the most nonlinear regression is still nearly linear. Our method depends on the construction of a pair of $p$-dimensional random variables, $\mathbf{w}_1, \mathbf{w}_2$, called the rotational twin, and its density function with respect to the standard normal density. With this, we are able to obtain closed-form expressions for measuring deviation from normality and deviation from linearity in a suitable sense of average. As an interesting by-product, from a given set of data we can find simple unbiased estimates of $E(f_{\beta'\mathbf{x}}(t)/\phi_1(t) - 1)^2$ and $E\lbrack (\|E(\mathbf{x} \mid \beta, \beta'\mathbf{x} = t)\|^2 - t^2)f^2_{\beta'\mathbf{x}}(t)/\phi^2_1(t)\rbrack$, where $\phi_1$ is the standard normal density, $f_{\beta'\mathbf{x}}$ is the density of $\beta'\mathbf{x}$, and the expectation $E$ is taken with respect to the uniformly distributed $\beta$. This is achieved without any smoothing and without resorting to laborious projection procedures such as grand tours. Our result is related to the work of Diaconis and Freedman. The impact of our result on several fronts of data analysis is discussed. For example, it helps establish the validity of regression analysis when the link function of the regression model may be grossly wrong.
A further generalization, which replaces $\beta'\mathbf{x}$ by $B'\mathbf{x}$ with $B = (\beta_1,\ldots, \beta_k)$ for $k$ randomly selected orthonormal vectors $(\beta_i, i = 1,\ldots, k)$, helps broaden the scope of application of sliced inverse regression (SIR).
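The near-linearity phenomenon described above can be illustrated numerically. The sketch below is a hypothetical simulation, not the paper's rotational-twin construction: the sample size, dimension, non-Gaussian distribution, and binning choice are all our own assumptions. It draws markedly non-Gaussian standardized data, projects onto a random unit direction $\beta$ and an orthogonal direction $b$, and checks that the binned regression curve of $b'\mathbf{x}$ on $\beta'\mathbf{x}$ stays close to flat, as the theorem predicts for most random $\beta$ in high dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (our own choices, for illustration only):
# n points of a decidedly non-Gaussian p-dimensional x, with each
# coordinate a centered exponential, so mean 0 and identity covariance.
n, p = 5000, 200
x = rng.exponential(size=(n, p)) - 1.0

# Random unit direction beta, and a direction b orthogonal to it.
beta = rng.normal(size=p)
beta /= np.linalg.norm(beta)
b = rng.normal(size=p)
b -= (b @ beta) * beta
b /= np.linalg.norm(b)

t = x @ beta   # projection beta'x
y = x @ b      # projection b'x

# Crude regression curve: split t into 20 quantile bins and average y
# within each bin (a rough stand-in for a smoothed regression curve).
edges = np.quantile(t, np.linspace(0, 1, 21))
idx = np.clip(np.digitize(t, edges) - 1, 0, 19)
curve = np.array([y[idx == j].mean() for j in range(20)])

# Under exact linearity the regression of b'x on beta'x would be the
# zero line here (b is orthogonal to beta), so the binned means should
# hover near zero despite the heavily skewed coordinates.
print(np.max(np.abs(curve)))
```

With these settings the printed maximum deviation is small relative to the unit standard deviation of $b'\mathbf{x}$, which is the qualitative content of the result: for a typical random $\beta$, even a deliberately non-Gaussian $\mathbf{x}$ yields a nearly linear (here, nearly flat) regression curve.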

#### Article information

Source
Ann. Statist., Volume 21, Number 2 (1993), 867-889.

Dates
First available in Project Euclid: 12 April 2007

https://projecteuclid.org/euclid.aos/1176349155

Digital Object Identifier
doi:10.1214/aos/1176349155

Mathematical Reviews number (MathSciNet)
MR1232523

Zentralblatt MATH identifier
0782.62065
