Open Access
A Decomposition Theorem for Vector Variables with a Linear Structure
C. Radhakrishna Rao
Ann. Math. Statist. 40(5): 1845-1849 (October, 1969). DOI: 10.1214/aoms/1177697400


A vector variable $\mathbf{X}$ is said to have a linear structure if it can be written as $\mathbf{X} = \mathbf{AY}$, where $\mathbf{A}$ is a matrix and $\mathbf{Y}$ is a vector of independent random variables called structural variables. Earlier papers studied the conditions under which a vector random variable admits different structural representations. It was shown, among other results, that complete non-uniqueness, in a suitable sense, of the linear structure characterizes a multivariate normal variable. In the present paper we prove a general decomposition theorem which states that any vector variable $\mathbf{X}$ with a linear structure can be expressed as the sum $(\mathbf{X}_1 + \mathbf{X}_2)$ of two independent vector variables $\mathbf{X}_1, \mathbf{X}_2$, of which $\mathbf{X}_1$ is non-normal and has a unique linear structure, and $\mathbf{X}_2$ is a multivariate normal variable with a non-unique linear structure.
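To make the decomposition concrete, here is a minimal numerical sketch (not from the paper): the structural variables $\mathbf{Y}$ are split into non-normal and normal components, and partitioning the columns of $\mathbf{A}$ accordingly yields $\mathbf{X}_1$ (the non-normal part) and $\mathbf{X}_2$ (the normal part), which are independent because the components of $\mathbf{Y}$ are. The specific matrix and component distributions below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # sample size, chosen for illustration

# Structural variables: two non-normal (exponential) components
# and one normal component, all mutually independent.
Y_nonnormal = rng.exponential(size=(2, n))
Y_normal = rng.normal(size=(1, n))
Y = np.vstack([Y_nonnormal, Y_normal])

# An arbitrary structure matrix A (illustrative values).
A = np.array([[1.0, 0.5,  2.0],
              [0.0, 1.0, -1.0]])

X = A @ Y  # X has the linear structure X = A Y

# Partition A's columns by which structural variables they multiply:
# X1 collects the non-normal components, X2 the normal one.
X1 = A[:, :2] @ Y_nonnormal   # non-normal part
X2 = A[:, 2:] @ Y_normal      # multivariate normal part

# The decomposition X = X1 + X2 holds exactly, and X1, X2 are
# independent since they are built from disjoint components of Y.
assert np.allclose(X, X1 + X2)
```

In the theorem the decomposition is of the distribution of $\mathbf{X}$, not a sample-path identity; the sketch above simply realizes one such split when the structural representation is given explicitly.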


Citation

C. Radhakrishna Rao. "A Decomposition Theorem for Vector Variables with a Linear Structure." Ann. Math. Statist. 40 (5): 1845–1849, October 1969.


Published: October, 1969
First available in Project Euclid: 27 April 2007

zbMATH: 0183.48703
MathSciNet: MR251860
Digital Object Identifier: 10.1214/aoms/1177697400

Rights: Copyright © 1969 Institute of Mathematical Statistics
