Abstract
We study a problem of estimation of a Hermitian nonnegatively definite matrix $\rho$ of unit trace (e.g., a density matrix of a quantum system) based on $n$ i.i.d. measurements $(X_1, Y_1), \dots, (X_n, Y_n)$, where
$$Y_j = \operatorname{tr}(\rho X_j) + \xi_j, \qquad j = 1, \dots, n,$$
$\{X_j\}$ being random i.i.d. Hermitian matrices and $\{\xi_j\}$ being i.i.d. random variables with $\mathbb{E}(\xi_j \mid X_j) = 0$. The estimator $$\hat{\rho}^{\varepsilon} := \mathop{\arg\min}_{S \in \mathcal{S}} \Biggl[ n^{-1} \sum_{j=1}^{n} \bigl( Y_j - \operatorname{tr}(S X_j) \bigr)^2 + \varepsilon \operatorname{tr}(S \log S) \Biggr]$$ is considered, where $\mathcal{S}$ is the set of all nonnegatively definite Hermitian $m \times m$ matrices of trace $1$. The goal is to derive oracle inequalities showing how the estimation error depends on the accuracy of approximation of the unknown state $\rho$ by low-rank matrices.
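For illustration only, here is a minimal NumPy/SciPy sketch that approximates $\hat{\rho}^{\varepsilon}$ by an exponentiated-gradient (mirror descent) iteration over $\mathcal{S}$. The paper does not prescribe an optimization algorithm; the function name `fit_entropy_penalized`, the step size `eta`, and the iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm, logm


def fit_entropy_penalized(X, Y, eps=0.1, eta=0.1, n_iter=500):
    """Sketch of the entropy-penalized least-squares estimator over the set
    of m x m density matrices (Hermitian, nonnegatively definite, trace one).

    X   : (n, m, m) array of Hermitian measurement matrices X_j
    Y   : (n,) array of responses Y_j
    eps : entropy penalty parameter (epsilon in the abstract)
    """
    n, m, _ = X.shape
    S = np.eye(m, dtype=complex) / m  # start at the maximally mixed state
    for _ in range(n_iter):
        # residuals of the trace-regression fit tr(S X_j)
        fits = np.einsum("jab,ba->j", X, S).real
        resid = fits - Y
        # gradient of n^{-1} sum_j (Y_j - tr(S X_j))^2 + eps * tr(S log S)
        grad = (2.0 / n) * np.einsum("j,jab->ab", resid, X) \
               + eps * (logm(S) + np.eye(m))
        grad = (grad + grad.conj().T) / 2  # enforce Hermitian symmetry numerically
        # exponentiated-gradient step: exp(log S - eta * grad), renormalized,
        # which keeps the iterate positive definite and of unit trace
        L = logm(S) - eta * grad
        S = expm((L + L.conj().T) / 2)
        S = (S + S.conj().T) / 2
        S /= np.trace(S).real
    return S
```

The multiplicative (entropy-geometry) update is used here only because it keeps every iterate inside $\mathcal{S}$ automatically; any other solver for this convex problem would serve equally well as an illustration.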
Citation
Vladimir Koltchinskii. "Von Neumann entropy penalization and low-rank matrix estimation." Ann. Statist. 39 (6), 2936–2973, December 2011. https://doi.org/10.1214/11-AOS926