Finding anonymization mechanisms to protect personal data is at the heart of recent machine learning research. Here, we consider the consequences of local differential privacy constraints on goodness-of-fit testing, that is, the statistical problem of assessing whether sample points are generated from a fixed density f_0 or not. The observations are kept hidden and replaced by a stochastic transformation satisfying the local differential privacy constraint. In this setting, we propose a testing procedure based on an estimation of the quadratic distance between the density f of the unobserved samples and f_0. We establish an upper bound on the separation distance associated with this test, and a matching lower bound on the minimax separation rates of testing under non-interactive privacy in the case where f_0 is uniform, in both discrete and continuous settings. To the best of our knowledge, we provide the first minimax optimal test and associated private transformation under a local differential privacy constraint over Besov balls in the continuous setting, quantifying the price to pay for data privacy. We also present a test that is adaptive to the smoothness parameter of the unknown density and remains minimax optimal up to a logarithmic factor. Finally, we note that our results can be translated to the discrete case, where the treatment of probability vectors is shown to be equivalent to that of piecewise constant densities in our setting. For this reason, we work within a unified framework covering both the continuous and discrete cases.
The authors would like to thank the anonymous referee, the Associate Editor and the Editor for their constructive comments that improved the quality of this paper.
B. Laurent and J-M. Loubes gratefully acknowledge funding from ANITI ANR-19-PI3A-0004.
"Minimax optimal goodness-of-fit testing for densities and multinomials under a local differential privacy constraint." Bernoulli 28 (1) 579 - 600, February 2022. https://doi.org/10.3150/21-BEJ1358