Abstract
We propose a general optimization-based framework for computing differentially private M-estimators and a new method for constructing differentially private confidence regions. First, we show that robust statistics can be combined with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a small neighborhood of their nonprivate counterparts. Second, we tackle parametric inference by constructing differentially private estimators of the asymptotic variance of our private M-estimators. This naturally leads to approximate pivotal statistics for constructing confidence regions and conducting hypothesis tests. We demonstrate in simulations the effectiveness of a bias correction that improves small-sample empirical performance, and we illustrate the benefits of our methods in several numerical examples.
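To make the first contribution concrete, the sketch below shows the kind of noisy gradient descent iteration the abstract refers to: a robust (bounded) score keeps the gradient's L2 sensitivity controlled, so each update can be privatized with Gaussian noise. This is a minimal illustration under stated assumptions, not the paper's exact algorithm; the function names `noisy_gradient_descent` and `huber_psi` are ours, the score is assumed bounded by `B` (so the averaged gradient has sensitivity `2*B/n`), and calibrating `sigma` to a target (epsilon, delta) budget over the `T` iterations is omitted.

```python
import numpy as np

def noisy_gradient_descent(X, y, psi, theta0, eta, T, B, sigma, rng=None):
    """Sketch of a differentially private M-estimator via noisy GD.

    psi(X, y, theta) returns per-observation score vectors assumed to
    have L2 norm bounded by B (e.g., a Huberized score), so the averaged
    gradient has L2 sensitivity 2*B/n. Adding Gaussian noise of scale
    sigma * 2*B/n to each update is an instance of the Gaussian
    mechanism; choosing sigma for a given privacy budget is not shown.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    theta = theta0.copy()
    for _ in range(T):
        grad = psi(X, y, theta).mean(axis=0)           # averaged bounded score
        noise = rng.normal(0.0, sigma * 2 * B / n, size=theta.shape)
        theta = theta - eta * (grad + noise)           # privatized update
    return theta

def huber_psi(X, y, theta, c=1.345):
    """Illustrative robust score for linear regression (hypothetical helper).

    Residuals are clipped at +/- c; boundedness of the full score also
    requires bounded covariates, which we assume here for simplicity.
    """
    r = y - X @ theta
    w = np.clip(r, -c, c)                              # Huber score on residuals
    return -(w[:, None] * X)                           # per-observation gradients
```

A noisy Newton variant would additionally perturb a privatized Hessian estimate at each step, trading extra privacy cost per iteration for the quadratic convergence mentioned above.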
Acknowledgments
The authors thank the Associate Editor and referees for suggestions which improved the quality of this manuscript.
Citation
Marco Avella-Medina, Casey Bradshaw, Po-Ling Loh. "Differentially private inference via noisy optimization." Ann. Statist. 51(5): 2067–2092, October 2023. https://doi.org/10.1214/23-AOS2321