Abstract
In this paper we observe a set, possibly a continuum, of signals corrupted by noise. Each signal is a finite mixture of an unknown number of features belonging to a continuous dictionary. The continuous dictionary is parametrized by a real non-linear parameter. We assume that the signals share an underlying structure, namely that the active features of each signal are included in a common finite and sparse set. We formulate a regularized optimization problem to estimate simultaneously the linear coefficients in the mixtures and the non-linear parameters of the features. The optimization problem is composed of a data fidelity term and an (ℓ1, L^q)-penalty. We call its solution the Group-Nonlinear-Lasso and provide high-probability bounds on the prediction error using certificate functions. Following recent works on the geometry of off-the-grid methods, we show that such functions can be constructed provided the parameters of the active features are pairwise separated by a constant with respect to a Riemannian metric. When the number of signals is finite and the noise is assumed Gaussian, we give refinements of our results for q = 1 and q = 2 using tail bounds on suprema of Gaussian and χ² random processes. When q = 2, our prediction error reaches the rates obtained by the Group-Lasso estimator in the multi-task linear regression model. Furthermore, for q = 2 these prediction rates are faster than for q = 1 when all signals share most of the non-linear parameters.
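For orientation, a schematic form of the optimization problem described above is sketched below. The normalizations and the symbols used here (y_t for the t-th of T observed signals, φ(θ) for a dictionary feature with non-linear parameter θ, b_{k,t} for the linear coefficients, κ for the tuning parameter) are illustrative assumptions and not necessarily the paper's exact notation:

\[
\min_{\theta_1,\dots,\theta_K,\;(b_{k,t})}\;
\frac{1}{2T}\sum_{t=1}^{T}\Big\lVert y_t-\sum_{k=1}^{K} b_{k,t}\,\phi(\theta_k)\Big\rVert^{2}
\;+\;\kappa\sum_{k=1}^{K}\Big(\frac{1}{T}\sum_{t=1}^{T}\lvert b_{k,t}\rvert^{q}\Big)^{1/q},
\qquad q\in\{1,2\}.
\]

The first term is the data fidelity term averaged over the signals; the second is the (ℓ1, L^q)-penalty, which couples, for each feature θ_k, its coefficients across all signals and thereby promotes a small common set of active features.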
Funding Statement
This work was partially supported by the ANRT grant N°2019/1260 and the grant Investissements d’Avenir (ANR11-IDEX0003/Labex Ecodec/ANR-11-LABX-00).
Acknowledgements
The authors are grateful to the Associate Editor and the referees for their useful comments.
Citation
Cristina Butucea, Jean-François Delmas, Anne Dutfoy, Clément Hardy. "Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary." Bernoulli 31(1): 187–212, February 2025. https://doi.org/10.3150/24-BEJ1724