We revisit the problem of designing an efficient binary classifier in a challenging high-dimensional framework. The model under study assumes a local dependence structure among the feature variables, represented by a block-diagonal covariance matrix with a growing number of blocks of arbitrary but fixed size. The blocks correspond to non-overlapping, independent groups of strongly correlated features. To assess the relevance of a particular block in predicting the response, we introduce a measure of “signal strength” for each feature block. This measure is then used to specify the sparse model of interest. We further propose a threshold-based feature selector that operates as a screen-and-clean scheme integrated into a linear classifier: the data are first screened and then cleaned by hard thresholding to filter out the blocks containing no signal. Asymptotic properties of the proposed classifiers are studied in the regime where the sample size n depends on the number of feature blocks b and grows to infinity at a slower rate than b. The new classifiers, which are fully adaptive to the unknown parameters of the model, are shown to perform asymptotically optimally in a large part of the classification region. A numerical study confirms the good analytical properties of the new classifiers, which compare favorably to an existing threshold-based procedure used in a similar context.
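The screen-and-clean idea described above can be illustrated with a minimal sketch. This is an illustrative simplification, not the paper's exact procedure: the per-block “signal strength” proxy (squared norm of the between-class mean difference within a block) and the threshold value used here are hypothetical stand-ins for the quantities defined in the paper.

```python
import numpy as np

def screen_and_clean_blocks(mean_diff, block_size, threshold):
    """Illustrative sketch: keep feature blocks whose estimated signal
    strength (here, the squared norm of the within-block mean difference,
    a hypothetical proxy) exceeds a hard threshold; zero out the rest."""
    b = mean_diff.size // block_size
    blocks = mean_diff.reshape(b, block_size)
    strength = np.sum(blocks**2, axis=1)   # per-block signal-strength proxy
    keep = strength > threshold            # screening + hard-threshold cleaning
    cleaned = blocks * keep[:, None]       # blocks below threshold set to zero
    return cleaned.ravel(), keep

# Toy usage: 5 blocks of size 4; only the first block carries a signal.
rng = np.random.default_rng(0)
diff = rng.normal(0.0, 0.1, size=20)
diff[:4] += 2.0
cleaned, keep = screen_and_clean_blocks(diff, block_size=4, threshold=1.0)
```

In this toy run, only the first block survives the hard threshold; the linear classifier would then be built from the retained blocks alone.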
T. Pavlenko was supported in part by a grant AI4Research from Uppsala University. N. Stepanova was supported by an NSERC grant. L. Thompson was supported in part by an NSERC grant.
The authors are grateful to the Editor, Domenico Marinucci, and an anonymous referee for many helpful comments on this work.
T. Pavlenko, N. Stepanova, L. Thompson. "Adaptive threshold-based classification of sparse high-dimensional data." Electron. J. Statist. 16(1): 1952–1996, 2022. https://doi.org/10.1214/22-EJS1998