Open Access
February 2021
Transfer learning for nonparametric classification: Minimax rate and adaptive classifier
T. Tony Cai, Hongji Wei
Ann. Statist. 49(1): 100-128 (February 2021). DOI: 10.1214/20-AOS1949

Abstract

Human learners have the natural ability to use knowledge gained in one setting for learning in a different but related setting. This ability to transfer knowledge from one task to another is essential for effective learning. In this paper, we study transfer learning in the context of nonparametric classification based on observations from different distributions under the posterior drift model, a general framework that arises in many practical problems.
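
To make the posterior drift setting concrete, the following minimal simulation sketch (in Python) draws source and target samples that share the same covariate distribution but have different regression functions. The specific functions eta_P and eta_Q, the exponent gamma, and the sample sizes are illustrative assumptions, not the construction used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def eta_Q(x):
    # Target regression function P(Y = 1 | X = x) under Q (illustrative choice).
    return 0.5 + 0.4 * np.sin(2 * np.pi * x)

def eta_P(x, gamma=0.6):
    # Source regression function: shares the sign of eta_Q(x) - 1/2 but carries
    # a (here stronger) signal, mimicking posterior drift between P and Q.
    s = eta_Q(x) - 0.5
    return 0.5 + np.sign(s) * np.abs(2 * s) ** gamma / 2

def sample(n, eta):
    # Covariates from the common marginal (Uniform[0, 1]); labels drawn from eta.
    x = rng.uniform(0, 1, n)
    y = (rng.uniform(0, 1, n) < eta(x)).astype(int)
    return x.reshape(-1, 1), y

X_P, y_P = sample(500, eta_P)   # source sample
X_Q, y_Q = sample(100, eta_Q)   # target sample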

We first establish the minimax rate of convergence and construct a rate-optimal two-sample weighted $K$-NN classifier. The results characterize precisely the contribution of the observations from the source distribution to the classification task under the target distribution. A data-driven adaptive classifier is then proposed and is shown to attain the optimal rate, up to a logarithmic factor, simultaneously over a large collection of parameter spaces. Simulation studies and real data applications are carried out; the numerical results further illustrate the theoretical analysis. Extensions to the case of multiple source distributions are also considered.
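
The sketch below shows a two-sample weighted $K$-NN rule in the spirit described above: it takes nearest neighbours separately from the source and target samples and combines their labels in a weighted vote. The values of k_P, k_Q, w_P and w_Q are placeholders; the paper derives rate-optimal, data-dependent choices, which this sketch does not reproduce. It reuses the variables X_P, y_P, X_Q, y_Q from the simulation sketch above.

def two_sample_weighted_knn(x0, X_P, y_P, X_Q, y_Q,
                            k_P=20, k_Q=10, w_P=0.5, w_Q=1.0):
    """Classify a single point x0 by a weighted vote over k_P source and
    k_Q target nearest neighbours (illustrative weights, not the paper's)."""
    def knn_mean(X, y, k):
        # Average label of the k nearest neighbours of x0 within (X, y).
        d = np.linalg.norm(X - x0, axis=1)
        idx = np.argsort(d)[:k]
        return y[idx].mean()
    # Weighted count of positive labels versus half of the total weight.
    score = w_P * k_P * knn_mean(X_P, y_P, k_P) + w_Q * k_Q * knn_mean(X_Q, y_Q, k_Q)
    threshold = 0.5 * (w_P * k_P + w_Q * k_Q)
    return int(score > threshold)

# Example: classify a new target point using both samples.
x_new = np.array([0.3])
label = two_sample_weighted_knn(x_new, X_P, y_P, X_Q, y_Q)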

Citation

T. Tony Cai, Hongji Wei. "Transfer learning for nonparametric classification: Minimax rate and adaptive classifier." Ann. Statist. 49(1): 100-128, February 2021. https://doi.org/10.1214/20-AOS1949

Information

Received: 1 April 2019; Revised: 1 November 2019; Published: February 2021
First available in Project Euclid: 29 January 2021

Digital Object Identifier: 10.1214/20-AOS1949

Subjects:
Primary: 62F30
Secondary: 62B10, 62F12

Keywords: adaptivity, classification, domain adaptation, minimax rate, transfer learning

Rights: Copyright © 2021 Institute of Mathematical Statistics
