The Annals of Statistics

Functional additive regression

Yingying Fan, Gareth M. James, and Peter Radchenko


Abstract

We suggest a new method, called Functional Additive Regression, or FAR, for efficiently performing high-dimensional functional regression. FAR extends the usual linear regression model involving a functional predictor, $X(t)$, and a scalar response, $Y$, in two key respects. First, FAR uses a penalized least squares optimization approach to efficiently deal with high-dimensional problems involving a large number of functional predictors. Second, FAR extends beyond the standard linear regression setting to fit general nonlinear additive models. We demonstrate that FAR can be implemented with a wide range of penalty functions using a highly efficient coordinate descent algorithm. Theoretical results are developed which provide motivation for the FAR optimization criterion. Finally, we show through simulations and two real data sets that FAR can significantly outperform competing methods.
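The penalized least squares criterion solved by coordinate descent that the abstract describes can be illustrated with a toy sketch. The block coordinate descent below minimizes a group-lasso-penalized least squares objective — a simplified stand-in for the FAR criterion, not the authors' actual algorithm — after each functional predictor has been reduced to a small vector of basis scores. All names (`Z_blocks`, `block_coordinate_descent`, the choice of penalty) are illustrative assumptions, and the blocks are orthonormalized so each update has a closed form.

```python
import numpy as np

def block_coordinate_descent(Z_blocks, y, lam, n_iter=100):
    """Minimize (1/2n)||y - sum_j Z_j b_j||^2 + lam * sum_j ||b_j||_2.

    Each Z_j holds the basis scores of one functional predictor and is
    assumed column-orthonormalized (Z_j.T @ Z_j / n = I), so every block
    update reduces to a closed-form group soft-threshold.
    """
    n = len(y)
    betas = [np.zeros(Z.shape[1]) for Z in Z_blocks]
    resid = y - sum(Z @ b for Z, b in zip(Z_blocks, betas))
    for _ in range(n_iter):
        for j, Z in enumerate(Z_blocks):
            resid += Z @ betas[j]        # partial residual without block j
            g = Z.T @ resid / n          # block-wise least squares fit
            g_norm = np.linalg.norm(g)
            if g_norm <= lam:            # whole predictor shrunk to zero
                betas[j] = np.zeros(Z.shape[1])
            else:                        # group soft-thresholding
                betas[j] = (1.0 - lam / g_norm) * g
            resid -= Z @ betas[j]
    return betas

# Toy example: 5 "functional predictors", each summarized by 4 basis
# scores; only the first predictor actually drives the response.
rng = np.random.default_rng(0)
n, d = 200, 4
Z_blocks = []
for _ in range(5):
    Q, _ = np.linalg.qr(rng.standard_normal((n, d)))  # orthonormal scores
    Z_blocks.append(Q * np.sqrt(n))
beta_true = np.array([1.0, -1.0, 0.5, 0.0])
y = Z_blocks[0] @ beta_true + 0.1 * rng.standard_normal(n)

betas = block_coordinate_descent(Z_blocks, y, lam=0.5)
# With this penalty level the four inactive predictors are zeroed out,
# mirroring the variable-selection behavior the abstract describes.
```

The group penalty acts on whole coefficient blocks, so an entire functional predictor is either kept or dropped — which is the sense in which a sparsity-inducing penalty performs variable selection over many functional predictors.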

Article information

Source
Ann. Statist., Volume 43, Number 5 (2015), 2296-2325.

Dates
Received: November 2014
Revised: May 2015
First available in Project Euclid: 16 September 2015

Permanent link to this document
https://projecteuclid.org/euclid.aos/1442364153

Digital Object Identifier
doi:10.1214/15-AOS1346

Mathematical Reviews number (MathSciNet)
MR3396986

Zentralblatt MATH identifier
1327.62252

Subjects
Primary: 62G08: Nonparametric regression
Secondary: 62G20: Asymptotic properties

Keywords
Functional regression; shrinkage; single index model; variable selection

Citation

Fan, Yingying; James, Gareth M.; Radchenko, Peter. Functional additive regression. Ann. Statist. 43 (2015), no. 5, 2296--2325. doi:10.1214/15-AOS1346. https://projecteuclid.org/euclid.aos/1442364153


References

  • [1] Ait-Saïdi, A., Ferraty, F., Kassa, R. and Vieu, P. (2008). Cross-validated estimations in the single-functional index model. Statistics 42 475–494.
  • [2] Alter, O., Brown, P. O. and Botstein, D. (2000). Singular value decomposition for genome-wide expression data processing and modeling. Proc. Natl. Acad. Sci. USA 97 10101–10106.
  • [3] Amato, U., Antoniadis, A. and De Feis, I. (2006). Dimension reduction in functional regression with applications. Comput. Statist. Data Anal. 50 2422–2446.
  • [4] Bongiorno, E., Goia, A. and Salinelli, E. (2014). Contributions in Infinite-Dimensional Statistics and Related Topics. Società Editrice Esculapio.
  • [5] Bühlmann, P. and van de Geer, S. (2011). Statistics for High-Dimensional Data. Methods, Theory and Applications. Springer, Heidelberg.
  • [6] Cardot, H., Ferraty, F. and Sarda, P. (2003). Spline estimators for the functional linear model. Statist. Sinica 13 571–591.
  • [7] Chen, D., Hall, P. and Müller, H.-G. (2011). Single and multiple index functional regression models with nonparametric link. Ann. Statist. 39 1720–1747.
  • [8] Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • [9] Fan, J. and Lv, J. (2011). Nonconcave penalized likelihood with NP-dimensionality. IEEE Trans. Inform. Theory 57 5467–5484.
  • [10] Fan, J. and Peng, H. (2004). Nonconcave penalized likelihood with a diverging number of parameters. Ann. Statist. 32 928–961.
  • [11] Fan, Y., Foutz, N., James, G. M. and Jank, W. (2014). Functional response additive model estimation with online virtual stock markets. Ann. Appl. Stat. 8 2435–2460.
  • [12] Fan, Y., James, G. M. and Radchenko, P. (2015). Supplement to “Functional additive regression.” DOI:10.1214/15-AOS1346SUPP.
  • [13] Febrero-Bande, M. and González-Manteiga, W. (2013). Generalized additive models for functional data. TEST 22 278–292.
  • [14] Ferraty, F., Goia, A., Salinelli, E. and Vieu, P. (2013). Functional projection pursuit regression. TEST 22 293–320.
  • [15] Ferraty, F. and Vieu, P. (2003). Curves discrimination: A nonparametric functional approach. Comput. Statist. Data Anal. 44 161–173.
  • [16] Ferraty, F. and Vieu, P. (2009). Additive prediction and boosting for functional data. Comput. Statist. Data Anal. 53 1400–1413.
  • [17] Goia, A. (2012). A functional linear model for time series prediction with exogenous variables. Statist. Probab. Lett. 82 1005–1011.
  • [18] Hall, P., Poskitt, D. S. and Presnell, B. (2001). A functional data-analytic approach to signal discrimination. Technometrics 43 1–9.
  • [19] Hall, P., Reimann, J. and Rice, J. (2000). Nonparametric estimation of a periodic function. Biometrika 87 545–557.
  • [20] Hastie, T. and Mallows, C. (1993). Comment on “A statistical view of some chemometrics regression tools.” Technometrics 35 140–143.
  • [21] Huang, J., Horowitz, J. L. and Wei, F. (2010). Variable selection in nonparametric additive models. Ann. Statist. 38 2282–2313.
  • [22] James, G. M. (2002). Generalized linear models with functional predictors. J. R. Stat. Soc. Ser. B. Stat. Methodol. 64 411–432.
  • [23] James, G. M. and Silverman, B. W. (2005). Functional adaptive model estimation. J. Amer. Statist. Assoc. 100 565–576.
  • [24] Li, K.-C. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 316–342.
  • [25] Lian, H. (2011). Functional partial linear model. J. Nonparametr. Stat. 23 115–128.
  • [26] Loh, P.-L. and Wainwright, M. J. (2015). Regularized $M$-estimators with nonconvexity: Statistical and algorithmic theory for local optima. J. Mach. Learn. Res. 16 559–616.
  • [27] Lv, J. and Fan, Y. (2009). A unified approach to model selection and sparse recovery using regularized least squares. Ann. Statist. 37 3498–3528.
  • [28] Mas, A. and Pumo, B. (2007). The ARHD model. J. Statist. Plann. Inference 137 538–553.
  • [29] Meier, L., van de Geer, S. and Bühlmann, P. (2009). High-dimensional additive modeling. Ann. Statist. 37 3779–3821.
  • [30] Müller, H.-G. and Stadtmüller, U. (2005). Generalized functional linear models. Ann. Statist. 33 774–805.
  • [31] Müller, H.-G. and Yao, F. (2008). Functional additive models. J. Amer. Statist. Assoc. 103 1534–1544.
  • [32] Ramsay, J. O. and Silverman, B. W. (2005). Functional Data Analysis, 2nd ed. Springer, New York.
  • [33] Ravikumar, P., Lafferty, J., Liu, H. and Wasserman, L. (2009). Sparse additive models. J. R. Stat. Soc. Ser. B. Stat. Methodol. 71 1009–1030.
  • [34] Schumaker, L. L. (2007). Spline Functions: Basic Theory, 3rd ed. Cambridge Univ. Press, Cambridge.
  • [35] Simon, N. and Tibshirani, R. (2012). Standardization and the group Lasso penalty. Statist. Sinica 22 983–1001.
  • [36] Storey, J. D., Xiao, W., Leek, J. T., Tompkins, R. G. and Davis, R. W. (2005). Significance analysis of time course microarray experiments. Proc. Natl. Acad. Sci. USA 102 12837–12842.
  • [37] Wasserman, L. (2006). All of Nonparametric Statistics. Springer, New York.
  • [38] Yu, Y. and Ruppert, D. (2002). Penalized spline estimation for partially linear single-index models. J. Amer. Statist. Assoc. 97 1042–1054.
  • [39] Zhou, S., Shen, X. and Wolfe, D. A. (1998). Local asymptotics for regression splines and confidence regions. Ann. Statist. 26 1760–1782.
  • [40] Zhu, H., Vannucci, M. and Cox, D. D. (2010). A Bayesian hierarchical model for classification with selection of functional predictors. Biometrics 66 463–473.
  • [41] Zou, H., Hastie, T. and Tibshirani, R. (2007). On the “degrees of freedom” of the lasso. Ann. Statist. 35 2173–2192.
  • [42] Zou, H. and Li, R. (2008). One-step sparse estimates in nonconcave penalized likelihood models. Ann. Statist. 36 1509–1533.

Supplemental materials

  • Supplementary material for: Functional additive regression. Due to space constraints, the proofs of Theorems 1 and 2 and Lemma 1 are relegated to the supplement [12].