Bayesian Analysis

Extrinsic Gaussian Processes for Regression and Classification on Manifolds

Lizhen Lin, Niu Mu, Pokman Cheung, and David Dunson

Full-text: Open access


Gaussian processes (GPs) are widely used to model unknown functions or surfaces in applications ranging from regression to classification to spatial processes. Although there is an increasingly vast literature on applications, methods, theory, and algorithms related to GPs, the overwhelming majority of this literature focuses on the case in which the input domain is a Euclidean space. However, particularly in recent years, with the increasing collection of complex data, the input domain often lacks such a simple form. For example, it is common for the inputs to be restricted to a non-Euclidean manifold, the case that motivates this article. In particular, we propose a general extrinsic framework for GP modeling on manifolds, which relies on embedding the manifold into a Euclidean space and then constructing extrinsic kernels for GPs on its image. These extrinsic Gaussian processes (eGPs) are used as prior distributions for unknown functions in Bayesian inference. Our approach is simple and general, and we show that eGPs inherit fine theoretical properties from GP models on Euclidean spaces. We consider applications of our models to regression and classification problems with predictors lying in a large class of manifolds, including spheres, planar shape spaces, a space of positive definite matrices, and Grassmannians. Our models can be readily used by practitioners in the biological sciences for various regression and classification problems, such as disease diagnosis or detection. Our work is also likely to have impact in spatial statistics when spatial locations lie on the sphere or other geometric spaces.

Article information

Bayesian Anal., Volume 14, Number 3 (2019), 887-906.

First available in Project Euclid: 11 June 2019


Keywords: extrinsic Gaussian process (eGP); manifold-valued predictors; neuro-imaging; regression on manifold

Creative Commons Attribution 4.0 International License.


Lin, Lizhen; Mu, Niu; Cheung, Pokman; Dunson, David. Extrinsic Gaussian Processes for Regression and Classification on Manifolds. Bayesian Anal. 14 (2019), no. 3, 887--906. doi:10.1214/18-BA1135.



References

  • Alexander, A., Lee, J. E., Lazar, M., and Field, A. S. (2007). “Diffusion Tensor Imaging of the Brain.” Neurotherapeutics, 4(3): 316–329.
  • Bartsch, T. (2012). The Clinical Neurobiology of the Hippocampus: An Integrative View. OUP Oxford.
  • Bhattacharya, A. and Bhattacharya, R. (2012). Nonparametric Inference on Manifolds: with Applications to Shape Spaces. Cambridge University Press, IMS Monographs 2.
  • Bhattacharya, A. and Dunson, D. (2010a). “Nonparametric Bayes regression and classification through mixtures of product kernels.” In Bayesian Statistics 9, 145–164. Oxford University Press.
  • Bhattacharya, A. and Dunson, D. B. (2010b). “Nonparametric Bayesian density estimation on manifolds with applications to planar shapes.” Biometrika, 97(4): 851–865.
  • Bhattacharya, R. and Lin, L. (2017). “Omnibus CLTs for Fréchet means and nonparametric inference on non-Euclidean spaces.” Proceedings of the American Mathematical Society, 145: 413–428.
  • Bhattacharya, R. and Patrangenaru, V. (2005). “Large sample theory of intrinsic and extrinsic sample means on manifolds. II.” The Annals of Statistics, 33(3): 1225–1259.
  • Bhattacharya, R. N. and Patrangenaru, V. (2003). “Large sample theory of intrinsic and extrinsic sample means on manifolds.” The Annals of Statistics, 31: 1–29.
  • Bookstein, F. (1978). The Measurement of Biological Shape and Shape Change. Lecture Notes in Biomathematics, Springer, Berlin.
  • Castillo, I., Kerkyacharian, G., and Picard, D. (2014). “Thomas Bayes’s walk on manifolds.” Probability Theory and Related Fields, 158(3–4): 665–710.
  • Cheng, M. and Wu, H. (2013). “Local Linear Regression on Manifolds and Its Geometric Interpretation.” Journal of the American Statistical Association, 108(504): 1421–1434.
  • Chikuse, Y. (2003). Statistics on Special Manifolds. Springer, New York.
  • Downs, T., Liebman, J., and Mackay, W. (1971). “Statistical methods for vectorcardiogram orientations.” In Vectorcardiography 2: Proc. XIth International Symposium on Vectorcardiography, 216–222. North-Holland, Amsterdam.
  • Dryden, I. L. and Mardia, K. V. (1998). Statistical Shape Analysis. Wiley, New York.
  • Du, J., Ma, C., and Li, Y. (2013). “Isotropic Variogram Matrix Functions on Spheres.” Mathematical Geosciences, 45(3): 341–357.
  • Duane, S., Kennedy, A. D., Pendleton, B. J., and Roweth, D. (1987). “Hybrid Monte Carlo.” Physics Letters B, 195(2): 216–222.
  • Gneiting, T. (2013). “Strictly and non-strictly positive definite functions on spheres.” Bernoulli, 19(4): 1327–1349.
  • Guinness, J. and Fuentes, M. (2016). “Isotropic covariance functions on spheres: Some properties and modeling considerations.” Journal of Multivariate Analysis, 143: 143–152.
  • Hitczenko, M. and Stein, M. (2012). “Some theory for anisotropic processes on the sphere.” Statistical Methodology, 9(1–2): 211–227. Special Issue on Astrostatistics + Special Issue on Spatial Statistics.
  • Ho, J., Lee, K.-C., Yang, M.-H., and Kriegman, D. (2004). “Visual tracking using learned linear subspaces.” In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), volume 1, 782–789.
  • Huang, C., Zhang, H., and Robeson, S. (2011). “On the Validity of Commonly Used Covariance and Variogram Functions on the Sphere.” Mathematical Geosciences, 43(6): 721–733.
  • Jun, M. and Stein, M. L. (2008). “Nonstationary covariance models for global data.” The Annals of Applied Statistics, 2(4): 1271–1289.
  • Kendall, D. G. (1977). “The diffusion of shape.” Advances in Applied Probability, 9: 428–430.
  • Kendall, D. G. (1984). “Shape Manifolds, Procrustean Metrics, and Complex Projective Spaces.” Bulletin of the London Mathematical Society, 16: 81–121.
  • Kolaczyk, E., Lin, L., Rosenberg, S., and Walters, J. (2017). “Averages of Unlabeled Networks: Geometric Characterization and Asymptotic Behavior.” ArXiv e-prints.
  • Kutyniok, G., Pezeshki, A., Calderbank, R., and Liu, T. (2009). “Robust dimension reduction, fusion frames, and Grassmannian packings.” Applied and Computational Harmonic Analysis, 26(1): 64–76.
  • Lin, L., Rao, V., and Dunson, D. B. (2017). “Bayesian nonparametric inference on the Stiefel manifold.” Statistica Sinica, 27: 535–553.
  • Lin, L., Thomas, B. S., Zhu, H., and Dunson, D. B. (2017). “Extrinsic Local Regression on Manifold-Valued Data.” Journal of the American Statistical Association, 112(519): 1261–1273.
  • Neal, R. M. (2012). Bayesian Learning for Neural Networks, volume 118. Springer Science & Business Media.
  • Pelletier, B. (2005). “Kernel density estimation on Riemannian manifolds.” Statistics and Probability Letters, 73(3): 297–304.
  • Rasmussen, C. E. (2004). “Gaussian Processes in Machine Learning.” Advanced Lectures on Machine Learning, 63–71.
  • Rasmussen, C. E. and Williams, C. K. I. (2005). Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning). The MIT Press.
  • St. Thomas, B., Lin, L., Lim, L.-H., and Mukherjee, S. (2014). “Learning subspaces of different dimension.” ArXiv e-prints, 1404.6841.
  • Teja, G. and Ravi, S. (2012). “Face recognition using subspaces techniques.” In 2012 International Conference on Recent Trends in Information Technology (ICRTIT), 103–107.
  • van der Vaart, A. W. and van Zanten, J. H. (2008). “Rates of contraction of posterior distributions based on Gaussian process priors.” The Annals of Statistics, 36(3): 1435–1463.
  • van der Vaart, A. W. and van Zanten, J. H. (2009). “Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth.” The Annals of Statistics, 37(5B): 2655–2675.
  • Williams, C. K. and Barber, D. (1998). “Bayesian classification with Gaussian processes.” IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(12): 1342–1351.
  • Williams, C. K. and Rasmussen, C. E. (1996). “Gaussian processes for regression.” Advances in Neural Information Processing Systems 8, 514–520.
  • Yang, Y. and Dunson, D. B. (2016). “Bayesian manifold regression.” The Annals of Statistics, 44(2): 876–905.
  • Yuan, Y., Zhu, H., Lin, W., and Marron, J. S. (2012). “Local polynomial regression for symmetric positive definite matrices.” Journal of the Royal Statistical Society: Series B, 74: 697–719.