Open Access
December 2017
A Bayesian Nonparametric Approach to Testing for Dependence Between Random Variables
Sarah Filippi, Chris C. Holmes
Bayesian Anal. 12(4): 919-938 (December 2017). DOI: 10.1214/16-BA1027

Abstract

Nonparametric and nonlinear measures of statistical dependence between pairs of random variables are important tools in modern data analysis. In particular, the emergence of large data sets can now support the relaxation of linearity assumptions implicit in traditional association scores such as correlation. Here we describe a Bayesian nonparametric procedure that leads to a tractable, explicit and analytic quantification of the relative evidence for dependence vs. independence. Our approach uses Pólya tree priors on the space of probability measures, which can then be embedded within a decision theoretic test for dependence. Pólya tree priors can accommodate known uncertainty in the form of the underlying sampling distribution and provide an explicit posterior probability measure of both dependence and independence. Well-known advantages of having an explicit probability measure include: easy comparison of evidence across different studies; encoding of prior information; quantification of changes in dependence across different experimental conditions; and integration of results within formal decision analysis.
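In outline, the test places a Pólya tree prior on the joint distribution of the pair under the dependence hypothesis and independent Pólya tree priors on the two marginals under the independence hypothesis; because the Pólya tree evidence is available in closed form, the posterior probability of dependence follows directly from the two marginal likelihoods and the prior probabilities of the hypotheses. The sketch below illustrates this kind of computation and is not the authors' code: the rank transform to the unit square, the truncation depth, the c·j² prior scaling, and the 50/50 prior over hypotheses are assumptions chosen for the example.

```python
# Illustrative sketch (not the authors' implementation): a truncated Polya tree
# Bayes factor for dependence. Assumes continuous data mapped to the unit square
# via ranks, truncation depth `depth`, and Beta/Dirichlet prior scale c * level^2.
import numpy as np
from scipy.special import gammaln


def _log_beta(a, b):
    return gammaln(a) + gammaln(b) - gammaln(a + b)


def pt_log_marginal_1d(u, depth=8, c=1.0):
    """Marginal log-likelihood of u in [0,1] under a truncated Polya tree."""
    def recurse(pts, level, lo, hi):
        if level > depth or pts.size == 0:
            return 0.0
        mid = 0.5 * (lo + hi)
        left, right = pts[pts < mid], pts[pts >= mid]
        a = c * level ** 2  # Beta(c j^2, c j^2) branch prior at level j
        out = _log_beta(a + left.size, a + right.size) - _log_beta(a, a)
        return (out + recurse(left, level + 1, lo, mid)
                + recurse(right, level + 1, mid, hi))
    return recurse(np.asarray(u, float), 1, 0.0, 1.0)


def pt_log_marginal_2d(u, v, depth=8, c=1.0):
    """Joint marginal log-likelihood on [0,1]^2: quadrant splits, Dirichlet priors."""
    def recurse(x, y, level, xlo, xhi, ylo, yhi):
        if level > depth or x.size == 0:
            return 0.0
        xm, ym = 0.5 * (xlo + xhi), 0.5 * (ylo + yhi)
        a = c * level ** 2  # symmetric Dirichlet branch prior at level j
        masks = [(x < xm) & (y < ym), (x < xm) & (y >= ym),
                 (x >= xm) & (y < ym), (x >= xm) & (y >= ym)]
        counts = np.array([m.sum() for m in masks])
        # Dirichlet-multinomial ratio B(a + n) / B(a) over the four quadrants
        out = (gammaln(a + counts).sum() - gammaln(4 * a + counts.sum())
               - 4 * gammaln(a) + gammaln(4 * a))
        boxes = [(xlo, xm, ylo, ym), (xlo, xm, ym, yhi),
                 (xm, xhi, ylo, ym), (xm, xhi, ym, yhi)]
        for m, (x0, x1, y0, y1) in zip(masks, boxes):
            out += recurse(x[m], y[m], level + 1, x0, x1, y0, y1)
        return out
    return recurse(np.asarray(u, float), np.asarray(v, float), 1, 0.0, 1.0, 0.0, 1.0)


def posterior_prob_dependence(x, y, prior_dep=0.5, depth=8, c=1.0):
    """P(dependence | data): joint evidence vs product of marginal evidences."""
    n = len(x)
    u = (np.argsort(np.argsort(x)) + 0.5) / n  # rank transform to (0, 1)
    v = (np.argsort(np.argsort(y)) + 0.5) / n
    log_h0 = pt_log_marginal_1d(u, depth, c) + pt_log_marginal_1d(v, depth, c)
    log_h1 = pt_log_marginal_2d(u, v, depth, c)
    log_bf = log_h1 - log_h0  # log Bayes factor in favour of dependence
    return 1.0 / (1.0 + (1.0 - prior_dep) / prior_dep * np.exp(-log_bf))
```

As a usage note, posterior_prob_dependence(x, y) should return a value near 1 for strongly dependent samples (e.g. y a noisy function of x) and a small value for independent draws. The evidence under independence is simply the product of the two one-dimensional Pólya tree evidences, which is exactly the factorisation that independence implies.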

Citation


Sarah Filippi, Chris C. Holmes. "A Bayesian Nonparametric Approach to Testing for Dependence Between Random Variables." Bayesian Anal. 12(4): 919-938, December 2017. https://doi.org/10.1214/16-BA1027

Information

Published: December 2017
First available in Project Euclid: 21 September 2016

zbMATH: 1384.62146
MathSciNet: MR3724973
Digital Object Identifier: 10.1214/16-BA1027

Keywords: Bayesian nonparametrics, dependence measure, hypothesis testing, Pólya tree
