Active matrix factorization for surveys
Chelsea Zhang, Sean J. Taylor, Curtiss Cobb, Jasjeet Sekhon
Ann. Appl. Stat. 14(3): 1182-1206 (September 2020). DOI: 10.1214/20-AOAS1322

Abstract

Amid historically low response rates, survey researchers seek ways to reduce respondent burden while measuring desired concepts with precision. We propose to ask fewer questions of respondents and impute missing responses via probabilistic matrix factorization. A variance-minimizing active learning criterion chooses the most informative questions per respondent. In simulations of our matrix sampling procedure on real-world surveys as well as a Facebook survey experiment, we find active question selection achieves efficiency gains over baselines. The reduction in imputation error is heterogeneous across questions and depends on the latent concepts they capture. Modeling responses with the ordered logit likelihood improves imputations and yields an adaptive question order. We find for the Facebook survey that potential biases from order effects are likely to be small. With our method, survey researchers obtain principled suggestions of questions to retain and, if desired, can automate the design of shorter instruments.
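The core idea in the abstract — impute a respondent's unasked answers from a low-rank factor model, and actively pick the next question to minimize predictive variance — can be illustrated with a minimal sketch. This is not the authors' implementation: the item factor matrix `V`, the Gaussian response model (the paper also uses an ordered logit likelihood), and all dimensions and variances below are assumptions chosen for illustration; in practice `V` would be estimated from historical survey responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: item factors V (d x m) would come from a matrix
# factorization fit on past survey data; here they are simulated.
d, m = 3, 10                      # latent dimension, number of questions
V = rng.normal(size=(d, m))       # item factor matrix (one column per question)
sigma2, tau2 = 0.5, 1.0           # response noise variance, trait prior variance

u_true = rng.normal(scale=np.sqrt(tau2), size=d)  # respondent's latent trait

def posterior(asked, answers):
    """Gaussian posterior over the respondent's trait given observed answers."""
    Vs = V[:, asked]                               # factors of asked items
    prec = np.eye(d) / tau2 + Vs @ Vs.T / sigma2   # posterior precision
    cov = np.linalg.inv(prec)
    mean = cov @ (Vs @ np.asarray(answers)) / sigma2
    return mean, cov

def next_question(asked, cov):
    """Variance-based active selection: ask the item whose predicted
    response is currently most uncertain under the posterior."""
    remaining = [j for j in range(m) if j not in asked]
    pred_var = [V[:, j] @ cov @ V[:, j] for j in remaining]
    return remaining[int(np.argmax(pred_var))]

asked, answers = [], []
mean, cov = np.zeros(d), np.eye(d) * tau2
for _ in range(4):                 # ask only 4 of the 10 questions
    j = next_question(asked, cov)
    asked.append(j)
    answers.append(V[:, j] @ u_true + rng.normal(scale=np.sqrt(sigma2)))
    mean, cov = posterior(asked, answers)

# Impute all unasked responses from the posterior mean of the trait.
imputed = V.T @ mean
```

Each observed answer tightens the posterior over the latent trait, so later question choices adapt to earlier responses — the adaptive question order the abstract refers to.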

Citation


Chelsea Zhang, Sean J. Taylor, Curtiss Cobb, Jasjeet Sekhon. "Active matrix factorization for surveys." Ann. Appl. Stat. 14 (3): 1182-1206, September 2020. https://doi.org/10.1214/20-AOAS1322

Information

Received: 1 June 2019; Revised: 1 December 2019; Published: September 2020
First available in Project Euclid: 18 September 2020

MathSciNet: MR4152129
Digital Object Identifier: 10.1214/20-AOAS1322

Rights: Copyright © 2020 Institute of Mathematical Statistics

Journal article, 25 pages

