Annals of Applied Statistics

Distributed multinomial regression

Matt Taddy

This article introduces a model-based approach to distributed computing for multinomial logistic (softmax) regression. We treat the counts for each response category as independent Poisson regressions, linked through plug-in estimates of fixed effects that are shared across categories. The work is driven by the high-dimensional-response multinomial models used in the analysis of large numbers of random counts. Our motivating applications are in text analysis, where documents are tokenized and the token counts are modeled as arising from a multinomial distribution that depends on document attributes. We estimate such models for a publicly available data set of reviews from Yelp, with text regressed onto a large set of explanatory variables (user, business, and rating information). The fitted models serve as a basis for exploring the connection between words and variables of interest, for reducing dimension into supervised factor scores, and for prediction. We argue that this approach provides an attractive option for social scientists and other text analysts who wish to bring familiar regression tools to bear on text data.
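The factorization described in the abstract can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the author's implementation: each category's counts are fit as a separate Poisson regression with the shared fixed effect plugged in as the log of each observation's total count, and each fit here uses plain unpenalized iteratively reweighted least squares (the paper instead uses penalized estimation). The function names `dmr_fit` and `fit_poisson_category` are invented for this sketch; empty rows (zero total count) are assumed to have been dropped beforehand.

```python
import numpy as np

def fit_poisson_category(X, c, m, iters=25):
    # One Poisson regression c_i ~ Pois(exp(m_i + x_i' b)), with m_i a
    # fixed offset, fit by iteratively reweighted least squares (Newton).
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = np.clip(m + X @ b, -30, 30)  # clip for numerical safety
        mu = np.exp(eta)
        z = X @ b + (c - mu) / mu          # working response, offset removed
        XtW = X.T * mu                     # Poisson variance gives weights mu
        b = np.linalg.solve(XtW @ X, XtW @ z)
    return b

def dmr_fit(X, C):
    # Sketch of the distributed scheme: the multinomial is replaced by
    # independent Poissons, one per response category, tied together only
    # through the plug-in fixed effect m_i = log(total count of row i).
    # Assumes every row of C has a positive total and that X contains an
    # intercept column, so each category gets its own intercept.
    m = np.log(C.sum(axis=1))
    # Each column fits independently, so this loop parallelizes trivially.
    return np.column_stack([fit_poisson_category(X, C[:, j], m)
                            for j in range(C.shape[1])])
```

Because the categories interact only through the precomputed offsets, the per-category fits can be farmed out to separate machines, which is the distributed-computing point of the paper; a penalty (e.g., a lasso) would be added to each per-category fit in practice.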

Article information

Ann. Appl. Stat., Volume 9, Number 3 (2015), 1394-1414.

Received: December 2013
Revised: April 2015
First available in Project Euclid: 2 November 2015

Keywords: Distributed computing; MapReduce; logistic regression; lasso; text analysis; multinomial inverse regression; computational social science


Taddy, Matt. Distributed multinomial regression. Ann. Appl. Stat. 9 (2015), no. 3, 1394--1414. doi:10.1214/15-AOAS831.
