The Annals of Applied Statistics

Compared to what? Variation in the impacts of early childhood education by alternative care type

Avi Feller, Todd Grindal, Luke Miratrix, and Lindsay C. Page


Abstract

Early childhood education research often compares a group of children who receive the intervention of interest to a group of children who receive care in a range of different care settings. In this paper, we estimate differential impacts of an early childhood intervention by alternative care type, using data from the Head Start Impact Study, a large-scale randomized evaluation. To do so, we utilize a Bayesian principal stratification framework to estimate separate impacts for two types of Compliers: those children who would otherwise be in other center-based care when assigned to control and those who would otherwise be in home-based care. We find strong, positive short-term effects of Head Start on receptive vocabulary for those Compliers who would otherwise be in home-based care. By contrast, we find no meaningful impact of Head Start on vocabulary for those Compliers who would otherwise be in other center-based care. Our findings suggest that alternative care type is a potentially important source of variation in early childhood education interventions.
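To fix ideas, here is a minimal sketch of the two principal causal effects in standard potential-outcomes notation; the symbols are illustrative and are not necessarily those used in the paper. Let $Z_i \in \{0,1\}$ denote random assignment to the Head Start offer, $D_i(z)$ the care setting child $i$ would experience under assignment $z$, and $Y_i(z)$ the corresponding potential outcome (e.g., receptive vocabulary). The two Complier strata and their principal causal effects are

$$
S_i =
\begin{cases}
\text{home Complier}, & D_i(1) = \text{Head Start},\ D_i(0) = \text{home-based care},\\
\text{center Complier}, & D_i(1) = \text{Head Start},\ D_i(0) = \text{other center-based care},
\end{cases}
$$

$$
\tau_{\text{home}} = \mathrm{E}\bigl[Y_i(1) - Y_i(0) \mid S_i = \text{home Complier}\bigr],
\qquad
\tau_{\text{center}} = \mathrm{E}\bigl[Y_i(1) - Y_i(0) \mid S_i = \text{center Complier}\bigr].
$$

Because a child's care setting is observed only under the assignment actually received, stratum membership is partially latent and the observed-data likelihood takes a finite-mixture form; this is the setting in which the Bayesian principal stratification framework described in the abstract operates.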

Article information

Source
Ann. Appl. Stat. Volume 10, Number 3 (2016), 1245-1285.

Dates
Received: May 2015
Revised: January 2016
First available in Project Euclid: 28 September 2016

Permanent link to this document
https://projecteuclid.org/euclid.aoas/1475069607

Digital Object Identifier
doi:10.1214/16-AOAS910

Mathematical Reviews number (MathSciNet)
MR3553224

Keywords
Principal stratification; early childhood education; treatment effect variation; Head Start

Citation

Feller, Avi; Grindal, Todd; Miratrix, Luke; Page, Lindsay C. Compared to what? Variation in the impacts of early childhood education by alternative care type. Ann. Appl. Stat. 10 (2016), no. 3, 1245–1285. doi:10.1214/16-AOAS910. https://projecteuclid.org/euclid.aoas/1475069607


Supplemental materials

  • Supplement to “Compared to what? Variation in the impacts of early childhood education by alternative care type”. This file contains supporting material, additional results, and proofs.