Addressing Unit Missingness in Social Policy Survey Research

Meeting Topic

Background and Context

The Office of Planning, Research, and Evaluation (OPRE) routinely administers surveys to evaluate the Administration for Children and Families’ programs and better understand the communities in which those programs operate. Surveys enable OPRE to collect and summarize perspectives from large numbers of program staff, partners, and participants with the goal of improving the implementation and delivery of human services. However, declining survey participation rates over the past decade,[1] exacerbated by the COVID-19 pandemic, threaten to undermine the validity of the survey data used to inform important program improvements. When participants do not respond to a survey request, uncertainty about the accuracy of estimates increases, and survey findings may not adequately represent the needs, perspectives, or experiences of the population of focus. Such nonresponse bias occurs when the attitudes, characteristics, and experiences of respondents differ systematically from those of nonrespondents.
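
A standard textbook decomposition makes this concrete (added here for illustration; the notation is ours and does not appear in the meeting materials). Split a population of size N into R would-be respondents with mean value Ȳ_R and M nonrespondents with mean Ȳ_M:

```latex
% Deterministic decomposition of nonresponse bias (illustrative notation):
% N = population size, split into R would-be respondents and M nonrespondents.
\[
\bar{Y} \;=\; \frac{R}{N}\,\bar{Y}_R \;+\; \frac{M}{N}\,\bar{Y}_M
\qquad\Longrightarrow\qquad
\operatorname{Bias}\bigl(\bar{y}_r\bigr) \;=\; \bar{Y}_R - \bar{Y}
\;=\; \frac{M}{N}\,\bigl(\bar{Y}_R - \bar{Y}_M\bigr)
\]
```

The bias of the respondent mean thus scales with both the nonresponse rate M/N and the gap between respondents and nonrespondents, which is why a falling response rate is harmful precisely when the two groups differ.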

Of particular concern is that marginalized populations or communities with specific communication considerations (e.g., communities where English is not the primary language or Internet access is limited) may be less likely to participate in surveys than other groups. For instance, in 2020, researchers found that households with high incomes were more likely to respond to the U.S. Census Bureau’s Current Population Survey Annual Social and Economic Supplement than those with lower incomes, upwardly skewing reported income data.[2] When survey respondents do not represent the full population of focus, data collection is inequitable and the data can be biased, limiting the survey’s utility for informing equitable programmatic and policy decisions. Therefore, it is important for OPRE to understand how to minimize unit missingness to ensure evaluations are rigorous and resulting data are high-quality and representative.

Researchers can, however, act at each step of the study design process to prevent nonresponse and mitigate its impact. Practices that improve outreach to and recruitment of high-priority populations, such as engaging community members to build trust and rapport, addressing privacy considerations, and using a range of methods and tools to reach participants, can substantially reduce nonresponse. Innovative questionnaire design and administration techniques, combined with usability and cognitive testing that make surveys more respondent-centered and accessible, can also encourage response.

Data from other sources can also supplement or fill gaps left by missing survey data. For example, researchers can collect data through observational methods, such as monitoring behaviors, observing interpersonal interactions, gathering passive data from sensors and geolocation, and collecting social media and digital trace data. Researchers may also link administrative data from private, federal, state, and program sources to their studies to fill in information missing for nonrespondents. Finally, many analytic techniques are available to measure the extent of missing data, assess nonresponse bias, and adjust survey estimates for nonresponse. Each of these strategies, however, rests on its own underlying assumptions, trade-offs, and limitations that researchers should carefully consider.
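
Many of these adjustment techniques share a common logic: reweight respondents so they also stand in for nonrespondents with similar known characteristics. The sketch below illustrates one standard member of this family, a weighting-class (cell-based) nonresponse adjustment. All weights, cells, and response indicators here are hypothetical; a real study would build cells from frame or administrative variables observed for both respondents and nonrespondents.

```python
import numpy as np

# Hypothetical sample: one entry per sampled case. "cell" is a weighting
# class formed from variables known for respondents and nonrespondents
# alike (e.g., sampling stratum or region).
base_weight = np.array([10.0, 10.0, 10.0, 10.0, 12.0, 12.0, 12.0, 12.0])
cell = np.array([0, 0, 0, 0, 1, 1, 1, 1])
responded = np.array([1, 1, 0, 1, 1, 0, 0, 1], dtype=bool)

adj_weight = base_weight.copy()
for c in np.unique(cell):
    in_cell = cell == c
    # Adjustment factor: total base weight in the cell divided by the
    # respondents' base weight, so respondents carry the weight of
    # nonrespondents with similar known characteristics.
    factor = base_weight[in_cell].sum() / base_weight[in_cell & responded].sum()
    adj_weight[in_cell & responded] *= factor

adj_weight[~responded] = 0.0  # nonrespondents drop out of estimation

# Within each cell, adjusted respondent weights sum to the cell's
# original total, preserving cell-level coverage of the full sample.
print(adj_weight)
```

The untestable assumption, and the central trade-off noted above, is that respondents and nonrespondents within a cell resemble each other on the survey outcomes themselves; when they do not, the adjustment removes less of the bias.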

The 2023 OPRE Methods Meeting convened researchers, evaluators, federal staff, and others working in human services research to address nonresponse in survey data. Experts in the field examined the reasons behind declining survey response rates and the potential for increased nonresponse bias, then turned to the implications for human services research and for programmatic and policy decision-making. Presenters also addressed research design strategies to reduce nonresponse and to mitigate the impact of missing data on resulting estimates, and they explored other data sources that can supplement survey data and reduce the risk of missing data.

Meeting Topics and Goals

The meeting had the following goals:

  • Define unit missingness and discuss its potential implications for human services research and evaluation.
  • Introduce strategies attendees can use to reduce survey nonresponse and mitigate the impact of missing data in their work.
  • Identify data collection methods and non-survey data sources that can be used to supplement or replace survey data.
  • Describe projects that successfully implemented strategies to mitigate survey nonresponse.

The meeting included presentations and discussions on the following questions:

  • Why is nonresponse an important issue in survey research?
  • How can researchers identify patterns of missing data, including which populations have disproportionate nonresponse rates?
  • How do missing data affect the quality of survey estimates and what implications does this have for policymaking and programmatic decisions?
  • How can study and questionnaire designs reduce survey nonresponse?
  • What data analysis techniques and strategies can researchers use to address missing data in their projects?
  • What strategies can researchers use to improve outreach to populations that experience barriers to completing surveys?

Footnotes

[1] Office of Survey Methods Research. (2022). Household and establishment survey response rates. U.S. Bureau of Labor Statistics. https://www.bls.gov/osmr/response-rates/home.htm

[2] Rothbaum, J., & Bee, A. (2021). Coronavirus infects surveys, too: Survey nonresponse bias and the coronavirus pandemic (SEHSD Working Paper 2020-10). U.S. Census Bureau. https://www.census.gov/content/dam/Census/library/working-papers/2020/demo/sehsd-wp2020-10.pdf

Agenda and Presentations

Wednesday, October 18

Welcome

1:00-1:10 pm EDT

Lauren Supplee, Deputy Assistant Secretary for Planning, Research, and Evaluation, OPRE

Statement of the Problem

1:10-1:35 pm EDT

Brad Edwards, Vice President & Lead Scientific/Methodology Advisor, Westat

Slides: Statement of the Problem

Emilia Peytcheva, Research Survey Methodologist, RTI International

Slides: Statement of the Problem


Sampling and Recruitment

1:40-2:40 pm EDT

Mike Brick, Senior Vice President, Westat

Slides: Probability Sampling, Non-Probability Sampling and Nonresponse

Courtney Kennedy, Vice President of Methods and Innovation, Pew Research Center

Slides: Non-probability Sampling

Brady West, Research Professor, University of Michigan

Slides: Best Practices for Survey Recruitment

Measuring Nonresponse

2:50-3:20 pm EDT

Andy Peytchev, Fellow, Survey Methodology, RTI International

Slides: Measuring Nonresponse

Design and Administration

3:30-4:50 pm EDT

Ting Yan, Vice President and Associate Director, Westat

Slides: Design Features to Reduce Unit Nonresponse

James Wagner, Research Professor, University of Michigan

Slides: Addressing Nonresponse Using Responsive/Adaptive Survey Design

Wrap-up

4:50-5:00 pm EDT

Li Wang, Social Science Research Analyst, Office of Planning, Research, and Evaluation

Thursday, October 19

Welcome

1:00-1:15 pm EDT

Caitlin Lowery, National Poverty Fellow, Office of Planning, Research, and Evaluation

Nonresponse Reduction and Adjustment Techniques

1:15-2:30 pm EDT

Raphael Nishimura, Director of Sampling Operations, Survey Research Operations, University of Michigan

Slides: Nonresponse Reduction and Adjustment Techniques

Uses of Administrative Data to Address Unit Missingness: Experiences from Past OPRE Projects

2:40-3:40 pm EDT

Rupa Datta, Distinguished Senior Fellow, NORC at the University of Chicago

Slides: Using Administrative Data to Mitigate Missing Data in Surveys: Design through Survey Data File Preparation

David Judkins, Principal Associate, Social & Economic Policy, Abt Associates

Slides: Addressing Unit Nonresponse with Administrative Data for PACE and HPOG 2.0

Reflection: Where Do We Go From Here?

3:50-4:50 pm EDT

Panel:
Lauren Supplee, Deputy Assistant Secretary for Planning, Research, and Evaluation, OPRE
Brad Edwards, Vice President & Lead Scientific/Methodology Advisor, Westat
Emilia Peytcheva, Research Survey Methodologist, RTI International
Raphael Nishimura, Director of Sampling Operations, Survey Research Operations, University of Michigan

Moderator:
Kristyn Wong Vandahm, Social Science Research Analyst, Office of Planning, Research, and Evaluation

Wrap-up

4:50-5:00 pm EDT

Kelly McGowan, Office of Planning, Research, and Evaluation

Meeting Products

Check back soon for 2023 Methods Meeting products.