Background and context
The Office of Planning, Research, and Evaluation (OPRE) routinely administers surveys to evaluate Administration for Children and Families (ACF) programs and better understand the communities in which those programs operate. Surveys enable OPRE to collect and summarize perspectives from large numbers of program staff, partners, and participants with the goal of improving the implementation and delivery of human services. However, declining survey participation rates over the past decade,[1] exacerbated by the COVID-19 pandemic, threaten to undermine the validity of the survey data used to inform important program improvements. When sampled individuals do not respond to a survey request, uncertainty about the accuracy of estimates increases, raising the possibility that survey findings may not adequately represent the needs, perspectives, or experiences of the population of focus. Such nonresponse bias can occur when the attitudes, characteristics, and experiences of respondents differ systematically from those of nonrespondents.
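This relationship can be made concrete with a standard expression from the survey methodology literature (illustrative notation, not drawn from the meeting materials): the bias of an unadjusted respondent mean is roughly the nonresponse rate times the difference between respondents and nonrespondents.

```latex
% Deterministic decomposition of nonresponse bias for a respondent mean
% (standard survey-methodology result; notation is illustrative)
\operatorname{bias}(\bar{y}_r)
  = \bar{Y}_r - \bar{Y}
  = \frac{M}{N}\,\bigl(\bar{Y}_r - \bar{Y}_m\bigr)
```

Here \bar{Y}_r and \bar{Y}_m are the population means among would-be respondents and nonrespondents, and M/N is the nonresponse rate; the bias is small only when the two groups are similar on the outcome or when nearly everyone responds.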
Of particular concern is that marginalized populations or communities with specific communication considerations (e.g., communities where English is not the primary language or Internet access is limited) may be less likely to participate in surveys than other groups. For instance, in 2020, researchers found that households with higher incomes were more likely than those with lower incomes to respond to the U.S. Census Bureau’s Current Population Survey Annual Social and Economic Supplement, upwardly skewing reported income data.[2] When survey respondents do not represent the full population of focus, data collection is inequitable and the resulting data can be biased, limiting the survey’s utility for informing equitable programmatic and policy decisions. It is therefore important for OPRE to understand how to minimize unit missingness (cases in which a sampled person or household does not respond at all) so that evaluations are rigorous and the resulting data are high quality and representative.
Researchers can, however, act at each step of the study design process to avoid nonresponse and to mitigate and address its impact. Practices that improve outreach to and recruitment of high-priority populations—such as engaging community members to build trust and rapport, addressing privacy considerations, and using a range of methods and tools to reach participants—can substantially reduce nonresponse. Innovative questionnaire design and administration techniques, combined with usability and cognitive testing to make surveys more respondent-centered and accessible, can also encourage response.
Data from other sources can also be used to fill gaps left by missing survey data. For example, researchers can collect data through observational methods, such as monitoring behaviors, observing interpersonal interactions, gathering passive data from sensors and geolocation, and collecting social media and digital trace data. Researchers may also link administrative data from private, federal, state, and program sources to their studies to substitute for or supplement missing survey responses. Finally, many analytic techniques are available to measure and mitigate the impact of missing data on survey estimates, assess nonresponse bias, and adjust estimates for nonresponse. Each of these strategies, however, carries its own underlying assumptions, trade-offs, and limitations that researchers should carefully consider.
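To make the last point concrete, the sketch below illustrates one common adjustment technique, a weighting-class nonresponse adjustment, in which respondents' base weights are inflated by the inverse of the weighted response rate within cells formed from information known for the full sample. The data, cell definitions, and variable names are hypothetical and are not drawn from any specific OPRE study.

```python
# Minimal sketch of a weighting-class nonresponse adjustment
# (hypothetical data; real adjustments use frame variables known
# for both respondents and nonrespondents to form the cells).
import pandas as pd

# One row per sampled case: a base design weight, an adjustment cell,
# and an indicator for whether the case responded.
sample = pd.DataFrame({
    "cell":      ["urban", "urban", "urban", "rural", "rural", "rural"],
    "base_wt":   [100.0, 100.0, 100.0, 150.0, 150.0, 150.0],
    "responded": [1, 1, 0, 1, 0, 0],
})

# Weighted response rate within each cell.
cells = (
    sample.assign(resp_wt=sample["base_wt"] * sample["responded"])
          .groupby("cell", as_index=False)
          .agg(total_wt=("base_wt", "sum"), resp_wt=("resp_wt", "sum"))
)
cells["response_rate"] = cells["resp_wt"] / cells["total_wt"]

# Respondents absorb the weight of nonrespondents in their cell, so the
# adjusted weights still sum (approximately) to the full-sample total.
respondents = sample[sample["responded"] == 1].merge(
    cells[["cell", "response_rate"]], on="cell"
)
respondents["adj_wt"] = respondents["base_wt"] / respondents["response_rate"]

print(respondents[["cell", "base_wt", "response_rate", "adj_wt"]])
```

This kind of adjustment removes bias only to the extent that respondents and nonrespondents within a cell are similar on the survey outcomes, which is one of the trade-offs noted above.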
The 2023 OPRE Methods Meeting convened researchers, evaluators, federal staff, and others working in human services research to address nonresponse in survey data. Experts in the field examined the reasons behind declining survey response rates and the potential for increased nonresponse bias, then turned to the implications for human services research, programmatic decisions, and policymaking. Presenters also addressed research design strategies to reduce nonresponse and mitigate the impact of missing data on resulting estimates, and they explored the use of other data sources to supplement surveys and minimize the risk of missing data.
Meeting Topics and Goals
The meeting had the following goals:
The meeting included presentations and discussions on the following questions:
[1] Office of Survey Methods Research. (2022). Household and establishment survey response rates. U.S. Bureau of Labor Statistics. https://www.bls.gov/osmr/response-rates/home.htm
[2] Rothbaum, J., & Bee, A. (2021). Coronavirus infects surveys, too: Survey nonresponse bias and the Coronavirus pandemic. U.S. Census Bureau. https://www.census.gov/content/dam/Census/library/working-papers/2020/demo/sehsd-wp2020-10.pdf
Agenda
1:00-1:10 pm EDT
Lauren Supplee, Deputy Assistant Secretary for Planning, Research, and Evaluation, OPRE
1:10-1:35 pm EDT
Brad Edwards, Vice President & Lead Scientific/Methodology Advisor, Westat
Slides: Statement of the Problem
Emilia Peytcheva, Research Survey Methodologist, RTI International
Slides: Statement of the Problem
1:40-2:40 pm EDT
Mike Brick, Senior Vice President, Westat
Slides: Probability Sampling, Non-Probability Sampling and Nonresponse
Courtney Kennedy, Vice President of Methods and Innovation, Pew Research Center
Slides: Non-probability Sampling
Brady West, Research Professor, University of Michigan
2:50-3:20 pm EDT
Andy Peytchev, Fellow, Survey Methodology, RTI International
Slides: Measuring Nonresponse
3:30-4:50 pm EDT
Ting Yan, Vice President and Associate Director, Westat
Slides: Design Features to Reduce Unit Nonresponse
James Wagner, Research Professor, University of Michigan
Slides: Addressing Nonresponse Using Responsive/Adaptive Survey Design
4:50-5:00 pm EDT
Li Wang, Social Science Research Analyst, Office of Planning, Research, and Evaluation
1:00-1:15 pm EDT
Caitlin Lowery, National Poverty Fellow, Office of Planning, Research, and Evaluation
1:15-2:30 pm EDT
Raphael Nishimura, Director of Sampling Operations, Survey Research Operations, University of Michigan
2:40-3:40 pm EDT
Rupa Datta, Distinguished Senior Fellow, NORC at the University of Chicago
David Judkins, Principal Associate, Social & Economic Policy, Abt Associates
Slides: Addressing Unit Nonresponse with Administrative Data for PACE and HPOG 2.0
3:50-4:50 pm EDT
Panel:
Lauren Supplee, Deputy Assistant Secretary for Planning, Research, and Evaluation, OPRE
Brad Edwards, Vice President & Lead Scientific/Methodology Advisor, Westat
Emilia Peytcheva, Research Survey Methodologist, RTI International
Raphael Nishimura, Director of Sampling Operations, Survey Research Operations, University of Michigan
Moderator:
Kristyn Wong Vandahm, Social Science Research Analyst, Office of Planning, Research, and Evaluation
4:50-5:00 pm EDT
Kelly McGowan, Office of Planning, Research, and Evaluation
Addressing Unit Missingness in Social Policy Research: Summary of 2023 OPRE Methods Meeting
Speakers’ remarks represent their own views and not those of ACF/OPRE.
Session 1: Welcome
Closed caption version
Session 2: Statement of the Problem
Closed caption version
Audio description version
Session 3: Sampling and Recruitment
Closed caption version
Audio description version
Session 4: Measuring Nonresponse
Closed caption version
Audio description version
Session 5: Design and Administration
Closed caption version
Audio description version
Session 6: Nonresponse Reduction and Adjustment Techniques
Closed caption version
Audio description version
Session 7: Uses of Administrative Data to Address Unit Missingness: Experiences from Past OPRE Projects
Closed caption version
Audio description version