Background and Context. There is increasing demand for data-driven approaches that examine program implementation, impact, and improvement efforts more quickly, even in real time, to provide feedback and input for program modification. Numerous approaches exist to support such rapid learning (e.g., Rapid Cycle Evaluation and Continuous Quality Improvement, including specific strategies such as Plan-Do-Study-Act cycles). All use data, some employ methods to determine causality, and some incorporate advanced statistical methods to make predictions. Unfortunately, this broad array of methods, along with overlapping and sometimes unclear terminology, may make it difficult to determine which methods are most appropriate for the specific challenges facing a program.
OPRE’s 2018 research methods meeting sought to increase the field’s understanding of how rapid learning methods can be helpful and to share information about those methods so that researchers and practitioners can select the most appropriate approach for evaluating programs. The meeting focused on rapid learning methods designed to quickly test program improvements and evaluate program implementation or impact. These approaches often use program data to monitor and measure improvements and are typically embedded in a cycle of learning. Selecting the most appropriate methodology depends on numerous factors, including the research questions being asked; what data are available to address those questions; who is asking the questions and conducting the research; how the results will be used; the required level of confidence in analysis results; the level of disruption to the program; whether the program being evaluated is new or established; the potential costs of implementing the method; and the relevant decision cycle.
Meeting Topics and Goals. The 2018 research methods meeting provided attendees with a deeper understanding of rapid learning methods and their underlying methodologies. Speakers described key questions that can be addressed by rapid learning methods, along with relevant considerations for selecting the most appropriate method for different contexts. They shared their expertise in understanding and supporting rapid learning efforts, and they provided examples of successful implementation among federally funded programs as part of a broader conversation about supporting program improvement and decision-making. The meeting included presentations and discussions on the following questions:
The goals of the meeting were to:
9:00 – 9:30 a.m.
Naomi Goldstein (Deputy Assistant Secretary for Planning, Research, and Evaluation)
Emily Schmitt (Deputy Director for the Office of Planning, Research, and Evaluation)
9:30 – 10:30 a.m.
Scott Cody (Insight Policy Research)
MaryCatherine Arbour (Harvard University)
10:45 a.m. – 12:00 p.m.
Slide Deck: The Rapid Cycle Evaluation Coach: Building Capacity and Informing Decisions
Kate Place (Mathematica Policy Research)
Slide Deck: The Breakthrough Series Collaborative on Trauma Informed Early Care and Education
Anne Douglass (University of Massachusetts – Boston)
Slide Deck: Using Rapid Learning Methods to Design and Test Promising Interventions for Low-Income Families: Jefferson County (CO) Department of Human Services
Michelle Derr, Annalisa Mastri (Mathematica Policy Research)
1:30 – 3:00 p.m.
Slide Deck: Improvement Science with a Twist: Embedding an Experimental Test of Improvement Strategies into Routines of Practice
Rebecca Maynard (University of Pennsylvania)
Slide Deck: Supporting Multisystemic Therapy & Functional Family Therapy Implementation with CQI & Evaluation in Maryland
Jill Farrell (University of Maryland)
Rekha Balu (MDRC)
3:15 – 5:00 p.m.
Jodi Sandfort (University of Minnesota)
Tyson Barker (University of Oregon)
Robert Goerge (Chapin Hall)
Bi Vuong (Harvard University)
Virginia Knox (MDRC)
All are welcome to continue the discussion at the 21st Amendment Bar & Grill (inside the hotel). This is an informal, no-host gathering; drinks and food are available for purchase.
9:00 – 10:45 a.m.
Kinsey Dinan (NYC Department of Social Services)
Nick Hart (Bipartisan Policy Center)
Jennifer Lloyd (The Centers for Medicare & Medicaid Services)
Erica Zielewski (Office of Management & Budget)
11:00 a.m. – 12:00 p.m.
Mary Mackrain (Education Development Center)
Julia Heany (Michigan Public Health Institute)
Rapid Learning: Methods for Testing and Evaluating Change in Social Service Programs (Methods Meeting Summary)
Rapid Learning: Methods to Examine and Improve Social Programs (Video)
Audio Description Version
Closed Caption Version