The American Rescue Plan Act (ARPA) provides funding for states to address immediate challenges related to the COVID-19 pandemic and to develop sustainable solutions to help their economies recover. To meet these goals, the law includes $195 billion in flexible funding for states to use for various services and capacity-building efforts. And in a push to improve the efficacy of state programs, the money can be used for “data analysis, targeted consumer outreach, improvements to data or technology infrastructure, and impact evaluations.”
Since ARPA’s enactment in March 2021, Results First has encouraged states to focus on that last potential use of the federal money: investing in impact evaluations. These are rigorous assessments of programs’ effectiveness that can inform decisions about which interventions to support, revise, or eliminate; they are key components of evidence-based policymaking efforts.
In recent months, leaders in Results First partner states such as Colorado and New Mexico have made their first round of decisions about how they intend to spend the new flexible federal funds to support program evaluation and more general evidence-based policymaking goals.
Colorado decision-makers plan to invest ARPA funds broadly to support evaluation work across agencies. The state statute governing this spending explicitly allows agencies to invest in impact evaluations. Such explicit language helps ensure that agencies understand that the governor and the legislature both see evaluation and the use of evidence in decision-making as priorities.
In addition, guidance from the Office of State Planning and Budgeting asks each agency to submit an ARPA spending plan for this fiscal year to the governor’s office for review and approval. These plans must include a section detailing anticipated evaluation efforts and evidence of effectiveness for specific proposals. The governor’s office then will determine whether the plans comply with the state’s prioritization of evidence use.
Colorado legislators have passed bills this year allocating ARPA funds with set-asides for evaluations of specific groups of programs in areas such as behavioral health, workforce development, and education. These measures also require agencies to report on program outcomes and evaluation results.
Lawmakers also established interim legislative committees, focused on policy areas such as economic recovery and education, to appropriate ARPA funding that has not yet been spent. These committees can encourage agencies to invest in evaluation in similar ways. By supporting evaluation across multiple policy areas and through multiple strategies, including legislation and agency spending plans, Colorado leaders hope to bolster evidence-based policymaking throughout state government: more funding for evaluation overall, and data that can inform decisions about programs in the coming years.
New Mexico has been more targeted in its approach, focusing specifically on education. ARPA’s Elementary and Secondary School Emergency Relief fund requires that 20% of education funding support evidence-based programming. To help fulfill this requirement, the state Legislative Finance Committee (LFC) supported efforts by the New Mexico Public Education Department (PED) and local school districts to determine which programs were evidence-based and eligible for funding.
To accomplish this, LFC updated its education-related program inventories, the comprehensive lists of the programs that jurisdictions fund in particular policy areas. That should help the PED and local districts identify—and invest in—effective evidence-based programs that address the needs of specific populations and close potential service gaps.
LFC staff members conducted cost-benefit analyses—when sufficient data were available—for each of these programs to showcase the potential return on new investments. Their findings were grouped into program categories, such as extended learning, student engagement, and social-emotional learning. Committee staff then entered this information into a matrix showing which programs, according to available research, were both highly effective and likely to produce a high return on investment.
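To illustrate the kind of matrix this describes—not LFC’s actual methodology or data—the sketch below sorts programs into effectiveness and return-on-investment quadrants. The program names, effect sizes, benefit-cost ratios, and thresholds are all hypothetical.

```python
# Hypothetical sketch: sorting programs into an effectiveness / return-on-investment matrix.
# Program names, effect sizes, and benefit-cost ratios are illustrative, not LFC data.

programs = [
    {"name": "Extended learning time", "effect_size": 0.25, "benefit_cost_ratio": 3.1},
    {"name": "Student engagement mentoring", "effect_size": 0.05, "benefit_cost_ratio": 1.8},
    {"name": "Social-emotional learning curriculum", "effect_size": 0.30, "benefit_cost_ratio": 0.9},
]

# Hypothetical thresholds: an effect size of 0.20 or more counts as "highly effective,"
# and a benefit-cost ratio above 1.0 means estimated benefits exceed costs.
EFFECT_THRESHOLD = 0.20
BCR_THRESHOLD = 1.0

for program in programs:
    effective = program["effect_size"] >= EFFECT_THRESHOLD
    high_return = program["benefit_cost_ratio"] > BCR_THRESHOLD
    quadrant = (
        "high effectiveness / high return" if effective and high_return
        else "high effectiveness / low return" if effective
        else "low effectiveness / high return" if high_return
        else "low effectiveness / low return"
    )
    print(f'{program["name"]}: {quadrant}')
```

In practice, such a matrix would draw on published evaluation literature and state cost data rather than fixed cutoffs like these.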
As states continue to allocate remaining ARPA funding, these examples can help inform upcoming rounds of budgeting and implementation. Impact evaluations give policymakers credible information about what works, which can guide future decisions. Governors can include similar investments in evaluation in their next budget submissions, and legislatures can work with agencies to determine where the need for investments in evaluation and research capacity is greatest.
States would be well served to continue to invest in these evaluations and the staff capacity to conduct them so they can scale up pilot projects that are shown to be effective, improve programs with promising results, and replace those that fail to achieve intended results.
Sara Dube is a project director and Alex Sileo is a senior associate with the Results First initiative.