Wildfires in the United States have been getting bigger and more frequent for decades, with a startling shift in recent years: In the period from 2017 to 2021, the average annual acreage burned was 68% larger than the annual average from 1983 to 2016.1 As fires have grown, so has public spending on wildfire management: Combined funding from the U.S. Department of the Interior and the U.S. Forest Service, two of the federal agencies most involved in wildfires, nearly doubled from fiscal year 2011 to 2020 (See Figure 1).2
Determining who is responsible—and who will foot the bill—for wildfire management activities is complicated. States, localities, and the federal government, as well as nongovernmental entities, are all involved in preparing for, fighting, and recovering from fires, and in reducing the risk of future ones.
The Pew Charitable Trusts undertook a study to improve the available data and understanding of the impact of wildfire spending on state fiscal policy. The findings, along with recommendations for policymakers, are available in the report, “Wildfires Burning Through State Budgets.”
Pew’s research shows that in recent years, states’ estimates of wildfire costs have often proved insufficient, forcing them to cover the shortfall using after-the-fact budgeting tools that can obscure the true cost of wildfires. In addition, although investment in cost-saving mitigation activities is growing, resources that could support mitigation are still routinely diverted to fire suppression, limiting mitigation’s potential benefits to communities, the environment, and state budgets. Based on the study’s findings, Pew developed three recommendations:
As a critical piece of the complex intergovernmental system of wildfire management, states have an opportunity to lead in efforts to improve budgeting practices, manage costs through investment in mitigation, and increase the availability of spending data.