Better Incentive Information

Three strategies for states to use economic development data effectively

Overview

Economic development incentives are one of the primary tools states use to try to strengthen their economies. Every state uses a mix of tax incentives, grants, and loans in an effort to create jobs, encourage business expansions, and achieve other goals. Collectively, states spend billions of dollars a year on incentives, which can significantly affect their budgets, businesses, and economies.

To make these programs work as well as they can, states need good data. Data are necessary for officials to administer incentives and measure their effectiveness.

There are multiple sources of these data. Relevant information can come from businesses, federal records, and other third-party databases. In fact, states already possess, in some form, much of the data they need. But they must ensure that the right people have access to these data, that the information is of high quality, and that it is analyzed effectively. Many states have struggled with these challenges, leaving officials without the information they need to administer incentives well and policymakers unsure of whether the programs are working as intended.

To identify solutions, The Pew Charitable Trusts partnered with the Center for Regional Economic Competitiveness (CREC) to create the business incentives initiative in 2014. Through the initiative, cross-agency work groups from Indiana, Maryland, Michigan, Oklahoma, Tennessee, and Virginia worked closely with Pew and CREC, providing in-depth access to their economic development oversight and management procedures. While the initiative focused primarily on these six states, Pew and CREC also convened stakeholders from 22 additional states. The initiative built on Pew’s ongoing work to help states establish processes to regularly and rigorously evaluate the results of their tax incentives.

From March 2014 to December 2015, Pew conducted numerous interviews and site visits with participating states’ elected lawmakers and economic development, tax, and budget officials in the legislative and executive branches. The six states reviewed their incentive policies and practices to identify possible strengths and weaknesses, and received technical assistance from Pew to design and implement policy improvements.

The business incentives initiative identified strategies to help states:

  • Share relevant data.
  • Ensure data are high quality.
  • Analyze data effectively.

Incentive data challenges states face

States need good data for at least three discrete tasks that are necessary to design and administer effective incentive programs. First, state officials must determine which companies should receive incentives. Then, states need to monitor the performance of participating companies to ensure they are meeting their commitments in exchange for the incentives. Finally, states should analyze the results of incentive programs to identify ways to make these policies more effective.

For each of these tasks, states have often struggled to effectively collect, share, and use data. Different parts of state government may not collaborate effectively with one another. Both the data and the responsibility for analyzing them are diffuse: Numerous agencies have a role in administering incentives, studying them, and collecting relevant tax and economic information. Likewise, much of the information is sensitive—such as the business plans or tax records of specific companies—so state agencies are sometimes reluctant to share it even with one another.

Illinois’ experience shows the value of high-quality tax incentive data and the challenges states face when they lack this information. In 2015, after a months-long investigation, Chicago Tribune journalists reported startling findings about Illinois’ Economic Development for a Growing Economy (EDGE) tax credit, one of the state’s largest business incentives. The Tribune found that officials could not say whether companies participating in the program were boosting net employment in the state. Part of the problem was the absence of good data. For example, the state Department of Commerce awarded incentives to specific business locations but lacked data on whether the same companies were reducing employment elsewhere in Illinois. The cash-strapped state government had committed more than $1 billion to a program with limited evidence of whether it was an effective economic development tool.1

In response to these findings, the administration of Governor Bruce Rauner (R) and Democrats in the state Legislature called for better information on EDGE and other incentives. In November 2015, Gov. Rauner announced that EDGE credit awards would be based on increases in companies’ statewide employment, rather than on employment at specific locations.2

Other states have struggled with similar challenges. In 2012, New Mexico’s Legislative Finance Committee reported that the failure to share tax and labor information across state government was hampering the state’s ability to measure the effectiveness of incentives. For example, the committee wanted to know whether a job training incentive helped to increase workers’ wages. However, state officials lacked access to records housed in the state’s Workforce Solutions Department that would have helped them to answer that question.3 Since the report, state officials have been working to improve data quality and sharing practices to better track the impact of incentives.4

Even when state officials have data, ensuring that they are current and reliable is challenging. In 2014, Connecticut’s Department of Economic and Community Development published a detailed evaluation of the state’s incentives. Yet, because of a lag in the availability of tax data from the state’s revenue agency, the evaluation lacked current information. The report analyzed many of the programs only through 2010.5 A 2015 audit found that the Wisconsin Economic Development Corp., which oversees many of the state’s incentive programs, allowed businesses receiving incentives to self-report job and wage numbers. The audit argued that the self-reported figures were less reliable than other potential sources of data, such as official payroll records.6

When states lack good data on their incentives, policymakers are left unsure of whether the programs have succeeded. As a result, states may miss opportunities to create jobs, boost wages, and strengthen their economies.

A 2012 Hawaii state audit detailed the weaknesses of two incentive programs for high-tech businesses, including significant shortcomings in the data available. The audit found that the state had relied on self-reported information from businesses to determine whether they qualified and had failed to monitor their performance or evaluate programmatic success. Even though the director of the state Department of Taxation had reported significant abuse of the programs years earlier, the state continued to award credits to many companies that should have been deemed ineligible. Recognizing the flaws, lawmakers ended the programs in 2010, but not before they proved to be a major budget commitment. “As a result” of the data deficiencies, the audit found, “the State can neither measure nor ensure the effectiveness of the nearly $1 billion in tax credits.”7 Updated research indicates that the programs’ cost could reach $2 billion.8

Strategies

Economic development incentives are one of the primary tools policymakers use to try to strengthen state economies. When incentives are poorly designed or administered, however, they often provide a weak return for a state’s investment. By collecting and analyzing high-quality data on incentives, states can gather the information they need to make these policies more effective.

From March 2014 to December 2015, Pew and CREC worked with Indiana, Maryland, Michigan, Oklahoma, Tennessee, and Virginia to identify successful policies and practices related to economic development incentive data and to make improvements. These states’ experiences illustrate how others can pursue three key strategies.

Strategy 1: Share relevant data

To administer and analyze incentives effectively, state officials need the ability to access and integrate multiple data sources. For example, the data they require on employment and wages are already collected through the federal Bureau of Labor Statistics’ Quarterly Census of Employment and Wages (QCEW), which is managed by each state’s workforce agency.9 Through business tax returns collected by state revenue departments, they can identify which firms received incentives and how much they were worth. By combining data from sources such as these, states can better understand the performance of incentivized companies and the costs and benefits of these programs.
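
To make the idea concrete, the sketch below shows how an analyst might join incentive award records with QCEW-style employment figures to follow recipients’ employment over time. It is a minimal illustration only; the file names, column layouts, and two-year follow-up window are assumptions, not a description of any state’s actual systems.

```python
# A minimal sketch (hypothetical file layouts and column names): joining incentive
# award records with QCEW-style employment data to follow recipients over time.
import pandas as pd

# Incentive awards from an economic development agency.
# Assumed columns: employer_id, program, award_year, award_amount
awards = pd.read_csv("incentive_awards.csv")

# Employer-level quarterly employment, e.g., derived from state QCEW records.
# Assumed columns: employer_id, year, quarter, employment
qcew = pd.read_csv("qcew_employment.csv")

# Collapse quarterly figures into average annual employment per employer.
annual = (
    qcew.groupby(["employer_id", "year"], as_index=False)
        .agg(avg_employment=("employment", "mean"))
)

def employment_in_year(df, offset, colname):
    """Return annual employment aligned to award_year + offset."""
    shifted = df.assign(award_year=df["year"] - offset)
    return shifted[["employer_id", "award_year", "avg_employment"]].rename(
        columns={"avg_employment": colname}
    )

# Attach employment in the award year and two years later to each award.
merged = (
    awards
    .merge(employment_in_year(annual, 0, "employment_at_award"),
           on=["employer_id", "award_year"], how="left")
    .merge(employment_in_year(annual, 2, "employment_two_years_later"),
           on=["employer_id", "award_year"], how="left")
)

# Net employment change among recipients, summarized by program.
merged["employment_change"] = (
    merged["employment_two_years_later"] - merged["employment_at_award"]
)
print(merged.groupby("program")["employment_change"].describe())
```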

Oklahoma has successfully developed formal data-sharing procedures that overcome bureaucratic barriers and balance the need to protect sensitive information. Oklahoma officials share tax and economic data to help administer one of the state’s major incentives, the Quality Jobs Program. When an applicant is accepted into the program, the business receiving the incentives and officials from various agencies—the Department of Commerce, the Employment Security Commission, and the Tax Commission—sign memorandums of understanding.

These memorandums outline the privacy guidelines state officials must follow when sharing taxpayer data and the penalties those officials face if information is handled improperly. For instance, federal law restricts how state tax officials can use data they receive from the IRS. To reassure businesses that sensitive information will be protected, Oklahoma’s memorandums also specifically note that no IRS-provided data will be involved in the interagency data sharing. At the same time, enough data can pass between agencies for state officials to see whether businesses receiving incentives through the Quality Jobs Program are meeting their commitments, such as increasing payroll and paying above-average wages.10

Recognizing the importance of data sharing, Michigan Governor Rick Snyder (R) signed an executive directive in November 2013 requiring all state departments and agencies to work toward improving data-sharing procedures. The directive also created the Michigan Information Management Governance Board, with membership from each state agency and department.11 The board’s steering committee is reviewing confidentiality laws to verify what information agencies can share with each other. The committee is also considering procedures that would facilitate more data sharing.

In addition to developing formal data-sharing procedures, states also have worked to build stronger working relationships across agencies. To create a more rigorous due diligence process for considering incentive applications, Tennessee needed to enhance collaboration between agencies. Staffers from the Department of Revenue (DOR) and the Department of Economic and Community Development (ECD) meet to discuss decisions on incentive applications and to review data on incentive outcomes. ECD has also collaborated with the Tennessee Department of Labor.12 These types of conversations can set the stage for more formal partnerships. ECD and the Labor Department are developing a formal data-sharing agreement.13 Under a 2015 Tennessee law, ECD will also be responsible for evaluating the state’s economic development tax credits every four years, with the assistance of DOR.14

Strategy 2: Ensure data are high quality

To monitor and evaluate incentives effectively, states need reliable, consistent, up-to-date data.

One key step is to verify information provided by businesses. Maryland’s Department of Labor, Licensing, and Regulation, for example, provides job figures reported by incentive recipients to researchers at the Jacob France Institute at the University of Baltimore. Those researchers compare the totals with QCEW data from the state’s workforce agency. By collaborating with the university, the state receives an independent check on whether the numbers match.15 A limited number of university employees have access to the data and must sign confidentiality agreements. The researchers also must receive approval from state officials before conducting their analyses.16 Likewise, Virginia uses QCEW reports to help monitor businesses’ progress toward job creation goals for various incentive programs.17
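
The sketch below illustrates this kind of independent check: comparing self-reported job counts against employment figures drawn from QCEW records and flagging large gaps. The file names, column names, and 10 percent tolerance are hypothetical, not Maryland’s or Virginia’s actual procedures.

```python
# A minimal sketch (hypothetical columns): cross-checking self-reported job counts
# against QCEW-derived employment figures and flagging large discrepancies.
import pandas as pd

# Jobs reported by incentive recipients. Assumed columns: employer_id, reported_jobs
reported = pd.read_csv("self_reported_jobs.csv")

# Average annual employment from QCEW records. Assumed columns: employer_id, qcew_jobs
qcew = pd.read_csv("qcew_annual_average.csv")

check = reported.merge(qcew, on="employer_id", how="left")

# Flag recipients whose self-reported totals exceed QCEW employment by more than
# 10 percent, or that have no matching QCEW record at all.
check["discrepancy_pct"] = (
    (check["reported_jobs"] - check["qcew_jobs"]) / check["qcew_jobs"] * 100
)
flagged = check[check["qcew_jobs"].isna() | (check["discrepancy_pct"] > 10)]
print(flagged[["employer_id", "reported_jobs", "qcew_jobs", "discrepancy_pct"]])
```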

Technology solutions can also improve data quality and make it easier for agencies to share real-time information on incentives. Michigan uses the cloud computing application Salesforce as a centralized repository of economic development incentive data. State agencies and localities can enter and review information, such as the activities of incentive recipients, site visits to those businesses, and incentive disbursements.18

The Michigan Economic Development Corp. (MEDC), the state’s lead economic development agency, also created a data integrity team to reduce data entry errors that slow down the process of administering incentives. The team conducts quality assurance reviews of information entered into Salesforce. This step helps MEDC more quickly identify errors, such as rejected projects appearing as active in the database. The team also compiles these data entry errors into a report distributed throughout MEDC. One year after introducing the data integrity team, MEDC had reduced the error rate on initial data entries. This has allowed officials to spend less time checking data for administrative errors and has streamlined the process of overseeing incentive programs.19
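
The sketch below shows what automated quality-assurance checks in this spirit might look like when run against an export of project records. The field names and specific rules, such as flagging rejected projects that still appear active, are illustrative assumptions rather than MEDC’s actual review procedures.

```python
# A minimal sketch (hypothetical field names): automated quality-assurance checks
# on project records exported from an incentive-tracking system.
import pandas as pd

# Assumed columns: project_id, decision, status, award_amount, disbursed_amount
projects = pd.read_csv("incentive_projects.csv")

issues = []

# Rejected projects should never carry an "active" status.
bad_status = projects[
    (projects["decision"] == "rejected") & (projects["status"] == "active")
]
issues.append(("rejected_but_active", bad_status["project_id"].tolist()))

# Disbursements should not exceed the approved award amount.
over_disbursed = projects[projects["disbursed_amount"] > projects["award_amount"]]
issues.append(("disbursed_exceeds_award", over_disbursed["project_id"].tolist()))

# Every record needs an award amount; missing values block downstream reporting.
missing_amount = projects[projects["award_amount"].isna()]
issues.append(("missing_award_amount", missing_amount["project_id"].tolist()))

# Compile a simple error report for distribution to program staff.
for check_name, project_ids in issues:
    print(f"{check_name}: {len(project_ids)} record(s) -> {project_ids[:10]}")
```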

Virginia has worked to make its incentive data more consistent. More than a dozen state agencies are involved in economic development policy. Historically, these agencies have used different definitions of key terms such as “jobs” and “capital investment,” making it hard for officials to compare the results of various programs.20 To address this issue and others, state agencies involved in economic development established a working group, which meets regularly to discuss ways to improve data collection and administrative practices.21

Thanks to those efforts, the state published a statewide report in November 2015 providing data on the recipients of specific incentive programs. The working group is developing an interactive online database of incentive projects and has established common definitions for jobs and capital investment.22

Strategy 3: Analyze data effectively

Once states have collected high-quality data, they can use them to study the results of their incentives and determine how those programs can be made more effective.

Since the beginning of 2012, 17 states and the District of Columbia have adopted laws requiring regular evaluation of major tax incentives or have improved existing evaluation processes. Indiana, Maryland, Oklahoma, and Tennessee are in this group. These laws generally require nonpartisan professional staff to study the results of incentive programs on a rotating multiyear cycle. The evaluations draw conclusions about the economic impact of the programs, the extent to which they are achieving their goals, and whether they are being administered efficiently.23

Indiana’s experience shows the value of these efforts. A 2014 evaluation from the state’s Legislative Services Agency (LSA) concluded that two incentives were ineffective.24 In response, lawmakers ended both programs in 2015.25 Like Indiana, numerous states have used evaluations to provide proof that incentives are working well, to redesign programs to improve their effectiveness, or to end outdated or poorly performing incentives.

Good evaluations depend on good data. For that reason, states have included data-sharing provisions in their evaluation laws. For example, a commission made up of private sector appointees and executive branch officials will evaluate Oklahoma’s incentives, potentially with the assistance of a private or academic economist the commission hires. Under Oklahoma’s 2015 evaluation law, state agencies are required to share data with the commission and the economist, which in turn are forbidden from disclosing these data unless allowed to do so under law.26

Maryland’s Department of Legislative Services (DLS) has successfully combined data from various sources to draw conclusions about the results of the state’s incentives. For example, a 2015 evaluation used state and federal data to show that the cost of the state’s film tax credit had increased rapidly at a time when overall film industry employment in Maryland was fairly stagnant.27 The state is still improving upon the data it collects and shares: DLS has used evaluations to highlight additional information that would allow for better analysis in the future.28

In Indiana, the LSA has worked to expand the data available to evaluate incentives. Indiana’s Tax Increment Financing (TIF) program allows local authorities to offer tax breaks in special redevelopment districts. When LSA began studying the program in 2015, a key piece of information was missing for many TIF districts: the year each district was created. LSA responded by contacting the local units of government responsible for hundreds of TIF districts to gather the information. That research helped LSA produce a rigorous evaluation that showed how the creation of a TIF district affected property values and employment.29
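
The sketch below suggests how knowing the creation year enables a simple before-and-after comparison of district outcomes. It is only illustrative; the column names are assumptions, and a rigorous evaluation like LSA’s would also account for comparison areas and other factors.

```python
# A minimal sketch (hypothetical columns): once each TIF district's creation year
# is known, compare outcomes before and after creation.
import pandas as pd

# Assumed columns: district_id, year, assessed_value, employment, creation_year
tif = pd.read_csv("tif_district_panel.csv")

# Label each observation relative to the district's creation year.
tif["period"] = (tif["year"] >= tif["creation_year"]).map(
    {True: "after", False: "before"}
)

# Average outcomes before and after creation, per district, then overall.
by_district = (
    tif.groupby(["district_id", "period"], as_index=False)
       .agg(avg_value=("assessed_value", "mean"),
            avg_employment=("employment", "mean"))
)
print(by_district.groupby("period")[["avg_value", "avg_employment"]].mean())
```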

Besides evaluating program effectiveness, states can draw valuable conclusions by analyzing incentive recipients and their performance. Tennessee’s Department of Revenue, for example, provides policymakers with aggregated data on tax incentive awards, while ECD reports on the geographic distribution of recipients of other incentives.30 This information helps state officials assess whether incentives are serving the state’s overall economic development goals.

Virginia uses QCEW data to monitor more than compliance. Economic development officials continue reviewing incentivized businesses after the performance contract is complete to see whether companies maintain or expand their presence in the state or reduce their Virginia activities. These data allow for improved estimates of the state revenue generated by these projects over time, helping officials assess whether the incentive awards succeeded and whether policy changes would improve results in the future.31

Conclusion

Economic development incentives are among the most consequential choices states make for their budgets and economies. In many states, officials have struggled to ensure that these programs are working well because of a lack of good data. The six states in the business incentives initiative have sought to share relevant data across government, ensure that these data are high quality, and analyze the information effectively. Other states can learn from these examples to make their economic development incentives more effective and accountable. If states do so, their incentives are more likely to help businesses and workers.

Endnotes

  1. Michael J. Berens and Ray Long, “Illinois Businesses Get Lucrative EDGE Tax Breaks, Fall Short of Job Goals,” Chicago Tribune, Oct. 2, 2015, http://www.chicagotribune.com/news/watchdog/ct-illinois-corporate-tax-breaks-met-20151002-story.html.
  2. Illinois Department of Commerce and Economic Opportunity, “Administration Takes Step Forward on Job Creation Tax Credits,” news release, Nov. 10, 2015, http://www.illinois.gov/dceo/Media/PressReleases/Pages/PR20151110.aspx.
  3. New Mexico Legislative Finance Committee, The Job Training Incentive Program, the Local Economic Development Act, and Select Economic Development Tax Expenditures (Aug. 23, 2012), 17, http://www.nmlegis.gov/LCS/lfc/lfcdocs/perfaudit/Job Creation Incentives.pdf.
  4. The Pew Charitable Trusts, interview with Jon Clark and Maria Griego, New Mexico Legislative Finance Committee, Dec. 17, 2015.
  5. Connecticut Department of Economic and Community Development, An Assessment of Connecticut’s Tax Credit and Abatement Programs (September 2014), 1, 4, http://www.ct.gov/ecd/lib/ecd/decd_sb_501_sec_27_report_revised_2013_final.pdf.
  6. Wisconsin Legislative Audit Bureau, Wisconsin Economic Development Corporation (May 8, 2015), 48–49, 59–60, http://legis.wisconsin.gov/lab/reports/15-3full.pdf.
  7. Hawaii Office of the Auditor, Audit of the Department of Taxation’s Administrative Oversight of High-Technology Business Investment and Research Activities Tax Credits (July 2012), ii, 2, 28–39, http://files.hawaii.gov/auditor/Reports/2012/12-05.pdf.
  8. Hawaii Office of the Auditor, “Credits Continue to Tax the State: Follow-Up on Recommendations Made in Report No. 12-05” (September 2015), 1, http://files.hawaii.gov/auditor/Reports/2015/15-11.pdf.
  9. U.S. Bureau of Labor Statistics, “Quarterly Census of Employment and Wages: Frequently Asked Questions,” accessed Jan. 12, 2016, http://www.bls.gov/cew/cewfaq.htm.
  10. The Pew Charitable Trusts, interview with Dawn Cash, secretary, Oklahoma Tax Commission, April 16, 2015.
  11. Executive Directive No. 2013-1 (Nov. 1, 2013), https://www.michigan.gov/documents/snyder/ED_2013-1_439597_7.pdf.
  12. David Gerregano, deputy commissioner, Tennessee Department of Revenue, business incentives initiative kickoff meeting, Washington, May 7, 2014.
  13. The Pew Charitable Trusts, interview with Sally Haar, director, Center for Economic Research in Tennessee, Jan. 20, 2016.
  14. Tennessee House Bill No. 291, enacted May 20, 2015, http://www.tn.gov/sos/acts/109/pub/pc0504.pdf.
  15. The Pew Charitable Trusts, interview with Michael Grandel and Christine Rose, Maryland Department of Business and Economic Development, May 18, 2015.
  16. The Pew Charitable Trusts, interview with Ting Zhang, assistant professor, University of Baltimore, Aug. 20, 2015.
  18. The Pew Charitable Trusts, interview with Steve Bakkal, senior policy director, Michigan Economic Development Corp., Feb. 2, 2016.
  18. The Pew Charitable Trusts interview with Steve Bakkal, senior policy director, Michigan Economic Development Corp., Feb., 2, 2016.
  19. Ibid.
  20. Virginia Joint Legislative Audit and Review Commission, Review of State Economic Development Incentive Grants (November 2012), 35, 84–86, http://jlarc.virginia.gov/pdfs/reports/Rpt431.pdf.
  21. The Pew Charitable Trusts, interview with McClintock.
  22. Virginia Economic Development Partnership, Effectiveness of Economic Development Incentive Grant Programs Administered by the Commonwealth of Virginia (Nov. 15, 2014), 33–36, http://www.yesvirginia.org/Content/pdf/Library/Fiscal Year 2013-2014 HB1191 Report.pdf.
  23. The Pew Charitable Trusts, “States Make Progress Evaluating Tax Incentives,” updated March 2016, http://www.pewtrusts.org/en/research-and-analysis/fact-sheets/2015/01/tax-incentive-evaluation-law-state-fact-sheets.
  24. Indiana Legislative Services Agency, “Indiana Tax Incentive Review” (November 2014), 11–12, 19, https://iga.in.gov/static-documents/7/1/0/1/710134cd/indiana_tax_incentive_review_2014_annual_report.pdf.
  25. Indiana House Bill No. 1142, enacted April 17, 2015, https://iga.in.gov/legislative/2015/bills/house/1142.
  26. Oklahoma House Bill No. 2182, enacted April 27, 2015, http://webserver1.lsb.state.ok.us/2015-16bills/HB/hb2182_enr.rtf.
  27. Maryland Department of Legislative Services, Evaluation of the Maryland Film Production Activity Tax Credit (September 2015), 42–47, http://dls.state.md.us/data/polanasubare/polanasubare_taxnfispla/Evaluation-of-the-Maryland-Film-Production-Activity-Tax-Credit.pdf.
  28. The Pew Charitable Trusts, interview with Robert Rehrmann, policy analyst, Maryland Department of Legislative Services, May 18, 2015.
  29. Indiana Legislative Services Agency, 2015 Indiana Tax Incentive Evaluation (Oct. 2, 2015), 97–109, https://iga.in.gov/static-documents/6/d/e/c/6dec6072/indiana_tax_incentive_review_2015_annual_report.pdf.
  30. The Pew Charitable Trusts, interview with David Gerregano, Jan. 15, 2016; email interview with Sally Haar, Jan. 20, 2016.
  31. The Pew Charitable Trusts, interview with McClintock.