Approaches for Designing Job Creation Tax Incentives
Pew responds to questions from the Philadelphia Department of Commerce
On Feb. 21, 2023, The Pew Charitable Trusts submitted a memo to the Philadelphia Department of Commerce in response to the agency’s questions about the city’s job creation program, including how to evaluate it, manage undersubscription and oversubscription, connect it to workforce objectives, create metrics to score applicants, and incorporate a “but for” assessment in the award process. The memo also provided examples of apprenticeship and training programs linked with tax incentives in other states.
To: Dawn Summerville, Duane Bumb, Philadelphia Dept. of Commerce
From: Alison Wakefield and Logan Timmerhoff, The Pew Charitable Trusts
Date: February 21, 2023
Subject: Pew Comments on the Quality Jobs Grant Program and High-Impact Forgivable Loan
This memo responds to questions your team asked in advance of launching two new programs: the Quality Jobs Grant Program (QJGP) and the High-Impact Forgivable Loan Program (HIFLP).
Broadly, these questions concerned:
- Evaluation of the Quality Jobs Grant Program
- Approaches to manage under-subscription in the program
- Approaches to manage over-subscription in the program
- Connecting the grant program to workforce objectives
- Metrics for measuring/scoring HIFLP applicants
- Incorporating a “but for” assessment in the HIFLP award process
- Relevant examples of apprenticeship and training programs linked with incentives
We have also provided a copy of our memo from March 2020, a sample logic chain with program outcome metrics and economic indicator benchmarks, and an illustrative way of identifying QJGP’s program requirements with corresponding verification sources and oversight roles.
QJGP Program Questions:
1. The Quality Jobs Grant Program is designed as an entitlement grant ultimately intended to replace the Job Creation Tax Credit (JCTC) program. For this reason, QJGP offers the same incentive per job ($5,000) and does not include preferences by geography or industry type. How should evaluation be structured to fairly compare the two programs (grant vs. tax credit)?
High-quality evaluations start by identifying a program’s goals, so analysts know what to measure and whether a program has met expectations. Many evaluators group programs with similar objectives so they can compare the results. Pew has drafted a sample logic chain for QJGP with illustrative program outcome metrics and economic benchmark indicators to help Commerce with this exercise. Many of the indicators were sourced from this report.
The challenge in this instance is that QJGP and the JCTC seem to have different objectives in addition to different eligibility requirements. Commerce has described QJGP as a small business workforce incentive meant to increase hiring and employment among un- and underemployed Philadelphians. JCTC, meanwhile, is a more “traditional” job creation incentive with broad business eligibility requirements. These differences can make it a challenge to compare the outcomes of the two programs.
Though it may be difficult to directly compare the programs, Commerce could articulate policy objectives it hopes to achieve through QJGP that it was not able to achieve under JCTC, and then evaluate whether QJGP met those objectives. This could include an assessment of whether QJGP resulted in stronger connections with the workforce development system, more incented positions filled by Philadelphia residents, or desired changes in the mix of companies receiving the incentives. However Commerce decides to measure program performance, it will need to ensure it can access historical data from JCTC and that it is collecting the relevant data for QJGP.
Another option may be to design a study wherein both programs run concurrently, and applicants are assigned to a program based on a predetermined sampling methodology. This would enable Commerce to collect new, comparative data on JCTC applicants and more directly compare the programs. This approach would likely require a skilled researcher to participate in the design of the study, data collection process, sample design, and other efforts before the new program is launched.
Please see section 1d in Pew’s March 2020 memo for additional evaluation considerations.
2. If QJGP utilization remains low during the first year of launch, what adjustments should be considered to increase interest/utilization? Increase the amount of the grant? Increase the performance term? Add a job training linkage to improve access to city-resident job seekers?
We have seen several reasons for low program utilization in other cities and states, including:
- Lack of awareness among targeted businesses,
- Inability to identify or hire the required workforce,
- Complex application processes, and
- Onerous compliance requirements.
Commerce could consider different mitigation steps depending on the reason for low utilization. For instance, if few businesses express interest in the program, it could be due to low awareness. This could indicate a need to expand or adapt marketing efforts to promote the program, or increase the accessibility of program materials by offering them in more languages and in different formats.
Alternatively, if the city finds a low conversion rate of businesses applying for the grant after expressing initial interest (via the interest form), there may be another reason for low uptake. Perhaps the application process is too complex, or applicants don’t feel they have the technical or financial capacity to comply with the program’s requirements. Depending on the situation, Commerce could streamline forms and the application process; review the compliance process for pain points for businesses, such as reducing the amount of data needed for compliance by identifying a pre-existing alternative source (see pages 58-67 for examples of challenges evaluators identified through surveys of businesses participating in a tax credit program in Minnesota); or provide assistance to businesses during the application and compliance processes.
The interest form could be a valuable data source for exploring why businesses do or do not apply, for example through follow-up surveys.
3. Conversely, if QJGP demand exceeds the initial funding authorization ($1 million being reserved at program launch), how should the City prioritize applications?
States and cities have handled this in different ways, each with its own set of policy implications. Pew’s March 2020 memo addresses some of the ways states handle awards and control costs (see Section 1c and Pew’s publication on program caps).
If Commerce is concerned about oversubscription, it may wish to use a scoring system or hybrid approach so it can target benefits to the strongest applications. As noted in our March 2020 memo (Section 1d, Evaluation), a scoring system could facilitate a high-quality evaluation comparing outcomes of businesses that do and do not receive an incentive. Alternatively, Commerce could set multiple award rounds during the year to give businesses more time to prepare applications, and the department opportunities to course-correct, before allocating the program’s full funding.
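The capped, score-ranked approach described above can be sketched in a few lines. The applicant names, rubric scores, and job counts below are hypothetical, as is the ranking rule (fund the highest-scoring applications first until the $1 million reservation is exhausted); an actual rubric would reflect Commerce’s own criteria.

```python
# Illustrative sketch of a capped, score-ranked award round.
# All applicant data and scores are hypothetical, not actual QJGP criteria.

def allocate_awards(applications, cap=1_000_000, per_job_grant=5_000):
    """Rank applications by rubric score and fund them until the cap is reached.

    Each application is a dict with hypothetical keys:
      'name'  - applicant identifier
      'score' - rubric score (higher is stronger)
      'jobs'  - number of incented jobs requested
    Returns (awards, remaining_funds).
    """
    awards = []
    remaining = cap
    for app in sorted(applications, key=lambda a: a["score"], reverse=True):
        request = app["jobs"] * per_job_grant
        if request <= remaining:  # skip requests the remaining funds can't cover
            awards.append((app["name"], request))
            remaining -= request
    return awards, remaining

apps = [
    {"name": "A", "score": 88, "jobs": 40},   # requests $200,000
    {"name": "B", "score": 95, "jobs": 120},  # requests $600,000
    {"name": "C", "score": 70, "jobs": 80},   # requests $400,000
    {"name": "D", "score": 60, "jobs": 30},   # requests $150,000
]
awards, left = allocate_awards(apps)
# B and A are funded first; C's request exceeds what remains, so D is funded instead.
```

One design choice to note: skipping over a too-large request (as above) maximizes funds deployed but lets a lower-scoring applicant jump the queue, whereas stopping at the first unaffordable request preserves strict score order; either rule should be stated publicly before the round opens.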
4. QJGP’s primary objective is to create job opportunities for unemployed or underemployed city residents. Are there best practice examples of other cities linking job creation incentives to job registries, job training programs or other workforce resources?
While this is not a core area of Pew’s expertise, we can share some thoughts based on applicable experiences with other types of programs.
- Businesses need clear, easily accessible information about program requirements and expectations and procedures in the event of noncompliance. This is often done through performance agreements that articulate the terms by which a business will receive an incentive (see also Section 1c Company Agreements in Pew’s memo). Some evaluations have recommended that program administrators provide more comprehensive compliance resources and supports, such as videos or technical assistance.
- Evaluations often report that a lack of coordination between state or city agencies impedes program effectiveness; the 2019 evaluation of the City’s incentives noted this as well. QJGP is a complex program involving numerous entities (Commerce, PIDC, the City’s workforce partners, and possibly the departments of Revenue and Labor). These entities will need to share data efficiently for compliance, evaluation, and ongoing program administration. Businesses also need to know how to connect with un- and underemployed residents and how to navigate that process.
- Many cities have found it difficult to enforce first source hiring or community benefit agreements. In anticipation of similar challenges, Commerce may wish to establish standards (at least internally) for what qualifies as a good faith effort on the part of businesses to comply with the expectation of hiring un- and underemployed workers.
- Finally, Commerce will need to determine how it will verify program compliance, such as whether new hires qualify as previously un- or underemployed (see the table at the end of this memo for additional compliance suggestions).
Section 1b (Encourage hiring of target residents) of Pew’s memo provides some examples from other cities. Additional resources include:
- Beyond Training: How State Economic Development Agencies Are Helping Companies Develop Talent (CREC, July 2019)
- Repositioning Economic Development Incentives Post-Pandemic (CREC, 2021)
High Impact Forgivable Loan Program Questions:
5. The High Impact Forgivable Loan Program (HIFLP) is intended to be another tool in the City's economic development portfolio that can be utilized where the scale of the business expansion project exceeds the capacity of the QJGP grant program. HIFLP will remain discretionary but will be structured to provide a per-job incentive consistent with the non-discretionary grant program. The loan amount can be adjusted to incentivize higher-wage jobs to maintain high program ROI. What other metrics can be measured/scored while retaining the flexibility of this attraction and retention tool?
And
6. Can other criteria like "but for" test or leveraging other public incentives be factored into a negotiated amount of incentive?
We have grouped these two questions together because they are closely related. Pew’s March 2020 memo includes examples of award processes in Section 1c. The examples from Austin, California, and San Diego use different metrics for assessing proposals, drawn from their economic, community, and workforce development strategies. Commerce could adapt these approaches and use its own scoring rubric to reflect Philadelphia’s priorities.
Governments may not be able to eliminate instances of rewarding businesses for what they would have done without the incentive, but they can take steps to reduce these occurrences. A possible step could be to exclude activity that is already planned or underway. The Texas Enterprise Fund uses this approach in its pre-award analysis. A company cannot have signed a lease, purchased land, or hired employees in the state prior to applying for a grant. Doing so would indicate that the business has already chosen Texas and therefore does not need the incentive to aid in its choice.
Another alternative incorporates a “but for” estimate into the up-front project assessment process. The aim is a more accurate cost-benefit analysis and a clearer sense of whether support is likely to induce new behavior. A recent example from Michigan offers such a framework.
The Michigan Economic Development Corporation (MEDC) asked two outside evaluators to provide options for “but for” analyses of the projects supported through its Michigan Business Development Program (MBDP), a discretionary grant program for business attraction and expansion. One of those evaluators, the Center for Regional and Economic Competitiveness (CREC), devised a “firm choice” approach, which drew on economic literature to determine the factors most likely to affect firms’ investment decisions.1 CREC then developed a weighted scoring rubric to determine whether a particular firm’s investment decision was likely to have been affected by MEDC’s grants.
MEDC has now adapted the firm choice approach for its upfront cost-benefit analysis when evaluating applications for MBDP. Instead of assuming 100% of the proposed business investment is a result of the incentive, MEDC scores proposals using the rubric and discounts the investment by the corresponding “but for” amount. Commerce could develop its own firm choice rubric to perform up-front assessments of whether a business’s hiring decisions are likely to be affected by HIFLP support.
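As a rough illustration of how such a rubric-based discount might work: the factor names, weights, and scores below are hypothetical, not MEDC’s or CREC’s actual rubric. The key idea is that the weighted score becomes the share of the proposed investment counted in the cost-benefit analysis, rather than assuming 100% is induced by the incentive.

```python
# Illustrative "firm choice" scoring sketch. Factor names, weights, and
# the score-to-discount mapping are hypothetical inventions for this example.

# Weighted factors: each score runs 0.0-1.0, where 1.0 means the incentive
# was highly likely to have influenced the firm's decision on that factor.
WEIGHTS = {
    "site_competition": 0.40,  # credible competing out-of-city sites?
    "cost_gap": 0.35,          # does the incentive close a real cost disadvantage?
    "timing": 0.25,            # decision not already made (no lease signed, no hires)?
}

def but_for_share(factor_scores):
    """Weighted average of factor scores = estimated share of the
    investment attributable to the incentive."""
    return sum(WEIGHTS[f] * s for f, s in factor_scores.items())

def discounted_investment(proposed_investment, factor_scores):
    """Count only the 'but for' share of the proposed investment in the
    up-front cost-benefit analysis."""
    return proposed_investment * but_for_share(factor_scores)

# Hypothetical applicant: strong site competition, mixed evidence elsewhere.
scores = {"site_competition": 1.0, "cost_gap": 0.5, "timing": 0.5}
share = but_for_share(scores)                        # 0.40 + 0.175 + 0.125 = 0.70
induced = discounted_investment(10_000_000, scores)  # 70% of a $10M proposal
```

Here a $10 million proposed investment would be credited at $7 million in the cost-benefit analysis, directly reducing the justified incentive relative to a 100% attribution assumption.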
7. If new jobs created aren't directly conditioned upon hiring city residents, are there other examples from other jurisdictions of apprenticeship or training programs that can be linked?
Sections 1a and 1b of the Pew memo and our response to question 4 above include relevant options to consider. We have also found several reports from the Brookings Institution related to this topic, such as this excerpt from a recent report:
“Against this backdrop of declining employer investment in training and the simultaneous increase in demand for skills, the low share of incentives in this analysis that directly incentivize firms to conduct job training and skill development is notable. Only 7 percent of incentives went to job training in San Diego, the highest share in the analysis, followed by Indianapolis at 4.4 percent, and Cincinnati at 1.1 percent. To our knowledge, Salt Lake did not have any incentive programs related to job training. This is not to say that job-training programs are not underway in these communities; they just operate outside of firm-specific economic development incentive programs. That noted, the prominence of incentives within the economic development toolkit and the urgency of employer skills needs suggests a mismatch that economic development departments should examine closely.”
Example List of Program Requirements and Associated Verification Steps
This is an illustrative way of identifying QJGP’s program requirements and corresponding verification sources and oversight roles.
| Program Requirement | When Verified (Application or Close-out) | Source of Information (self-reported by business, Depts. of Revenue or Labor, payroll company) | Actions Needed to Access Appropriate Data (company approval, MOUs with Revenue, Labor, workforce partner?, approval from state labor dept.) |
| --- | --- | --- | --- |
| Job located in Philadelphia | | | |
| Job held by Philadelphian | | | |
| Permanent position working 30 hours per week or 1,500+ hours/year | | | |
| Pays at least Philadelphia Living Wage | | | |
| All employees paid at least Philadelphia Living Wage | | | |
| Provides employer-sponsored health insurance | | | |
| Provides paid time off (1 hour per 40 hours worked) | | | |
| Previously unemployed | | | |
| Previously underemployed | | | |
| # of net new jobs created | | | |
| Submission of grant close-out report | | | |
| Approval of grant funds and confirmation of receipt | | | |
| Industry code | | | |
| Occupation codes | | | |
| Workforce partner? | | | |
| Additional benefits offered? | | | |
| Owner race, ethnicity, gender | | | |
| Employee race, ethnicity, gender | | | |