In a December commentary I wrote about how Michigan Economic Development Corp. officials appeared to be seeking “massaged” research from consultants who would provide evidence of the need for more investment in MEDC programs. Specifically, the agency seemed to be seeking affirmation of a specific dollar figure — $291.5 million — that the state should “invest” to lower the unemployment rate.
An astute reader steered me to a story out of Kentucky involving the MEDC’s winning bidder, a firm called AngelouEconomics. According to the article, AE was forced to give a refund to the city of Lexington after submitting a study with “parts [that] were lifted nearly verbatim from recommendations for other cities.”
One local businessman accused AE of “recycling” its previous work, and the firm was forced to apologize to Mayor Jim Gray.
Perhaps it’s not surprising then that the proposal AngelouEconomics submitted in response to an MEDC request for proposal looks like a recycled version of the MEDC’s own deeply flawed RFP. There’s no plagiarism, but the firm’s proposal does read like AE was prepared to provide research designed to generate conclusions conforming to those sought by the MEDC.
Pleasing a client is not necessarily bad business, but it could mean bad economics. The response from AE — especially relative to a losing bidder’s response — suggests that the MEDC went looking for a consultant to justify a dollar figure for its subsidy programs three times higher than what Gov. Rick Snyder and the Legislature appropriated. Compare the language of the MEDC’s request for proposal with AE’s response:
MEDC: Review, critique, and/or validate the MEDC’s method of defining an optimal level of investment in an incentive program. Make recommendations to adjust the proposed methodology.
AE: Evaluate Michigan’s present methodology for defining an optimal level of investment in an incentive program and offer recommendations for its improvement.
MEDC: Consider and propose alternative methods for identifying an optimal level of investment in incentive programs. Finalize a recommended level considering adjustments to the MEDC method or the recommended method of the contractor.
AE: Offer recommendations relating to the adjustment (if deemed necessary) to the level of investment contributed by the State in order to ensure that Michigan is competitive with the benchmark states. [Emphasis added.]
It’s no wonder that MEDC officials were so breathless in their review of the consultant’s offer. An Incentive Study Scoring sheet filled out by MEDC employees contains comments such as “on point; in agreement,” “gives us what we need,” and “spot on; these guys are in the bizness” [sic].
Contrast this cozy sense of accord with a competing proposal from Anderson Economic Group. An MEDC official marked down this proposal because Anderson “wants to modify and redirect.” In short, Anderson made it clear that the MEDC’s methodology was flawed.
The MEDC proposal included as an objective the desire to have a consultant “review, critique, and/or validate the MEDC’s method of defining an optimal level of investment in an incentive program. Make recommendations to adjust the proposed methodology.” [Emphasis added.]
But Anderson did exactly that, writing on page three that “Designing an effective tax incentive strategy requires more than ‘picking a number’ for the overall size of the incentive program.” Anderson’s assessment of the MEDC’s methodology continued:
After completion of this phase of the project, we expect to meet with MEDC representatives in order to propose one or more alternative methods for assessing the effectiveness of the state’s business incentive plan.
Anderson even stated its intention to question the steps outlined in the MEDC’s RFP in the “technical plan of work” section. (Full disclosure: Anderson has done research at the request of the Mackinac Center in the past.)
This may explain why the MEDC felt compelled to hire a Texas-based firm over a Michigan-based one with extensive experience in assessing this state’s corporate welfare regimes. Anderson, for example, published “Effectiveness of Michigan’s Key Business Tax Incentives” in March 2010. Nor is this the first time the MEDC has engaged consultants who produced questionable results using dubious methodology.
The MEDC paid a consultant in 2001 to measure the impact of state subsidies for “ubiquitous” broadband deployment. The report claimed that if government would only subsidize broadband deployment, Michigan’s economy would add 550,000 jobs and generate more than $440 billion in economic activity.
The author of the report confessed to using a unique and arbitrary methodology, but a broadband deployment program was adopted anyway — and later was acknowledged to be an unmitigated failure. Part of this story is detailed in the Diane Katz article “Should the State Boost Broadband?”
Incentives matter, especially the ones that lead to government empire-building at state agencies. Increasing the number and size of programs at the MEDC seems to be part of that department’s genetic code.
The Legislature’s response to this should not stop at viewing the forthcoming study with great skepticism. As part of their oversight responsibilities, legislators should also investigate the MEDC’s use of taxpayer money ($80,000 in this case) to create what are essentially marketing materials serving its own bureaucratic agenda rather than the best interests of Michigan’s people and economy.
Michael D. LaFaive is director of the Morey Fiscal Policy Initiative at the Mackinac Center for Public Policy, a research and educational institute headquartered in Midland, Mich. Permission to reprint in whole or in part is hereby granted, provided that the author and the Center are properly cited.