The 2009 Michigan Educational Assessment Program results are in, and schools from Benton Harbor to Midland to Ludington to Detroit are celebrating. For the fifth consecutive year, test scores were up on the whole. That appears to be good news; good enough, anyway, to briefly overshadow the few dozen districts going bankrupt and Detroit's recent record-setting ineptitude on a national standardized test. The only drawback to these MEAP results is that they don't really tell us how much kids are learning.

This shortcoming of the MEAP is not the fault of individual schools or districts. The MEAP is mandated by the state. Schools have no choice but to follow the state's directions, take four days away from regularly scheduled classes, sharpen their No. 2 pencils, and administer the test. The MEAP's failure as a reliable source for measuring student performance is due to the inability of both federal and state governments to institute meaningful accountability standards.

On the federal side, the No Child Left Behind program created perverse incentives for the states agreeing to carry out its standards. The strategy was to set high bars for schools to meet, inject money into the lowest-performing schools, and then have states hold failing schools accountable. As public choice theory would have predicted, some states realized that it would be easier to make more schools meet the NCLB standards than it would be to actually deal with school failures.

Michigan is one of those states. The MEAP test is the state's primary NCLB yardstick, and since NCLB went into effect, MEAP scores have increased at remarkable rates. When students in Michigan are compared to students in other states via national standardized tests, however, the results are very different.

A National Center for Education Statistics study last year compared state standardized test scores to those of the most consistent national standardized test, the National Assessment of Educational Progress. The study mapped each state's "proficient" cut score onto the NAEP scale to see how demanding it really was. Michigan's definition of "proficient" ranked near the bottom in every subject tested: 44th and 46th out of 48 states in fourth-grade reading and math, and 35th and 37th in eighth-grade reading and math, respectively.

Other evidence suggesting the MEAP is a poor measure of actual student achievement is the fact that while MEAP scores were soaring to new heights, average scores on other standardized tests didn't budge. The average ACT score in Michigan from 2004 to 2007 (years in which a similar number of students took the ACT) remained constant. Similarly, from 2002 to 2008, the average SAT score slightly decreased in Michigan. Likewise, scores on the NAEP haven't changed much over the last decade either. Don't forget that all the while, graduation rates remained at about 75 percent.

Yet the MEAP says that from 1999 to 2009, the percentage of "satisfactory" or "proficient" fourth-grade reading scores went from 60 percent to 84 percent. The percentage of seventh-graders who were proficient in math went from 63 to 82 percent. In science, 81 percent of fifth-graders were proficient in 2009 compared to 37.5 percent in 1999. Either the MEAP is picking up on an academic miracle (the likes of which would be unprecedented in government-run schooling) that these other tests are missing, or we're witnessing what happens when cut scores and designations of satisfactory are changed.

Further complicating matters, the labels used by the Michigan Department of Education to describe satisfactory scores have varied widely over the last decade. From 2007 to 2009, the MEAP called students either "advanced," "proficient," "partially proficient" or "not proficient." But from 2002 to 2006, student scores were categorized as "exceeded," "met," "basic" or "apprentice" (except in 2003, when the lowest level in some subjects was labeled "not endorsed," and 2002, when writing was deemed either "proficient" or "not yet proficient"). From 1999 to 2001, student scores were just "satisfactory," "moderate" or "low." These shifting labels make it almost impossible for parents to know how much their kids are actually learning.

The MEAP is such a poor measure of student performance that many schools have opted to use independent tests to gauge the effectiveness of their academic programs. The only thing MEAP scores could possibly tell parents is how their school ranks compared to others, but even that is essentially meaningless, since most parents have little choice over which school their children attend.

Ill-conceived federal standards give states like Michigan an incentive to see to it that their schools appear to be constantly improving, while at the same time the MEAP fails at telling parents how much their kids are actually learning.

#####

Michael Van Beek is director of education policy at the Mackinac Center for Public Policy, a research and educational institute headquartered in Midland, Mich. Permission to reprint in whole or in part is hereby granted, provided that the author and the Center are properly cited.