Students at Red Rock Elementary School in Jeffers, Minn., are shown here completing the Measures of Academic Progress assessment. Known as a computer-adaptive test, the program raises the difficulty level of the questions as long as the student continues to answer correctly; wrong answers prompt it to pose easier questions.
A number of Michigan schools are going beyond the MEAP in search of information not only about what their students know, but when and how they learned it.
While the annual Michigan Educational Assessment Program is pivotal in determining if a school meets academic benchmarks, and while the state is rolling out a new program intended to help schools use MEAP results wisely, some educators want more data before making decisions about instruction.
It's an approach that Robert Theaker used as a quality manager in private industry.
"If I only look at data for a single moment in time, it can increase the probability to make a bad decision," said Theaker, senior manager of assessment for National Heritage Academies, a network of 57 charter public schools in six states, including 36 in Michigan.
Transferring that maxim to education, Theaker said NHA schools want to use more than MEAP results to guide decisions about curriculum and teaching practices. Consequently, NHA schools measure student progress in at least three ways: MEAP tests, teachers' classroom assessments and the Measures of Academic Progress, an assessment purchased through the nonprofit Northwest Evaluation Association. Each measures academic skills in a different way.
"If a school only relies on one piece of data, they can come to an erroneous conclusion," Theaker said. "You can end up going in circles."
"Clearly, they are looking for more information to inform their overall instruction," said Ginger Hopkins, vice president of partner relations for the Oregon-based NWEA, speaking not only of NHA, but the 100-plus other schools and programs in Michigan that buy the association's services.
NWEA and other organizations now offer assessments that go beyond a "snapshot" look at a student's current ability into measuring how much academic growth a student made in a given academic year, and how much growth to expect of that student in the next year.
Meanwhile, the Michigan Department of Education wants schools to make more and better use of data, too. The department and the Center for Educational Performance and Information are rolling out an online program — accompanied by professional development — to help educators dig deeper into MEAP scores and pinpoint strengths and weaknesses.
Differences in assessment testing can get complex, but in general the MEAP test measures how well students have learned a body of knowledge predetermined by the state for each grade level. If enough students "pass," the school is said to have made adequate yearly progress under the federal No Child Left Behind Act.
MEAP results can be grouped in different ways — by school, by district, by ethnicity — and the results can be compared to other schools and districts, as well as to past results and to the state average. But Michigan scores can't be compared with other states, since each state develops its own assessment test and sets its own "proficiency" levels.
The MAP test also is based on a body of knowledge, which NWEA calls a learning continuum, but the continuum is not divided by grade levels. Rather, NWEA essentially lines up all the academic material in a given subject from easiest to most difficult. In math, for example, simple counting is at the beginning of the "number sense" continuum and writing 162 percent as a fraction is at the highest level.
Students take a computer-adaptive test to determine where they stand on the continuum. The test responds to the student's answer by adjusting the level of difficulty. If a student answers correctly, the next question will be a little more difficult. If he or she answers incorrectly, the next question will be a little easier. The program adjusts the difficulty level up and down until it can "place" the student in a given range on the continuum.
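The up-and-down adjustment described above can be illustrated in code. The sketch below is a simplified, hypothetical version of such an adaptive loop — the names, scale and step sizes are invented for illustration and are not NWEA's actual algorithm: each correct answer raises the difficulty of the next item, each wrong answer lowers it, and the adjustment shrinks until the student is "placed" at a level.

```python
# Hypothetical sketch of a computer-adaptive placement loop. Difficulty
# rises after a correct answer and falls after a wrong one; the step size
# is halved each round until the estimate settles.

def adaptive_placement(answers_correctly, start=100, step=32, min_step=1):
    """Place a student on a made-up difficulty scale.

    answers_correctly: callable that takes a difficulty level and returns
    True if the student answers an item of that difficulty correctly.
    """
    level = start
    while step >= min_step:
        if answers_correctly(level):
            level += step   # correct: next question a little harder
        else:
            level -= step   # wrong: next question a little easier
        step //= 2          # narrow the search range each round
    return level

# Example: simulate a student whose true ability is 150 on this scale.
print(adaptive_placement(lambda d: d <= 150))  # prints 151
```

The loop is essentially a binary search over the difficulty scale, which is why a relatively short test can place students across a wide range of ability levels.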
One advantage of the MAP is that it doesn't stop questioning students until they've reached their limit, whether that limit is higher or lower than expected for their grade, said Mark Esper, director of curriculum and instruction for Grand Traverse Area Catholic Schools. Diocesan schools administer the MAP twice a year to second- through tenth-graders.
"We want to be able to identify student instructional levels," he said. That allows teachers to match instruction to the student's level and to group students at similar levels.
"If you've got a fifth-grader working at second-grade level, they're not going to progress if you keep giving them fifth-grade work," Hopkins said.
The Grand Traverse Area Catholic Schools, those of the Diocese of Gaylord, and NHA schools also use MAP results to track students' academic growth over time, by comparing fall-to-spring or year-to-year scores, the same way that businesses compare quarterly profits or year-over-year results.
"We can see how they've grown relative to that (MAP) scale," Esper said, and also relative to other students who took the assessment nationwide.
But MAP doesn't show just past growth, according to Hopkins. It also projects how much a student should be able to gain in a given year, based on national growth averages. That projection can be adjusted to take into consideration things like income level and past academic performance.
When students don't reach their growth target — or when they surpass it — schools look for ways to adjust curriculum or teaching practices accordingly.
"If we see great growth in an area, we want to ask teachers what they did to obtain that growth," Esper said. "The question we use is this: You're being very successful. Can you identify the reason? We work hard to create a culture in which teachers share their best practices to help improve results for all."
Another way schools use MAP results is to predict how well their students will do on the MEAP. The association has done a comparison of its own learning continuum with Michigan's grade-level content expectations, Hopkins said, allowing the NWEA to project that students who score in a given range on the MAP are likely — or unlikely — to be proficient on the MEAP.
The average cost to administer the MAP is $13.50 per student, according to the NWEA.
DATA FOR STUDENT SUCCESS
The Michigan Department of Education wants to make it easier for school districts to use the data they already have at hand.
At a state board of education meeting in October, Mary Gehrig of the Calhoun County Intermediate School District profiled the "Data for Student Success" computer program, which allows teachers and administrators to analyze MEAP data in detail.
As Michigan Education Report reported in April, Data for Student Success allows teachers or administrators to view several years of MEAP results simultaneously, and zero in immediately on details like: how many students answered a given question correctly, how students did on this year's test compared to last year, which concepts or skills are trouble spots for the most students, and whether there are significantly different results between boys, girls, low-income students, or minority groups.
Already introduced in the Upper Peninsula, the program and accompanying professional development should reach every Michigan school district by the end of 2010.
The point is to use the analysis to set school improvement goals, clarify and address building-level problems and ultimately bring about instructional change, Gehrig said.
"Our premise is that we have to get down to the classroom level in order to improve student achievement," she said.
That's what happened in Barry County's Delton-Kellogg Schools, where an academic audit showed consistent declines in student proficiency, Superintendent Cindy Vujea told the state board.
"The data was quite startling," Vujea said. The findings led to a number of changes, including a K-12 focus on writing improvement, establishment of literacy teams, and adoption of new reading, math and science programs at the elementary level.
"You would think people were probably asking to retire left and right," Vujea said, giving the stress of implementing three new elementary curricula. "Some are. But the vast majority is saying thank you."
Students at National Heritage take the MAP test within the first two weeks of school, and teachers receive a list of growth targets for their students, Theaker said. Outcome data for the students a teacher instructs throughout the year is incorporated into that teacher's evaluation.
"It's part of our (NHA) culture that part of our evaluation is outcome data," he said.
In Grand Traverse Catholic schools, growth data plays a role in setting goals, but not in determining teacher compensation.
"While important, good results on external assessments are just one part of a complex set of factors that contribute to successful teaching and learning," Esper said. But the diocese does use the data to set goals for teachers to grow and to provide support in reaching those targets.
"We're in the growth business," he said. "Just as we expect students to grow, we expect teachers to grow."
The idea of using growth data as part of a teacher's evaluation is contentious, with critics saying that there are unresolved technical issues in attributing "growth" to individual teachers.
California has banned such use of assessment results, but Tennessee allows it if the data meets specific requirements. New York is piloting a program in which some teachers will receive effectiveness reports, but only to help them improve. The Mackinac Center for Public Policy, which publishes Michigan Education Report, has proposed a pilot merit pay project in Michigan in which participating teachers would receive bonuses based, in part, on student academic growth data.
"I'm not going to say the (MAP) test is perfect," Theaker said. "It's that some tests are more useful than others."
MEANS, NOT ENDS
Valuable as assessment data can be, it's only a means to an end — improving student achievement, educators said.
"Here's the thing people have to remember: It's just the tool," Esper said. "Getting teachers to value the tool and use the data is an important first step."
Teachers routinely use their own observation and classroom assessments to measure progress and make instructional decisions, Theaker said, but, like a check-and-balance system, outside assessments tend to verify what is working and point out what is not. The system lets teachers correlate the data to the practices that brought about the best results, then leads to discussion on how to implement those best practices.
"Teachers see it for themselves," he said.
In Delton-Kellogg Schools, teachers now have common planning periods every day to discuss data-based instructional improvement, Vujea said.
Students might like to join that discussion, Hopkins said.
Some teachers share growth targets and MAP results with individual students and point out what they need to tackle next in order to advance on the continuum, she told Michigan Education Report.
"I see high school kids that are blogging about what they need to be working on."
Lorie Shane is the managing editor of the Michigan Education Report, the Mackinac Center’s education policy journal. Permission to reprint in whole or in part is hereby granted, provided that Michigan Education Report is properly cited.