Spotlight

Growth Measures: Don’t Call ’em ‘Value Added’

by Brett Schaeffer
Gage Kingsbury, director of research at the Northwest Evaluation Association, takes great pains to explain that the Portland, Ore.-based nonprofit does not conduct value-added assessment.

What NWEA does, says Kingsbury, is create its own computer-based tests, which are administered to students every nine weeks and which measure a student’s academic growth.

So where value-added assessment is a process for measuring yearly student progress, NWEA crafts tests, called Measures of Academic Progress, that track academic gains and losses quarter to quarter.

"We know at the end of every quarter how much growth has occurred," says Gerrita Postlewait, superintendent of Horry County Schools in South Carolina, which started using NWEA’s tests systemwide last year following a pilot program the prior year. She also works with value-added pioneer William Sanders on analyzing her school district’s results on state and national tests, giving her an abundance of data.

The NWEA reporting, though, provides her with information more quickly than other standardized tests.

Because the Measures of Academic Progress are computerized, teachers can see results the day after a test is administered, says Kingsbury, who contends the average state test does not report results until four to six months after it is given.

"We can operate like a business and immediately address any problem areas," says Postlewait. "Our students then gain the benefit of a system that’s responsive to their needs."

Several years ago, Postlewait says, her district tried to develop its own assessment, something akin to NWEA's tests, but a lack of coherence and time thwarted the project. She turned to NWEA, at an annual cost of $6 per student for her 30,000-student district, because she believes the state assessment exams, which are conducted each spring, offer no help in making immediate adjustments; the official results are not released by the South Carolina Department of Education until the end of September.

A Leveling Effect

Formed in 1974 as a partnership between Portland-area school districts and the Seattle Public Schools, NWEA was incorporated as a nonprofit in 1977. It now works on a contractual basis with 1,300 districts in 40 states.

In addition to the rapid turnaround, NWEA's tests offer district leaders more detailed information than many other standardized tests, says Linda Clark, superintendent of Joint School District 2 in Meridian, Idaho.

"We were looking for a testing system that was testing the curriculum we were teaching," says Clark, who spent 10 years as the district’s director of student achievement before becoming superintendent this past July.

NWEA tests align with her district’s curriculum, providing a level of detailed student assessment previously unavailable, she says.

"The ITBS (the Iowa Test of Basic Skills), which we had for many years, did not have the power to change and inform district practices," she says. "A system that can measure growth by quartile and look at disaggregated data is extremely powerful."

The 28,000-student Meridian district has been using NWEA’s tests for nine years, starting with the organization’s paper-and-pencil versions.

"Virtually all of our elementary schools are leveling," says Clark, meaning students are grouped based on their skills as gauged by the detailed NWEA tests rather than by grade level. "A 9-year-old might not be in something called 4th grade. Instead, he may be taking middle school math or reading at a lower level," she says. By grouping students this way, Clark says, the district can maximize the abilities of instructional staff and provide students with an ongoing learning challenge.

The rest of Idaho's 126 public school districts, as well as the state's 14 charter districts and 15 private school districts, have followed Clark's lead. Three years ago, the Idaho State Department of Education contracted with NWEA to develop a new statewide exam, the Idaho Standards Achievement Test.

Measurement Flaws

Ultimately, NWEA's Kingsbury doesn't oppose value-added assessment models; he simply sees flaws in them.

One problem with value-added assessment, he says, is that its outcomes are only as good as the data used. "The NAEP tests are an example of a test that is very broadly used in measuring student performance. But the NAEP tests wouldn’t be good tests to look for an amount of value a school is adding to students [because] the test is not accurate at the student level," says Kingsbury.

Another issue, he says, is that fixating on gains can obscure the larger goal of getting students to an established standard. "A school can add a lot of value to a student’s achievement without getting that student to a standard."