Feature

The Revelations of Value-Added

An assessment model that measures student growth in ways that NCLB fails to do

By Ted Hershberg, Virginia Adams Simon and Barbara Lea-Kruger

In the No Child Left Behind era of high-stakes testing, school administrators are facing their toughest challenge ever. They are being held accountable for the performance of their schools, yet current systems in public education typically fail to provide them with the appropriate tools to manage effectively.

Although classrooms are where learning takes place, superintendents and principals often know precious little about what is happening within them. As never before, administrators need the means to measure and evaluate the impact of curricula, new practices and professional development on academic achievement. For unless the quality of classroom instruction and programs can be improved significantly, students will be unlikely to meet the high standards now required.

Fortunately, significant help is available in the form of a relatively new tool known as value-added assessment. Because the value-added approach isolates the impact of instruction on student learning, it provides detailed information at the classroom level. Its rich diagnostic data can be used to improve teaching and student learning, and it can be the basis for much-needed improvement in the calculation of adequate yearly progress. In time, once teachers and administrators grow comfortable with its fairness, value-added also may serve as the foundation for an accountability system at the level of individual educators.

Identifying Contributors

Used by a growing number of states, value-added assessment provides a new way to measure teaching and learning. Value-added takes the annual test scores already being collected for students and analyzes them to reveal the progress students make each year. In its focus on growth rather than solely on levels of absolute achievement, value-added broadens our understanding of the contribution instruction makes to student learning. While family income remains the best predictor of absolute achievement, good instruction is 10 to 20 times more powerful a predictor of student growth.

By following individual students over time, value-added accounts for student background characteristics over which schools have no control and that tend to bias test results. And in what is perhaps its most distinctive contribution, value-added enables educators and the public to identify not only the progress made by students but also the extent to which individual teachers, schools and districts have contributed to it.

Under the value-added approach, test scores are projected for students and then compared to the scores they actually achieve at the end of the school year. Classroom scores that are mostly above projected values suggest that the instruction has been highly effective; scores that are mostly below projections suggest that it has been ineffective.

At the same time, the approach considers student-related factors, such as the pattern of prior test scores, both those of the individual student and those of other students in the same class. If a student’s present performance falls below the projected score while classmates with comparable academic histories have done well, that is evidence of a student effect: external variables, such as the home environment, that lie beyond the influence of teachers and schools.
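To make the mechanics concrete, what follows is a deliberately simplified sketch, in Python, of the projected-versus-actual logic described above. The real Tennessee system and its successors rely on far more sophisticated mixed-model statistics; the students, classrooms and scale scores below are entirely hypothetical.

```python
# A deliberately simplified illustration of projected-vs-actual value-added.
# Real systems use layered mixed-effects models; all data here are invented.
from statistics import mean

# (student, classroom, prior_score, actual_score) -- hypothetical scale scores
records = [
    ("A", "Room 101", 480, 500), ("B", "Room 101", 510, 540),
    ("C", "Room 101", 620, 650), ("D", "Room 102", 485, 490),
    ("E", "Room 102", 515, 520), ("F", "Room 102", 615, 622),
]

# Project each student's score as prior score plus the average gain of the
# whole cohort (a stand-in for "students with comparable academic histories").
cohort_gain = mean(actual - prior for _, _, prior, actual in records)
projections = {s: prior + cohort_gain for s, _, prior, _ in records}

# Classroom effect: the average of (actual - projected) across a room.
# A positive mean suggests instruction added more than a typical year's growth.
rooms = {}
for student, room, prior, actual in records:
    rooms.setdefault(room, []).append(actual - projections[student])

for room, residuals in rooms.items():
    print(f"{room}: mean value added = {mean(residuals):+.1f} scale score points")
```

In this toy version, a classroom whose students consistently beat their projections shows a positive mean, while an individual student who falls well below projection in an otherwise positive classroom points toward a student effect rather than an instructional one.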

Value-added gives educators two patterns that characterize instruction in their classrooms: which students have been the focus of instruction (previously low, average or high achieving) and how effective that instruction has been in providing students with a year’s worth of growth from wherever they started in September.

Armed with these data, teachers can meet regularly to discuss how to change their instructional patterns in desired directions. The ensuing conversations end the isolation of teachers and teaching, and when coupled with strong instructional leadership to guide educators through a process of study, they can transform schools into true learning communities.

While adequate yearly progress measures offer only a snapshot of performance from year to year and no information about where strengths and weaknesses lie, value-added methodologies, with their focus on individual students followed longitudinally, are helping educators close the achievement gap, manage and evaluate innovations and improve teaching and learning. The following examples illustrate how educators in three states have used value-added to transform instructional practice and raise student achievement.

Addressable Issues

Value-added assessment has been in place in Tennessee since the early 1990s, when it was created by William Sanders, then a statistician at the University of Tennessee. Schools use the Tennessee Value-Added Assessment System to identify where their achievement gaps exist and address student needs more expeditiously. One example is Maryville Middle School in Maryville, Tenn., whose principal, Joel Giffin, used value-added data to improve the school’s academic program and help underperforming students make significant gains.

Giffin, now in his 33rd year at the school, credits the analysis and disaggregation of value-added data with revealing that 7th grade math performance was not what it should have been for 20 of the lowest-achieving students. The statewide benchmark was a gain of 15 scale score points, while these students gained only 12.5. (Scale scores are the common unit of measure on a standardized test.)

Giffin and his staff identified five characteristics shared by these students: 1) low socioeconomic status; 2) one-parent families; 3) lack of adults at home at the end of the school day; 4) lack of money for school supplies; and 5) lack of help or positive reinforcement with homework.

“While we recognized that we could not change one through three,” Giffin says of the five factors, “we felt we could change four and five. Four was simple. Money was donated once people understood the need. Five was harder because it meant we had to do things differently.”

The principal’s mission, then, was to change the school to fit the needs of the students. “We created during school time the same support for these students as we would our own kids. We added a daily math class to help them complete homework, provided tutoring, gave encouragement and praise and made sure that they went to class prepared and feeling confident,” he says.

The results were outstanding. These students’ value-added gains were 360 percent of the national norm, against a state benchmark of 100 percent of the national norm. This was the greatest gain ever made by any group of students at Maryville Middle School.
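One rough way to read these percentages, assuming “percent of the national norm” is simply the ratio of the observed mean gain to the normed expected gain (our assumption for illustration, not a definition from the Tennessee system), and using the 15-scale-score-point benchmark cited earlier:

\[
\text{percent of norm} = \frac{\text{observed mean gain}}{\text{expected gain}} \times 100\%,
\qquad
360\% \;\Rightarrow\; 3.6 \times 15 \approx 54 \text{ scale score points gained.}
\]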

Indeed, through Giffin’s use of value-added and the myriad reforms he implemented, Maryville Middle School now ranks as Tennessee’s top-scoring middle school on value-added assessment. The school’s 10-year schoolwide value-added score is 144 percent of the national norm.

Comparing Programs

In Ohio, a value-added pilot project was started two years ago with funding from the Battelle for Kids Foundation. Seventy-eight school districts now are using value-added assessment to better manage program innovations.

James Mahoney, a former school superintendent who leads the pilot, says value-added assessment has been used in multiple ways by the participating districts. (See related story, page xx)

In one school district, in Westerville, Ohio, value-added measures showed that two elementary schools were consistently outperforming the others in 4th grade science. The superintendent, George Tombaugh, visited the two schools to talk to the 4th grade teachers and discovered that because none of these teachers was particularly strong in science, they had decided that each would specialize in one segment of the curriculum. The teacher with the greater expertise in a segment then taught that segment to all the 4th grade students. Tombaugh now is exploring the possibility of replicating this practice districtwide.

Another Ohio district, Riverview Local Schools, piloted a new 4th and 5th grade math program in two of its elementary schools. The district used value-added measures to compare the growth of students in the new program with the growth of students in the old program to determine which was more successful.

A third Ohio district, Miami East, was interested in alternative programs for its elementary schools and investigated programs that had worked for students with similar characteristics. The district used the value-added “school search” feature to find similar schools making high levels of progress. When it discovered its schools were already matching that progress, the district decided to keep its current program.

Measuring Impact

Sharon Kirk, superintendent of the 4,500-student DuBois Area School District in DuBois, Pa., described her district’s advantageous use of value-added in testimony before the state Senate Education Committee: “The best part of value-added is its measure of teacher and school effectiveness and the resulting potential for increased student achievement.”

The DuBois district was among the original 32 Pennsylvania districts to implement value-added in 2002. The state board of education quickly saw the benefit and voted to require its use in all 501 districts statewide by 2007.

Kirk says she has been impressed by the difference between value-added measures and No Child Left Behind’s adequate yearly progress measurement.

In her testimony to state legislators in March 2004, she used a simple analogy: “Imagine a busload of children arriving at a park to play baseball. The aim of the event is to teach every child to hit a triple. As the children arrive their individual abilities are very different. Some children can already hit a double while others are having difficulty even managing the steps on the bus. All the children make progress during this event. Many of the students who could hit a double now hit triples. Some of the students who had trouble with the steps on the bus have worked very hard and can now hit a double.

“How would you gauge the most successful students, or who made adequate progress? If hitting a triple is the measure of success, students with the least progress may be considered the only successful ones even though they have made little progress because they started so far ahead. Those children who worked hard and perhaps made the most progress will be judged as failures.”

Making this distinction is important to teachers, and it is one reason teachers in DuBois have come to trust value-added measures.

This trust helps when it comes time to measure the effectiveness of particular teaching practices or programs. Kirk says she and her staff developed a districtwide instructional improvement plan after reviewing two years of school-level data. The value-added assessment system provided each school with data about progress using proficiency levels and quintiles for math and reading.

In one DuBois school, these data revealed that two different cohorts of students had performed in an identical pattern in math two years in a row. “What that told us is that with two entirely different groups of students we were getting similar achievement. In both years we were failing our top students. Their scores were well above average and certainly proficient but their progress was not,” Kirk says.

This information provided the impetus for change. The principal of DuBois Middle School, Daniel Hawkins, minces no words in his praise of the new approach. “Value-added has been the No. 1 most positive motivator for my staff as well as for me. … It has been a catalyst in our schoolwide improvement program.”

The DuBois Area School District has eight elementary buildings, some highly successful and above the norm. Kirk believes the only way to reach 100 percent proficiency is to implement sound instructional practice, which can be greatly enriched by value-added assessment.

Likely Impact

Value-added assessment by itself does not improve student achievement. As these examples illustrate, only when educators understand its power and use what they learn to guide instruction and professional development will schools begin to see significant learning gains in their students.

More educators and policymakers are recognizing the benefit of value-added assessment. In addition to Ohio, Pennsylvania and Tennessee, new legislation in Arkansas and Minnesota calls for implementing a form of value-added measurement, and the state school boards associations in Iowa and New York are piloting a value-added program this year. Dallas and Seattle are the most prominent urban districts that use the value-added approach, along with several smaller districts in Colorado, Florida and North Carolina.

In coming years, value-added’s increasing popularity may result in three significant changes for public schools nationwide.

First, we will see the production of subject- and grade-specific professional development modules to help educators change their instructional practices to improve student learning. Parallel efforts using simulated data will be undertaken in the colleges and universities where prospective teachers are educated.

Second, with some projections showing that virtually all schools will fail to meet their NCLB targets within the next four to five years, significant efforts already are under way to amend how adequate yearly progress, or AYP, is calculated. Value-added could play a key part in this process by providing an alternative measure that identifies which schools failing AYP in the current year are on a trajectory of growth that would bring their students to proficiency by a later grade.

We call this alternative “Growth to Standards.” It should be politically viable because it does not abandon the federal government’s commitment to ensure that all students reach proficiency.
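A minimal sketch of how such a projection might work, assuming a school simply extends each student’s recent annual gain forward to a target grade. The Python below is illustrative only; the cut score, gains and grade levels are invented, and a real implementation would rest on far richer statistical models.

```python
# A minimal, hypothetical sketch of a "Growth to Standards" projection: extend
# a student's recent annual gain forward and ask whether the projected score
# reaches the proficiency cut score by a target grade. All numbers are invented.

def on_track(current_score: float, annual_gain: float,
             current_grade: int, target_grade: int,
             proficiency_cut: float) -> bool:
    """Return True if projecting the current growth rate forward reaches the
    proficiency cut score by the target grade."""
    years_remaining = target_grade - current_grade
    projected = current_score + annual_gain * years_remaining
    return projected >= proficiency_cut

# Example: a 5th grader scoring 620 who gains 28 scale score points a year,
# judged against a hypothetical cut score of 700 by the end of 8th grade.
print(on_track(620, 28, current_grade=5, target_grade=8, proficiency_cut=700))
# 620 + 3 * 28 = 704 >= 700, so this student is on track: prints True
```

Under this reading, a school that misses AYP in the current year but whose students are on a trajectory like the one above would be credited for growth rather than penalized for its starting point.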

Finally, as teachers and administrators come to appreciate its many strengths, value-added assessment will serve as the foundation for an accountability system at the level of individual educators. Studies of value-added models, such as one recently released by RAND, already are struggling with this question.

The RAND review, “Evaluating Value-Added Models for Teacher Accountability,” concluded that the effect of teachers on student learning is real, can be large and persists for years beyond the year in which it is first evident. While the authors cautioned against using value-added for high-stakes personnel decisions until further research is conducted, they felt value-added models “might actually provide less-biased and more precise assessments of teacher effects.” As long as “test-based accountability remains an instrument of education policy,” they recommend that value-added assessment be given serious consideration, even in light of its limitations for individual-level accountability.

Excellent Timing

The timing for the introduction of value-added could not be better. Although knowing that our schools are not worse than they were 20 years ago may provide comfort in the face of intense school bashing, it does not follow that our schools are good enough for the 21st century. An open society facing the twin challenges of terrorism and the fiercely competitive global economy of the Information Age requires citizens and workers who can use technology, think critically and solve problems. If America is to remain a democracy anchored by a strong and stable middle class, then our schools must be able to educate all our children, not merely the top 20 percent, to unprecedentedly high standards.

This will involve difficult change that necessitates overhauling the entire system rather than fixing a few parts. Although critics point to teachers’ unions or to poor instructional leadership from administrators or to government regulations, it is critical to understand that no single factor is to blame. In simplest terms, the nation’s economy has changed much faster than our schools. It is now time to move our schools to their next level of excellence. Value-added will play a key role in this effort.

Ted Hershberg is executive director of Operation Public Education and a professor of public policy and history at the University of Pennsylvania, 3701 Chestnut St., Suite 6E, Philadelphia, PA 19104. E-mail: tedhersh@pobox.upenn.edu. Virginia Adams Simon is the program’s associate director and Barbara Lea-Kruger is director of development and communications.