
Value-Added Analysis in Instruction

One district’s experience in using its data to gauge the effectiveness of schools and teachers in promoting student growth by MICHAEL R. NICHOLSON AND JEFFREY R. BROWN

If you ask Kelly Maddox, a 4th-grade teacher in the Olentangy Local Schools in central Ohio, whether value-added analysis has made a difference in her classroom, you’ll hear a resounding “yes.” Of course, three years ago, when she first received value-added data about her students’ performance over the school year, she would have expressed a decidedly negative view to us, her colleagues in the central office. Maddox remembers feeling incredibly disappointed.





“I knew it hadn’t been a great year, but seeing the colors on the chart made it very real,” said Maddox, referring to growth information reflecting aggregated student learning in her classroom. “My initial reaction was to beat myself up and question my ability as a teacher.”

That’s no longer the case. The growth measured among her math students during the 2007-08 school year improved vastly. So how did she get from point A to point Z? To explain what changed, we must take a closer look at the measuring tool that jump-started the performance leap and how it benefited other grades in the Olentangy district.

Our Focal Point
Value-added data provide a viable alternative for gauging school effectiveness — one virtually free of the confounding effects of student demographics and other factors relating to student learning.

How is it different from other measuring methods? It concentrates on growth, rather than attainment. This strengthens the concept and measurement of school effectiveness. Olentangy has used value-added data for several years and learned through experience and study about its advantages and limitations as a school improvement tool.

Our use of value-added data and the simultaneous use of projection data has centered on the classroom in two major ways: using the data to prioritize support for classrooms and using the data to place students in classrooms.

Over the last few years, we’ve continued to view the classroom as the focal point for our improvement efforts. Our district improvement plan emphasizes improving teacher practices related to curriculum, assessment and instruction; it does not focus on technology, textbooks or canned programs. Our use of value-added data must have an impact on classroom practice, too.

Value-added measures do influence curriculum and instructional practices in the classroom, and growth data affect the practices for how we place students in classrooms in our school district.

Targeting Underserved
In the Olentangy School District, an affluent district north of Columbus, Ohio, the middle school math program experienced increasing scrutiny after the 2006-07 school year. For several consecutive years across all the middle schools, the math program was leading to growth considered “below” what was typical, as determined by value-added analysis.

Importantly, the inferences about the effectiveness of our middle school math program were not drawn solely from value-added data, but from triangulating these data patterns with other information. For instance, the achievement rankings in math at each grade level were at or near the bottom compared with math rankings in similar school districts, another important source for estimating program effectiveness.

In addition, a disproportionate number of underserved students were assigned to the “lower” track, and poor and minority students were under-represented, as well, in the advanced tracks. The evidence mounted that something wasn’t right about the district’s middle school math program.

The state proficiency test results were equally unimpressive, and they didn’t offer much help in targeting where to focus improvement efforts. Grade-level proficiency test results are cumulative, so low achievement scores at one grade may be the result of ineffective instruction at that grade level, ineffective instruction at any of the preceding grade levels, less-than-supportive factors outside of school, or a combination of these factors.

Value-added data, on the other hand, isolate the attribution of success or “improvement opportunity” just to the reported grade-level because previous grade levels’ contributions to learning and the influence of external factors already have been accounted for in the students’ prior test history. We knew our students weren’t growing enough in the middle school grades. We had to find out how to fix the situation.
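For readers who want to see the logic in miniature, the sketch below is a deliberately simplified illustration, not the actual statistical model behind our reports: it predicts each student’s current score from that student’s own prior test history and then averages the differences between actual and predicted scores across a group of students. All numbers, field names and the prediction rule are invented for illustration.

```python
# Simplified illustration of a value-added estimate; the production model is
# far more sophisticated. Core idea: compare each student's actual score with
# a score predicted from that student's own prior test history, then average
# the differences for a classroom or grade level.

import statistics

TYPICAL_GAIN = 10  # assumed average one-year gain in scaled-score points

def predict_score(prior_scores):
    """Hypothetical predictor: expected current-year score given prior scores."""
    return statistics.mean(prior_scores) + TYPICAL_GAIN

def value_added(students):
    """Average of (actual - predicted) across a group of students.
    Positive means growth above what prior history predicted; negative, below."""
    residuals = [s["actual"] - predict_score(s["prior"]) for s in students]
    return statistics.mean(residuals)

# Example: a small classroom whose students grew roughly as predicted.
classroom = [
    {"prior": [410, 425], "actual": 430},
    {"prior": [440, 450], "actual": 452},
    {"prior": [395, 405], "actual": 412},
]
print(round(value_added(classroom), 1))  # 0.5 scaled-score points above prediction
```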

With a growth-oriented perspective on the problem, we started by asking “How adequate is the grade-level opportunity for stretching and growing students?” rather than “How adequate is the grade-level opportunity for remediating students?” We believed the first step in solving this problem was to frame it well, because framing itself can affect the solution we sought.

We identified two opportunities to facilitate stretch and growth in our students’ learning. The first was seen in the process that places students into the higher-end math courses. We hypothesized that too many students were sitting in regular or lower-level courses who warranted participation in the more challenging classes. In other words, we suspected unintentional or intentional gatekeeping practices were keeping students out of the advanced courses where they would learn more. To investigate the decision-making process for course placement, we needed the projection data — value-added assessment’s “sister” set of data.

Projecting Performance
The second opportunity to facilitate the stretch and growth in student learning is related to a curriculum issue and student placement decisions for higher-level coursework.

Value-added assessment functions as the “rearview mirror,” the historical application of student growth data to measure the effectiveness of past schooling practices. Projection data are the “telescope,” or the forward application of student growth data for estimating student achievement in the future.

To calculate these projections, measured quantities of past typical growth for grade levels and subject areas are applied to incoming students’ standardized test scores to estimate the achievement we can expect by the end of the year. These projections are generated after considering all the student test data available in a child’s longitudinal test record. These are by no means single data-point decisions.
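Here is a minimal sketch of a projection under simplifying assumptions. The real projections blend every score in a student’s longitudinal record through a statistical model; this illustration simply averages the prior scores and adds an assumed typical gain for the target grade and subject. The growth figures, function names and scores are hypothetical.

```python
# A minimal sketch of a projection, under simplifying assumptions.
# All values below are invented for illustration.

TYPICAL_GROWTH = {           # hypothetical average one-year gains (scaled-score points)
    ("math", 7): 12.0,
    ("math", 8): 9.0,
}

def project_score(prior_scores, subject, target_grade):
    """Estimate an end-of-year score from all available prior scores
    plus the typical growth measured for that grade and subject."""
    baseline = sum(prior_scores) / len(prior_scores)
    return baseline + TYPICAL_GROWTH[(subject, target_grade)]

# A student entering 7th-grade math with a three-year test history:
print(project_score([412, 428, 441], "math", 7))   # -> 439.0
```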

When student projections are rank-ordered side-by-side with student course placements, typical patterns emerge. The first is a general trend of students with higher projections clustering in the higher-end courses. Fortunately, our analysis indicated most administrative decisions for student course placement aren’t totally off base. However, for some students, we also saw discrepancies.

Moving down the list from higher to lower projections, a non-trivial number of students fell “out of order”: their projection values placed them among the students enrolled in higher-end courses, but they were actually enrolled in a less rigorous class. Here’s where we had our “aha” moment.

It was this inconsistent alignment of course assignment to projection data that opened up conversations about why many students were not placed in the higher-end math courses. After we applied these data to course placement decisions, enrollments in advanced math courses roughly doubled in grades 5, 6 and 7 during the following year.
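The rank-ordering check described above is simple enough to sketch. In the illustration below, with invented names and projection values, a student is flagged when his or her projection falls at or above the lowest projection among students already enrolled in the higher-end course, yet the student sits in a less rigorous class.

```python
# Illustrative "out of order" check, with invented roster data: find the
# lowest projection among students already placed in the higher-end course,
# then flag anyone at or above that mark who sits in a less rigorous class.

def flag_out_of_order(students, advanced_course="pre-algebra"):
    advanced = [s for s in students if s["course"] == advanced_course]
    if not advanced:
        return []
    cutoff = min(s["projection"] for s in advanced)
    return [s["name"] for s in students
            if s["course"] != advanced_course and s["projection"] >= cutoff]

roster = [
    {"name": "Student A", "projection": 462, "course": "pre-algebra"},
    {"name": "Student B", "projection": 455, "course": "math 7"},   # flagged
    {"name": "Student C", "projection": 448, "course": "pre-algebra"},
    {"name": "Student D", "projection": 430, "course": "math 7"},
]
print(flag_out_of_order(roster))   # ['Student B']
```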

The use of growth data catalyzed deep discussions about student placement decisions and raised questions about the necessity of some lower-tracked courses in 7th grade.

Comparing Curricula
After observing how many students moved from the 7th-grade lower-track math course to pre-algebra based on the projection data, administrators and lead math teachers questioned the necessity of the lower-level class.

When we compared the curricula of both courses, we noted considerable content overlap. This led to our discovery that the lower course caused an unnecessary 180-day delay in student learning. This, in turn, equated to a cap on student growth. In essence, by having the equivalent of the same course stretched over two years, students on this track may have grown only half-a-year during each of those school years. Consequently, we eliminated the 7th-grade base math course, a step that accelerated opportunities for student learning.

A neighboring school district’s administrators also assumed that lower-achieving students needed the curriculum slowed down for them. Understandably, these professionals were perplexed that the grade levels over which a one-year math curriculum had been spread, in an effort to slow things down for lower-achieving students, showed very low value-added growth. When they changed their approach, maintaining high curriculum standards and instituting more in-school supports, their value-added measurements over these grade levels took an upswing.

This upward trend also was evident in Olentangy. The results after the first year of implementing projection-data-guided course placements and the decision to make pre-algebra the base math course in 7th grade were affirming. In fact, the value-added growth and proficiency rates in middle school math were the best they’d ever been. Value-added data not only highlighted the areas for improvement, but framed the ideas for the correction, too — growth and stretch versus attainment and remediation.

It’s important to note that these data provide once-a-year strategic information, not day-to-day instructional data. Math improvement in the two school districts didn’t occur because teachers gleaned regular instructional information from value-added data. Day-to-day formative assessment data remain the best source of information for guiding ongoing instructional decisions.

Teacher Reactions
Before considering how value-added measures can help at the individual classroom level, you need to keep in mind several points about the limitations of value-added data.

First, value-added analysis can only provide estimates of effectiveness for about half the teachers — mainly core teachers in grades 3 through high school.

Second, the information about these teachers can only distinguish the extremes in performance. Our school district’s most recent teacher-level reports (n=913) bear this out: 16 percent of the grade-level/subject-area classrooms facilitated “above” typical growth, 74 percent facilitated typical growth, and 10 percent led to “below” typical growth.

Consequently, given the various performance designations, teachers reacted to the data findings in varying ways. This was understandable. Feedback about professional performance is never easy to receive, and the context and manner in which it is delivered matter a lot. How many of us feel totally at ease when receiving professional judgments about our work?

As important as this information is, we think school leaders need to consciously and purposefully develop an environment that emphasizes the continuous improvement opportunities that come with this information. And this can’t be an implied message.

A Productive Response
This brings us back to 4th-grade teacher Kelly Maddox, who is responsible for instruction in the four core subject areas — math, reading, science and social studies. She received her first classroom-level value-added report back in 2006, which reflected how her students grew during the 2005-06 school year. She wasn’t exactly thrilled with what it showed. And her reaction to her first individual value-added report is understandable, given the personalized context of the report.



Maddox’s school leadership facilitated and supported collaboration and conversation among the staff about student performance. This created a positive, nurturing environment that encouraged her to share the findings with others and ask questions.

“Eventually the data caused me to reflect on my practices,” she said. “There had to be something I could build on to improve my strategies. I found myself having in-depth conversations with colleagues. Not just my teammates, but Title I teachers, literacy support teachers, intervention teachers and administrators.”

The outcome of these conversations, Maddox added, “has been the backbone of my practices for the last three years, allowing me to reflect on my value-added data each year.”

Her determination to improve and her professionalism certainly contributed to her productive response to the data, too.

Once Maddox started to use the data to reflect on her practice, the doors to professional growth swung wide open. She changed her core classroom processes of curriculum, assessment and instruction, as intended by the district improvement strategies. During 2007-08, she instituted several changes to improve on the math growth she had facilitated the previous school year. The changes included these:

•  developed and introduced “I Can” statements for each unit to make the learning more transparent for her and the students (fostering a sense of student ownership over the curriculum);

•  began using the gradual-release model as often as possible (instruction through modeling); and 

•  developed and introduced exit tickets for each section of every unit (assessment tool to identify learning successes and gaps).

The growth measured for her math students during the 2007-08 school year affirmed the strategies she implemented. Maddox’s decision to try those strategies was based on her study of instructional principles that are supported by research. She chose strategies that represented those principles that fit into the context of her classroom and style.

All of this takes professional reflection, which includes a certain amount of risk taking. As school system leaders, we must value such professionalism and do more to overtly support it.

Intensely Personal
It’s now been a several-year journey for Maddox and other teachers in Olentangy toward growing comfortable with classroom value-added assessment. We haven’t yet arrived at the destination, and we have stumbled at times during districtwide implementation.

By not emphasizing the professional development use of this information, we risked letting the communication void be filled with misinformation. By being too muted about its continuous improvement use, we risked letting the data’s use in professional evaluation come to the fore. It may be within the evaluation context that defenses go up and windows to professional growth come down. It’s a fine line.

As you can imagine, this is an intensely personal journey for teachers. They need support structures in place to reflect upon their data. Support can come from many sources, including administrators, colleagues or instructional coaches. It’s essential, at the outset, for teachers to have professional training on interpreting data to minimize the misconceptions and over-interpretations. Setting up the structures to support dialogue about classroom practice is a step administrators can take as school districts embark on this journey.

Understand that this is an opportunity for individualizing support and professional development. From an administrative perspective, it helps to prioritize improvement efforts to a specific group of teachers. This focuses the allocation of resources to maximize teacher growth.

In the end, value-added measures are data. But data alone do not provide answers. They only stimulate conversations for reflection on classroom and program practices. The power of the data is realized when teachers and education leaders use data to reflect, adapt and improve school practices to maximize the learning growth of students.

Michael Nicholson is executive director for secondary learning in the Olentangy Public Schools in Lewis Center, Ohio. E-mail: michael_nicholson@olentangy.k12.oh.us. Jeff Brown is the district’s executive director for elementary learning.