Feature

Watching the Game and Not Just Keeping Score

How classroom observations can improve instruction

by MARSHA ING AND KENNETH MONTGOMERY

Over the past decade, education has become more like baseball. Baseball tends to attract followers who enjoy dissecting endless reams of performance data because the sport is obsessed with counting things: the number of home runs, the hitter’s batting average, the pitcher’s earned run average and how often a team blows a lead after the seventh inning.

With such an abundant supply of data, it can be tempting to spend so much time analyzing the data that you forget to watch the game. However, if you have spent any time at a professional baseball game, you know even the objective data can be misleading. A batter who hits a home run against the opposing team’s worst pitcher in a 10-run loss has done something very different from a batter who hits one out for a 1-0 victory against the opponent’s best pitcher in the final inning.

The big league manager who watches the game knows this. The casual observer who dissects the data at the end of the season may not. This leads them to different conclusions about the strengths and weaknesses of the players and what is needed for improvement.

Class Visitations
With the recent explosion of data about schools, educators and policymakers risk making a similar mistake: collecting data that may not help administrators and teachers make useful decisions about classroom instruction.

In education, we collect endless data but pay far less attention to how teachers and principals make sense of all this information. We know, for example, how well students performed on a 3rd-grade math assessment, but we don’t usually know what happened in 3rd-grade classrooms that contributed to that performance. These data help us keep score of different schools but reveal nothing about what actually happened in those classrooms or what the best strategies for improvement might be.

Gathering information about what is happening in classrooms can take many forms. Student test scores receive much attention because they are a common measure that is relatively easy to collect. However, to learn about the quality of instruction, the most direct way to gather information is to observe what is happening in classrooms. We know principals may visit classrooms frequently, but the use of these observations to transform instruction varies widely among schools.

District administrators encourage principals to get out of their offices and into classrooms. Yet simply pushing principals into class visits doesn’t translate into gathering useful information about what is happening instructionally. How do principals use this information to improve teaching and learning? Do the assembled data help teachers and administrators make better judgments about classroom instruction?

We’ve gathered evidence in response to these questions from surveys, administrative databases, interviews and shadowing principals in two school districts. We find similar patterns across these districts, and we believe one district’s approach to collecting data on instruction through classroom observations is illustrative.

Three Protocols
Eight years ago, district administrators in the Milwaukee Public Schools shifted their reform strategy from a model based on school choice and autonomy to one focused on improving the quality of instruction across the system. They made this decision based on data about instructional practices and student outcomes that indicated the overall quality of instruction was substandard.

Although the district had many highly successful schools, when principal coaches visited classrooms, they did not consistently see what they would characterize as high-performing classrooms. They agreed that directly observing classrooms was one way to improve instructional opportunities, and they knew they needed to do more than encourage principals to sit in classrooms; they needed to create a common language for talking about instruction and to provide principals with tools and support for classroom observations.

In response, the school district implemented three learning-walk protocols: the Instructional Practice Inventory, the Characteristics of High-Performing Urban Classrooms and Measuring What Matters. The latter was used in five high schools that were trying a particular reform model, while the first two were applied across Milwaukee’s 184 public, charter, alternative and partnership schools.

The Instructional Practice Inventory required multiple people to walk through classrooms on multiple occasions for approximately three minutes per visit, with a minimum of 100 classroom visits per school. Observers checked the one of six categories that best reflected what they saw in the classroom: active, engaged learning; learning conversations; teacher-led instruction; student work with teacher engaged; student work without teacher support; or complete disengagement. The data were collected by principal coaches and given to an administrative assistant in the central office and the school’s principal. The data were not widely disseminated or analyzed.

In addition to the practice inventory, the district used a learning walk based on a document created by Milwaukee Public Schools and the Milwaukee Partnership Academy that promoted standards of effective instruction. The standards outlined in the Characteristics of High-Performing Urban Classrooms include: (1) active engagement of student learners; (2) strategic use of instructional choices; (3) routine use of a variety of assessments; (4) cultural responsiveness; (5) high expectations based on learning targets; (6) partnerships with families and communities; (7) collaboration with colleagues; and (8) impassioned, engaged adult learners.

The learning walks attempted to determine the level of implementation of the eight characteristics at a school site. These data were aggregated at the school level and remained at the school site.

These learning walks represented a concerted attempt by the school district to have principals observe instructional practice through a common lens. They constituted an important step in the observation process because they focused the principal’s attention on watching the teaching and learning game instead of merely keeping score.


Skeptical Discoveries
The learning walks helped build a culture in which it is common for teachers to have their practice observed by others. Structuring the learning-walk protocols and preparing and supporting principals to conduct these learning walks also reinforced what instruction should look like throughout the district. It sent a signal to principals that they would be accountable for knowing what instruction looks like in their schools. In the view of Milwaukee’s central administrators, classroom observations represented movement toward improving instruction.

Yet others, particularly teachers and principals, were skeptical about the representativeness of the data gathered through learning walks. Teachers expressed concerns that the observations did not provide accurate portrayals of instructional opportunities.

One teacher admitted her students could easily manipulate the situation whenever observers entered the classroom. She described a situation in which she received high marks for student engagement, but student work revealed that fewer than half of the students completed the assignment as intended. Her students understood the motivation for these classroom observations and told her, “I’ve got your back” or “Don’t worry about it,” whenever observers entered the classroom. The teacher reported that when observers were in the classroom, her students “sat at their desks like little angels. They answered all the questions. They suddenly, magically could tell you what we’re doing. They worked on the project and then as soon as the (learning) walk was over, they went back to doing whatever they were doing.” The teacher said she felt like a “fraud” because the observations did not represent what instruction looked like in her classroom.

Principals also reported that the data generated in individual classrooms by a learning walk were inaccurate when aggregated. Data from learning walks were not designed to identify a particular teacher. Instead, data were aggregated by school or by grade level. Principals did not record data if instruction was not occurring; if a teacher was dealing with a student who was tardy, for example, principals might come back another time. As a result, the aggregated data led to potentially inaccurate conclusions about student engagement.

Abundant Bad Data
Learning walks did not give teachers and principals the data necessary to improve instruction. One observer said she watched a teacher provide incorrect information about a mathematical concept but gave the teacher high marks for student engagement. The measures were designed to focus on engagement, not on the content being taught.

By creating a dichotomy between content and pedagogy, the learning walk provided misleading information about the overall effectiveness of the lesson. One principal remarked, “Even if we take a regular teacher in a classroom they teach, the students are engaged [but] engaged in what? Are they engaged in building an igloo or engaged in the subject? What are they engaged in? Are they working toward some instructional objectives or something else?”

The emphasis on engagement and not on content missed an opportunity to provide data that would help facilitate a discussion of teacher instructional practice.

The variety in learning walks created a situation in which principals and teachers may have had a considerable amount of data about instruction but little confidence in the quality of those data and little knowledge or skill to do anything with them. The learning walks have not been the integral part of the instructional improvement plan the district had hoped they would be. The observations provided general information about instructional practice but failed to generate data upon which the district or teachers could act.

Adding to concerns about the lack of good data was the large quantity of data collected. Principals and their leadership teams invested considerable time in collecting data through the learning walks, possibly at the expense of using the data. In a survey conducted by the Institute for Research on Education Policy & Practice, 79 percent of 137 participating principals indicated they conduct classroom observations almost every day, and 65 percent said they spent at least five minutes per classroom. Approximately 10 percent of the principals indicated they always follow up with teachers after classroom observations.

These findings are consistent with another recent survey conducted by Milwaukee Public Schools, where 84 percent of the teachers indicated at least one learning walk had occurred in their classroom, but only 56 percent reported any debriefing of the learning walk. Principals invest a large amount of time observing classrooms but less is known about how all of the data gathered through such observations are processed and used in a way that supports instruction.

Recommended Measures
Observing instructional practices can help focus the district around instruction and make the case for changing instructional practice. However, as we found in Milwaukee and several other districts, observing classrooms does not by itself produce the information needed to change instructional practice.

Milwaukee recognized the concerns with the three learning-walk protocols and responded to the need to create a single tool across the school district. The district designed and piloted a new tool last year called “Learning at a Glance.”

With promising findings from the first year, the district continues to gather evidence of reliability and validity and is committed to ensuring this tool serves the professional development needs of its principals and teachers.

Here are some guidelines to help other school districts collect data about instructional practice that set up opportunities to have rich discussions about instruction:

•  Signal what is important. The tools used to measure instruction signal what is important. If student engagement is what is most important, design a measure that will capture this. If teacher questioning practice is what is most important, design a measure that will capture this. Do not try to capture everything, or the tool will be difficult to apply and less effective as a feedback mechanism and a signal of what is important.

•  Draw a line from practice to performance. The design and implementation of tools to guide classroom observations should consider how the information will be used by teachers and principals. How will this information help teachers draw a line from their instructional practice to their students’ performance? Will the data tell us what individual teachers are doing and provide enough information to stimulate discussion about what can be done in the classroom to improve instruction? If the data do not help the teachers connect their practice to work generated by the students, the information is of little use.

•  Take a systemic approach. Mechanisms and processes need to be put in place to facilitate discussions around instruction. Information from classroom observations might provide information to principals about what professional development activities to offer or highlight the need to modify schedules to enable teachers to collaborate. Principals also might arrange for teachers to observe other teachers and facilitate discussions among teachers about instructional practices.

In many cases, it may be most useful to remove the principal as the intermediary between classroom observations and teachers. Rather than have the principal observe instruction and report the data to other teachers, it may be preferable to have teachers directly observe one another’s practice.

•  Triangulate with other measures. Classroom observations should not be the only data used to improve instruction. Triangulate classroom observations with other information, such as student work or classroom assessments. This helps generate discussion about disconnects between different measures. It also will help decrease the temptation to look at only the pedagogical strategies used by a teacher without investigating the content students are being asked to learn.

A rich discussion finds ways to include the observed pedagogical strategies and the content mastered by students. Pairing learning walks with discussions around student work would move schools in the right direction.

The goal is to collect data on instruction that allow teachers, principals and district administrators to take action. We want these data to help them not only keep score but also know more about what is happening around instruction, with an eye toward improvement.

Sabermetricians — those who analyze baseball through objective evidence — have attempted to create new tools and processes to better understand the game, but so far no substitute exists for packing up the family and heading to the ballpark.

Marsha Ing is an assistant professor of educational psychology at the University of California, Riverside. E-mail: marsha.ing@ucr.edu. Kenneth Montgomery is an assistant principal at Capuchino High School in San Bruno, Calif.