Overcoming Misinformation

Type: Article
Topics: Communications & Public Relations, School Administrator Magazine

January 01, 2016

How we might help others and ourselves deal with claims that aren’t true
David Rapp, a professor of cognitive psychology at Northwestern University in Evanston, Ill., and doctoral student Meghan Salomon studied the problematic effects of sharing misinformation.

Teachers work in diverse classrooms with students of disparate skill levels and abilities. Terms such as “struggling readers,” “slow learners” and “unmotivated students” immediately bring to mind challenges, potential interventions and specific experiences working with students.

These terms also convey the idea that a student’s capacity for learning is pretty much fixed. Research surveys show that instructors believe a student’s achievement will remain constant across his or her school experiences.

This view, of course, is incorrect. Classic research on the Pygmalion effect in the classroom by psychologist Robert Rosenthal and Lenore Jacobson, a school principal, demonstrated that teachers can have a substantial impact on their students’ performance and that the ability to learn is not necessarily fixed.

In their research, teachers were informed that their students had been divided into two groups. One group was identified as more-intelligent youngsters, likely to receive higher grades and more likely to show an intellectual growth spurt compared to their peers. The other group was identified as “normal,” meaning the students were expected to develop on an average, fixed trajectory. The teachers were duped, however: the students actually had been randomly assigned to the two groups without regard to prior ability, assessment results or previous classroom performance. The experimenters’ intention was to determine whether the expectations teachers now held for these students, based solely on the random group assignment, would influence the students’ performance.

At the end of the academic year, the researchers measured the academic progress of the students in both groups. Teachers indeed identified the students who had been classified as likely to show intellectual growth as having more promise, participating more in class and achieving better grades. In contrast, students in the other group progressed without distinction. Teachers’ expectations led them to demand more from some students and accept less from others.

Although many of us are familiar with the influence of expectations on students, why might we still behave in ways that reflect the idea that student ability is immutable?

Easy Victims

Unfortunately, countless anecdotes and accounts, available on websites, in newsletters, in magazine and newspaper coverage, and through word of mouth, reflect the idea that student ability is fixed. So despite respected research providing reliable evidence to the contrary, people are confronted with misinformation about fixed student ability, and they may use that misinformation to inform their own teaching practices.

In our work, we are interested in the problematic effects of misinformation, as well as in useful methods for overcoming those effects. Empirical projects have demonstrated convincingly that people regularly fall victim to misinformation, whether its source is external or something they generate on their own. When we read, listen to others or construct our own understandings, in essence whenever we are exposed to ideas, that information can be encoded into memory. The process is no different when we read false statements, hear inaccuracies or construct incorrect inferences. Once misinformation is encoded, it is readily available to be used later.

Consider a simple demonstration of what has been termed the “misinformation effect.” We presented participants with stories that contained potentially false information. In one story, characters discussed travel plans with each other. Occasionally, story characters would mention information that was patently false without explicitly acknowledging it as such. For example, one character stated, “That’s why we had to go to Russia, because her family lives in the capital city St. Petersburg.” The capital of Russia is actually Moscow, not St. Petersburg, but the characters continued their discussions as if nothing were amiss.

After reading stories containing a mixture of true and false information, participants were given a surprise trivia quiz of more than 200 general knowledge questions, some of which related to information discussed in the stories. One such question was, “What is the capital city of Russia?” We examined the responses participants gave to these critical questions after they had read potential misinformation.

Participants who read misinformation in the stories were more likely to produce incorrect responses on related quiz questions than participants who had read accurate information. Simply being exposed to the false information increased the likelihood that participants would answer questions incorrectly, because they relied on the false information they had previously read.

Perhaps these participants simply had not previously known what the capital of Russia is, and actually learned it from what they read.

However, research by Elizabeth Marsh, a psychologist and neuroscientist at Duke University, indicates that individuals use false information even when they should know better. Consider first that the misinformation effect emerges regardless of whether the presented inaccuracies involve unfamiliar facts or facts that should be well known to participants (as validated by general knowledge surveys).

Second, misinformation effects also occur even when participants, prior to reading false information, demonstrate knowledge of what is accurate or not.

Third, participants who use patently obvious misinformation (e.g., that the Pilgrims came to America on a ship named the Godspeed rather than the Mayflower) sometimes indicate they knew that information prior to reading, even though it is highly unlikely they held those inaccurate beliefs before encountering them in the stories!

Problematic Consequences

These findings suggest that people engage in far less critical evaluation than we might hope, and that exposure to misinformation can have subsequent consequences for reasoning, decision making and performance. So what can we do about it? Thankfully, some general principles are useful for reducing the problematic understandings that can emerge from exposure to misinformation.

First, it is important to be critically engaged in the world. People should regularly evaluate what they read, see, hear and believe. This means not just questioning the veracity of information, but also considering alternative possibilities and hypotheses. When we encounter information that is not accompanied by sufficient evidential support, we should remain skeptical. This can reduce the likelihood that presented inaccuracies will be relied upon later.

Second, when we encounter information we know is wrong, we should engage in explicit correction of the material. This might mean thinking about what the correct information is or could be (e.g., telling ourselves, “Well, that’s wrong, Moscow is actually the capital of Russia”), making a direct edit to the material as we might routinely do during proofreading, or refuting incorrect ideas with reliable, well-known facts. Doing so helps block our uptake of misinformation while restating and bolstering our understanding of what is accurate and true.

Third, it is useful to consider the reliability of a source providing us with information. If we are reading a novel in which a country’s capital is different from what we know to be true, it would be helpful to compartmentalize that information by reminding ourselves that fiction need not accurately represent the true state of the world.

Of course, knowing which sources are more or less reliable can sometimes prove a challenge. But remaining aware of whether sources might be biased or exceptionally reputable (e.g., whether they engage in careful fact-checking practices) can help us evaluate information as we encounter it.

Evaluative Methods

Notably, misinformation and misinforming experiences can prove resistant to intervention and critical evaluation.

Consider another popular myth: the notion that private schools always outperform public schools on achievement measures. National surveys of academic achievement and our personal experiences with particular schools’ comparative success may lead us to believe this is so.

These widely repeated claims, along with anecdotal accounts, tend to serve as a reference point that overrides more objective counterevidence. As a result, our own experiences often are seen as more representative than the data provided by surveys, empirical projects and large-scale analyses.

Overcoming personal experiences that might be misrepresentative means keeping an open mind about the possibility that what we think we know is only part of the story. Just like our students, school administrators, teachers and the rest of us need to regularly engage in critical thinking, questioning both what is presented to us as fact and what we think we understand. We must seek out reliable support or counterevidence for the claims, ideas and experiences we encounter each day. Avoiding the effects of inaccurate information requires the kind of evaluative stance we hope to encourage in our students, colleagues and loved ones.

Authors

David N. Rapp and Meghan M. Salomon

About the Authors

David Rapp is the Charles Deering McCormick Professor of Teaching Excellence at Northwestern University in Evanston, Ill., and co-editor of Processing Inaccurate Information (The MIT Press, 2014).


Meghan Salomon is a doctoral student in cognitive psychology at Northwestern.
