So, Your School was Flagged by an Erasure Analysis, Now What?

By Dennis Maynes

A lot of attention has been focused on erasure analyses in 2011. We were all appalled when special investigators in Georgia confirmed answer sheet tampering by educators in 44 Atlanta Public Schools during the 2009 CRCT administration. Other investigations triggered by erasure analyses have also been publicized widely. People are asking, “Could this happen here, in my community?”

The pressure to cheat is real. In a self-selected Michigan survey, 30% of educators said they felt pressure to cheat and 8% admitted that they had actually cheated. Because the sample was self-selected, we cannot generalize the result, but it confirms what we already knew: a small percentage of educators have cheated and will continue to cheat. Hence, it is vital that departments of education heed warning signs of test security problems.

An erasure analysis can provide important clues, but by itself it cannot confirm cheating. Typically, an erasure analysis flags a school or classroom when its average number of wrong-to-right erasures exceeds the state-wide average by three or four standard deviations. The following observations seem relevant:

  1. Erasures on answer sheets are infrequent. Actual frequencies vary by state and grade, but most answer sheets will not have any erasures, and very few will have more than four or five. This means that even a handful of answer sheets with large numbers of erasures is a very unusual pattern.
  2. A lot of erasing is NOT the same thing as cheating. The statistics may identify schools or classrooms that merit in-depth inspection, and they may indicate very low probabilities of “normal test taking.” But we must NOT interpret those probabilities as the probability that tampering occurred. Doing so confuses the probability of the evidence given normal test taking with the probability of tampering given the evidence, and it is fallacious and improper (the second sketch after this list illustrates the difference).
  3. Investigations that are triggered by erasure analysis will continue; they will become commonplace and routine. We should not infer guilt from the mere fact that an investigation is being conducted. Guilt or innocence should be inferred only after the investigation runs its course.
  4. A single statistic, such as the average number of wrong-to-right erasures in a classroom, cannot tell the entire story. For example, educators at one school argued the data were spurious because the classroom average was only 2 wrong-to-right erasures higher than the state average. That argument might hold if every answer sheet had just 2 extra erasures. But changing many answers on only one fourth of the answer sheets could have produced the same average, and the same statistical flag (the first sketch after this list shows both patterns).
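
For readers who want to see the arithmetic behind this kind of flag, the following minimal sketch (Python, with a made-up flag_classrooms helper, an invented threshold, and invented state-wide numbers; not the actual method used by any state) expresses a classroom's average wrong-to-right erasure count in standard-deviation units above the state-wide mean. It also shows observation 4 in action: a classroom where every sheet has two extra erasures and a classroom where heavy erasing is concentrated on one fourth of the sheets produce the same average, and therefore the same flag.

```python
import statistics

def flag_classrooms(classroom_means, state_mean, state_sd, threshold=3.0):
    """Flag classrooms whose mean wrong-to-right (WTR) erasure count exceeds
    the state-wide mean by at least `threshold` standard deviations.
    Illustrative simplification only; real programs use more refined methods."""
    return {
        name: (mean_wtr - state_mean) / state_sd >= threshold
        for name, mean_wtr in classroom_means.items()
    }

# Hypothetical state-wide statistics: most sheets have no WTR erasures.
state_mean, state_sd = 0.5, 0.4

# Two classrooms of 28 students with the SAME average but very different patterns.
uniform_class = [2] * 28            # every sheet has 2 extra WTR erasures
concentrated = [8] * 7 + [0] * 21   # heavy erasing on only one fourth of the sheets

classroom_means = {
    "uniform": statistics.mean(uniform_class),      # 2.0
    "concentrated": statistics.mean(concentrated),  # 2.0
}

print(flag_classrooms(classroom_means, state_mean, state_sd))
# Both classrooms sit about (2.0 - 0.5) / 0.4 ≈ 3.75 standard deviations above
# the state mean, so both receive the identical flag despite very different patterns.
```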

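And to see why observation 2 warns against equating erasure probabilities with the probability of tampering, here is a second minimal sketch that applies Bayes' rule with entirely hypothetical rates: even when a flag would be rare under normal test taking, the probability that a flagged classroom was actually tampered with can remain well short of certainty, because tampering itself is rare.

```python
# All numbers below are invented purely for illustration.
p_tampering = 0.01               # assumed base rate of tampering among classrooms
p_flag_given_tampering = 0.95    # assumed chance an analysis flags a tampered classroom
p_flag_given_normal = 0.005      # assumed chance of a flag under normal test taking

# Total probability of seeing a flag, tampered or not.
p_flag = (p_flag_given_tampering * p_tampering
          + p_flag_given_normal * (1 - p_tampering))

# Bayes' rule: probability of tampering GIVEN that the classroom was flagged.
p_tampering_given_flag = p_flag_given_tampering * p_tampering / p_flag
print(f"P(tampering | flag) = {p_tampering_given_flag:.2f}")   # ≈ 0.66
```

Swapping in different assumed rates changes the answer, which is exactly why the fact-finding that follows a flag matters so much.
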
If your school has been flagged by an erasure analysis, it is important to cooperate with the investigators. The flag was probably caused by something unusual about your school other than tampering (e.g., one school had given students new erasers, and the students found it fun to use them during the test). And the fact is that most investigations triggered by erasure analysis do not uncover wrongdoing by educators.

Perhaps our unease with erasure analysis stems from the name we attach to the fact-finding that follows it. If we referred to it as an “audit” instead of an “investigation,” we might view it as routine and necessary instead of shocking and salacious.

