Thursday, May 28, 2015

What Two Scholars Found When They Looked at the Quality of Education Research (And It Isn't Pretty)

How many times have you been told that some trendy new education practice is "research-based"? The implication, of course, is that the mere application of this scientific-sounding label to the practice should cause the hearer to lay aside any further critical inquiry on the matter.

When it comes to education (and many other things, for that matter) the expression "research-based" is the Good Housekeeping Seal of Approval. The expression is the equivalent of the philosopher's "QED" (Quod Erat Demonstrandum: "that which was to be demonstrated"). It is a discussion-ender. The only response considered appropriate is to nod in obedient acceptance.

But what exactly does this expression mean?

What it means is that there has been some study (perhaps more than one), conducted by someone, somewhere, that seems to indicate that the practice may be effective. Or, more likely, that someone has heard someone else say that such a study exists. It almost never indicates that the person making the reference has actually read the study (or studies) to which he refers, or could even cite it.

But there is something even more worrisome, and it comes in the form of a study—one that you can actually read yourself.

In a study published last year in the journal Educational Researcher, Matthew C. Makel of Duke University and Jonathan A. Plucker of the University of Connecticut, Storrs, conducted a wide-ranging meta-study of educational research to determine how many of the education studies published in the 100 most prominent education journals met one of the most basic criteria of scientific research. The question they asked was: What were the replication rates of education studies?

Replications, say the authors (quoting another researcher) serve five functions:


  • Controlling for sampling error
  • Controlling for artifacts
  • Controlling for fraud
  • Generalizing to different/larger populations
  • Assessing the study's general hypothesis

H. M. Collins calls replication "the Supreme Court of science."

Replication is a common procedure in the hard sciences (biology, medicine, genomics, computer science) as well as in the soft ones (economics, political science, and sociology).

And when you consider the low rate of successful replication in the hard sciences, say the authors, the need for replication in the social sciences (such as education) "becomes even more acute." One review found that only 44 percent of studies in health care research were successfully replicated; another found a success rate of only 11 percent among highly cited cancer trial studies; and the Bayer drug company reported that many of the published research findings it analyzed could not be successfully replicated.

The rate of successful replication in psychology is peculiarly high—91.5 percent, "making psychologists nearly 5 times better at predicting results than actual rocket scientists." It is an outcome the authors attribute to "collecting the data until the desired result is found, not reporting unsuccessful trials, and eliminating observations and variables post hoc that do not support the targeted hypotheses." And then there is the fact that only 1.07 percent of studies in 100 prominent psychology journals are replications.

So, when the authors looked at all the studies ever published in the top education journals, what did they find?

  • Of the 164,589 studies published in these education journals, only 221 of them were replications—an overall replication rate of .13 percent.
  • Of the studies that were replicated, only 67.4 percent were successful.
  • Also, 48.2 percent—nearly half—of the replications were conducted by the same people who did the original study.

Now let's do a little math here: If you multiply .13 percent by .674, you get .08762 percent. What does it say about education research that only roughly .09 percent of it has been successfully replicated—and that almost half of these replications were performed by people who had a stake in seeing the original study successfully replicated?
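The arithmetic can be checked directly from the raw counts the authors report (a quick sketch; the figures are the ones quoted above):

```python
# Checking the article's arithmetic against the figures from Makel and Plucker.
total_studies = 164589    # studies published in the top 100 education journals
replications = 221        # of those, the number that were replications
success_fraction = 0.674  # share of replications that were successful

replication_rate = replications / total_studies * 100  # as a percentage, ~0.13%
successful_rate = replication_rate * success_fraction  # ~0.09%

print(f"Replication rate: {replication_rate:.2f}%")
print(f"Successfully replicated: {successful_rate:.3f}%")
```

Working from the unrounded counts gives about .090 percent rather than the .08762 percent obtained from the rounded .13 figure, but either way it comes to fewer than one successful replication per thousand published studies.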

Expressed as a proportion, that is .0008762: fewer than nine successful replications for every ten thousand published studies. And nearly half of those are questionable. This is a colossal indictment of the legitimacy of education research.

So the next time you hear a professional educator touting some exotic new education idea as "research-based," ask him this question: "Has the research been successfully replicated?" Chances are he won't know. When he confesses his ignorance, tell him that there is a higher than 99.9 percent chance that it hasn't.

1 comment:

Art said...

Lessee ...

160,000 + papers. It probably takes an hour to read each one. That comes out to more than 18 years of non-stop, 24/7 reading. And that's before the recollecting, cross-referencing, etc. to make sure that parts of studies did or did not appear in other reports.

Color me just a tiny bit skeptical.