Boots & Sabers

The blogging will continue until morale improves...

Owen

Everything but tech support.

1908, 06 Mar 16

Testing Old Results

Interesting.

That, at least, is the theory. In practice, checking old results is much less good for a scientist’s career than publishing exciting new ones. Without such checks, dodgy results sneak into the literature. In recent years medicine, psychology and genetics have all been put under the microscope and found wanting. One analysis of 100 psychology papers, published last year, for instance, was able to replicate only 36% of their findings. And a study conducted in 2012 by Amgen, an American pharmaceutical company, could replicate only 11% of the 53 papers it reviewed.

But as economics adopts the experimental procedures of the natural sciences, it might also suffer from their drawbacks. In a paper just published in Science, Colin Camerer of the California Institute of Technology and a group of colleagues from universities around the world decided to check. They repeated 18 laboratory experiments in economics whose results had been published in the American Economic Review and the Quarterly Journal of Economics between 2011 and 2014.

For 11 of the 18 papers (ie, 61% of them) Dr Camerer and his colleagues found a broadly similar effect to whatever the original authors had reported. That is below the 92% replication rate they would have expected had all the original studies been as statistically robust as the authors claimed—but by the standards of medicine, psychology and genetics it is still impressive.
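As a back-of-envelope check (not from the article itself), the 61% figure and the gap to the expected 92% can be reproduced directly. The question of how surprising 11 successes out of 18 would be, if every study really replicated 92% of the time, is a simple binomial calculation; the assumption of independent trials is ours, not the paper's:

```python
from math import comb

# Figures quoted in the article
replicated, total = 11, 18
observed = replicated / total   # ≈ 0.611, the 61% reported
expected = 0.92                 # rate expected if studies were as robust as claimed

# Probability of seeing 11 or fewer replications out of 18
# if each trial independently succeeded with probability 0.92,
# summed straight from the binomial pmf.
p_at_most_11 = sum(
    comb(total, k) * expected**k * (1 - expected)**(total - k)
    for k in range(replicated + 1)
)
print(f"observed rate: {observed:.0%}")
print(f"P(<=11 successes | p=0.92): {p_at_most_11:.4f}")
```

Under that independence assumption the observed result would be very unlikely if the claimed robustness were accurate, which is the gap the replicators are pointing at.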

