
Glass houses: economics also often doesn't replicate

There was a big kerfuffle recently about the fact that researchers were only able to replicate about half of prominent psychology results. This is potentially worrisome, and it suggests that we should be doing more replication exercises where possible, as well as adopting new norms at journals and elsewhere that give researchers and editors better incentives.

Image courtesy of Nature.


There's another facet of replication, though, that sits a step below re-running the whole experiment: simply re-running posted replication code and checking that the results reported in a study match what the code produces. In this sense, studies really should replicate. Surprisingly (and, I would argue, embarrassingly), Andrew Chang and Phillip Li at the Fed just released a paper describing the results of this type of replication exercise on 60 published papers in good journals. As their title ("Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say 'Usually Not'") suggests, just over half of the attempted replications failed - and even more failed without additional correspondence with the papers' authors. David Evans at the World Bank has a nice blog post that describes the paper in more detail.

The ultimate suggestions? Journals should require the posting of replication code and data (even today, some top journals aren't doing this - here's looking at you, QJE!); authors should be clear about which version of their statistical software and which non-standard packages were used to conduct the analysis; authors should post expected run-times for their code; authors should note the order in which the different code files should be run; and replication materials should absolutely include the code files that generate the tables and figures. That seems like a reasonable set of requests. (I myself have banged my head against a wall repeatedly trying to replicate code without proper documentation, packages, or software versions, so I'm very sympathetic to this cause.)
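To make the versioning and run-order suggestions concrete, here's a minimal sketch of what a "master script" at the top of a replication package might look like. Everything in it is hypothetical - the file names (clean_data.py, estimate_models.py, make_tables_figures.py), the version numbers, and the run-times are purely illustrative - but the idea is that a single entry point documents the software environment, the required packages, the execution order, and roughly how long to wait.

```python
"""run_all.py - hypothetical master script for a replication package.

Documents (and lightly enforces) the software version, the required
packages, and the order in which the analysis files should be run.
Approximate total run-time: ~30 minutes on a standard laptop (illustrative).
"""
import importlib.util
import subprocess
import sys

# Environment the published results were produced with (illustrative versions).
REQUIRED_PYTHON = (3, 11)
REQUIRED_PACKAGES = ["pandas", "statsmodels", "matplotlib"]

# Scripts must run in this order: cleaning -> estimation -> tables/figures.
PIPELINE = [
    ("clean_data.py", "builds the analysis dataset from raw files (~5 min)"),
    ("estimate_models.py", "runs the main regressions (~20 min)"),
    ("make_tables_figures.py", "writes every table and figure in the paper (~5 min)"),
]


def main() -> None:
    # Warn (rather than fail) if the replicator's Python version differs.
    if sys.version_info[:2] != REQUIRED_PYTHON:
        print(f"Warning: results were generated with Python "
              f"{REQUIRED_PYTHON[0]}.{REQUIRED_PYTHON[1]}; you are running "
              f"{sys.version_info.major}.{sys.version_info.minor}.")

    # Stop early if a required (non-standard) package is missing.
    missing = [p for p in REQUIRED_PACKAGES if importlib.util.find_spec(p) is None]
    if missing:
        sys.exit(f"Missing required packages: {', '.join(missing)}")

    # Run each stage in order, aborting if any step fails.
    for script, description in PIPELINE:
        print(f"Running {script}: {description}")
        subprocess.run([sys.executable, script], check=True)


if __name__ == "__main__":
    main()
```

Even a file this short answers most of the questions that trip up replicators: which software, which packages, which order, and how long the whole thing should take.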

Here's some great commentary from Nature about replication in research.

Also: if you want to be part of the burgeoning movement to correct these types of problems in the social sciences, you should go to the Berkeley Initiative for Transparency in the Social Sciences' annual meeting (or, better yet, submit a paper!).


Make science, not (worm) wars

I have no interest in opening the can of - well, you know - that has taken the development economics twitterverse by storm this week. Without getting into the relative merits of the original study and the replication, though, I think there are lessons that the social science community can and should take away. There's a lot going on here - many of these sub-sections will likely be the topics of further posts, but this is a nice setting to discuss all of them together. More on the scientific method, replications, re-analysis, fixing problems, and science in the media after the break.
