Yes, your research is very noble. No, that’s not a reason to flout copyright law.
Scientific research is cumulative; many elements of a typical research project would not and could not exist but for the efforts of many previous researchers. This goes not only for knowledge, but also for measurement. In much of the clinical world, and also in many areas of “basic” social and life science research, people routinely save themselves …
Why I still won’t review for or publish with Elsevier–and think you shouldn’t either
In 2012, I signed the Cost of Knowledge pledge and stopped reviewing for, and publishing in, all Elsevier journals. In the four years since, I’ve adhered closely to this policy; with a couple of exceptions (see below), I’ve turned down every review request I’ve received from an Elsevier-owned journal, and haven’t sent Elsevier journals any …
Neurosynth is joining the Elsevier family
[Editorial note: this was originally posted on April 1, 2016. April 1 is a day marked by a general lack of seriousness. Interpret this post accordingly.] As many people who follow this blog will be aware, much of my research effort over the past few years has been dedicated to developing Neurosynth—a framework for large-scale, …
Now I am become DOI, destroyer of gatekeeping worlds
Digital object identifiers (DOIs) are much sought-after commodities in the world of academic publishing. If you’ve never seen one, a DOI is a unique string associated with a particular digital object (most commonly a publication of some kind) that lets the internet know where to find the stuff you’ve written. For example, say you want …
strong opinions about data sharing mandates–mine included
Apparently, many scientists have rather strong feelings about data sharing mandates. In the wake of PLOS’s recent announcement, which says that, effective now, all papers published in PLOS journals must deposit their data in a publicly accessible location, a veritable gaggle of scientists have taken to their blogs to voice their outrage and/or support for the policy. …
the truth is not optional: five bad reasons (and one mediocre one) for defending the status quo
You could be forgiven for thinking that academic psychologists have all suddenly turned into professional whistleblowers. Everywhere you look, interesting new papers are cropping up purporting to describe this or that common-yet-shady methodological practice, and telling us what we can collectively do to solve the problem and improve the quality of the published literature. In …
Attention publishers: the data in your tables want to be free! Free!
The Neurosynth database is getting an upgrade over the next couple of weeks; it’s going to go from 4,393 neuroimaging studies to around 5,800. Unfortunately, updating the database is kind of a pain, because academic publishers like to change the format of their full-text HTML articles, which has a nasty habit of breaking the publisher-specific …
tracking replication attempts in psychology–for real this time
I’ve written a few posts on this blog about how the development of better online infrastructure could help address and even solve many of the problems psychologists and other scientists face (e.g., the low reliability of peer review, the ‘fudge factor’ in statistical reporting, the sheer size of the scientific literature, etc.). Actually, that general …
The reviewer’s dilemma, or why you shouldn’t get too meta when you’re supposed to be writing a review that’s already overdue
When I review papers for journals, I often find myself facing a tension between two competing motives. On the one hand, I’d like to evaluate each manuscript as an independent contribution to the scientific literature, i.e., without having to worry about how the manuscript stacks up against other potential manuscripts I could be reading. …
building better platforms for evaluating science: a request for feedback
UPDATE 4/20/2012: a revised version of the paper mentioned below is now available here. A couple of months ago I wrote about a call for papers for a special issue of Frontiers in Computational Neuroscience focusing on “Visions for Open Evaluation of Scientific Papers by Post-Publication Peer Review”. I wrote a paper for the issue, …