Stevan Harnad’s “Subversive Proposal” came of age last year. I’m now teaching students younger than Stevan’s proposal, and yet very little has actually changed in those 21 years.
What do these two memes have in common?
tl;dr: Data from thousands of non-retracted articles indicate that experiments published in higher-ranking journals are less reliable than those reported in ‘lesser’ journals. Vox health reporter Julia Belluz recently covered the reliability of peer review.
Over the last decade or two, there have been multiple accounts of publishers negotiating the impact factors of their journals with the Institute for Scientific Information (ISI), both before and after it was bought by Thomson Reuters. This is commonly done by negotiating which articles count in the denominator.
tl;dr: It is a waste to spend more than the equivalent of US$100 in tax funds on a scholarly article. Collectively, the world’s public purse currently spends the equivalent of roughly US$10 billion every year on scholarly journal publishing. Dividing that by the roughly two million articles published annually, you arrive at an average cost of about US$5,000 per scholarly journal article.
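A back-of-the-envelope sketch of that division, using the approximate figures quoted above (~US$10 billion in annual public spending, ~2 million articles per year):

```python
# Rough per-article cost estimate from the figures cited in the text.
annual_spend_usd = 10e9      # ~US$10 billion per year, approximate
articles_per_year = 2e6      # ~2 million articles per year, approximate

cost_per_article = annual_spend_usd / articles_per_year
print(f"Average cost per article: US${cost_per_article:,.0f}")
# Average cost per article: US$5,000
```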
In Germany, the constitution guarantees academic freedom in Article 5 as a basic civil right. The main German funder, the German Research Foundation (DFG), routinely points to this article of the German constitution when someone suggests they should follow the lead of the NIH, the Wellcome Trust and others with regard to mandates requiring open access (OA) to publications arising from research activities they fund.
Posting my reply to a review of our most recent grant proposal has sparked an online discussion both on Twitter and on Drugmonkey’s blog. The main direction the discussion took was what level of expertise to expect from the reviewers deciding on your grant proposal. This, of course, depends heavily on the procedure by which the funding agency chooses its reviewers.
Update, Dec. 4, 2015: With the online discussion moving towards grantsmanship and the question of what level of expertise to expect from a reviewer, I have written down some thoughts on this angle of the discussion. With ever more evaluations, assessments and quality control, the peer-review burden has skyrocketed in recent years.
This is a post written jointly by Nelson Lau from Brandeis and me, Björn Brembs. In contrast to Nelson’s guest post, which focused on the open data aspect of our collaboration, this one describes the science behind our paper and a second one by Nelson, which just appeared in PLoS Genetics. Laboratories around the world are generating a tsunami of deep-sequencing data from nearly every organism, past and present.
Why our Open Data project worked (and how decorum can allay our fears of Open Data). I am honored to guest-post on Björn’s blog and excited about the interest in our work sparked by Björn’s response to Dorothy Bishop’s first post. As corresponding author on our paper, I will provide more context on our successful Open Data experience with Björn’s and Casey’s labs.