February highlights from the world of scientific publishing

3rd Mar 2014

Some of what I learned about scientific publishing last month from Twitter: new open access journals, data release debates, paper writing tips, and lots more

New journals

Two important announcements this month, both of open access sister journals to well-established ones.

First, at the AAAS meeting it was announced that Science is going to have an online-only open access sister journal, called Science Advances, from early 2015. This will be selective (not a megajournal), will publish original research and review articles in science, engineering, technology, mathematics and social sciences, and will be edited by academic editors. The journal will use a Creative Commons license, which generally allows free use, but according to AAAS spokeswoman Ginger Pinholster it has not yet decided whether to allow commercial reuse. The author publishing charge hasn’t yet been announced.

Second, the Royal Society announced that, in addition to their selective open access journal Open Biology, they will be launching a megajournal, Royal Society Open Science, late in 2014. It will cover the entire range of science and mathematics, will offer open peer review as an option, and will also be edited by academic editors. Its criteria for what it will publish include “all articles which are scientifically sound, leaving any judgement of importance or potential impact to the reader” and “all high quality science including articles which may usually be difficult to publish elsewhere, for example, those that include negative findings”; it thus fits the usual criteria for a megajournal in that it will not select for ‘significance’ or potential impact.

These two announcements show that publishers without an open access, less selective journal in their stable are now unusual. Publishers are seeing that there is demand for these journals and that they can make money from them, and that setting one up earns a reputation for being friendly to open access. It also means that papers rejected by their more selective journals can stay within the publisher (via cascading peer review), which saves authors the time of starting the submission process from scratch and turns a potential negative for the publisher (editorial time spent on papers that are not published) into a positive (author charges). The AAAS has been particularly slow to join this bandwagon; let’s see if the strong brand of Science is enough to persuade authors to publish in Science Advances rather than in the increasingly large number of other megajournals.

PLOS data release policy

On 24 February, PLOS posted an updated version of the announcement about data release that they made in December (and which I covered last month). I didn’t pay much attention as the change had already been trailed, but then I had to sit up and take notice because I started seeing posts and tweets strongly criticising the policy. The first to appear was an angry and (in my opinion) over-the-top post by @DrugMonkeyblog entitled “PLoS is letting the inmates run the asylum and this will kill them”.  A more positive view was given by Michigan State University evolutionary geneticist @IanDworkin, and another by New Hampshire genomics researcher Matt MacManes (@PeroMHC). Some problems that the policy could cause small, underfunded labs were pointed out by Mexico-based neuroscience researcher Erin McKiernan (@emckiernan13). The debate got wider, reaching Ars Technica and Reddit – as of 3 March there have been 1045 comments on Reddit!

So what is the big problem? The main objections raised seem to me to fall into six categories:

  1. Some datasets would take too much work to get into a format that others could understand
  2. It isn’t always clear what kind of data should be published with a paper
  3. Some data files are too large to be easily hosted
  4. Others might publish reanalyses that the originators of the data were intending to publish themselves, so the originators would lose the credit for that further research
  5. Some datasets contain confidential information
  6. Some datasets are proprietary

I won’t discuss these issues in detail here, but if you’re interested it’s worth reading the comments on the posts linked above. It does appear (particularly from the update on their 24 February post and the FAQ posted on 28 February) that PLOS is happy to discuss many of these issues with authors who have concerns, although analyses of proprietary data may have to be published elsewhere from now on.

I tend to agree with those who take a more positive view of this new policy, arguing that data publication will help increase reproducibility, help researchers build on each other’s work and prevent fraud. In any case, researchers who disagree are free to publish in other journals with less progressive policies. PLOS is a non-profit publisher that says access to research results, immediately and without restriction, has always been at the heart of its mission, so it is being consistent in applying this strict policy.

Writing a paper

Miscellaneous news

  • Science writer @CarlZimmer explained eloquently at the AAAS meeting why open access to research, including open peer review and preprint posting, benefits science journalists and their readers.
  • Impactstory profiles now show the proportion of a researcher’s articles that are open access and give gold, silver and bronze badges, as well as showing how highly accessed, discussed and cited their papers are.
  • A new site has appeared where authors can review their experience with journals: Journalysis. It looks promising but needs reviews before it can become a really useful resource – go add one!
  • An interesting example of post-publication peer review starting on Twitter and continuing in a journal was described by @lakens here and by his coauthor @TimSmitsTim here.
  • Cuban researcher Yasset Perez-Riverol (@ypriverol) explained why researchers need Twitter and a professional blog.
  • I realised when looking at an Elsevier journal website that many Elsevier journals now display very informative journal metrics, such as several years of impact factors, Eigenfactor, SNIP and SJR values, as well as average times from submission to first decision and from acceptance to publication. An example is here.
  • PeerJ founder @P_Binfield posted a Google Docs list of standalone peer review platforms.
