Crowdsourcing information about journals

24th Jul 2012

Crowdsourced surveys of authors' experiences with journals are useful, but I have found only a few. For now, I propose a simpler survey of information gleaned from journal websites.

I was recently alerted by @melchivers (via @thesiswhisperer) to the existence of a blog by SUNY philosopher Andrew Cullison (@andycullison) that includes a set of journal surveys for the field. As Cullison explains in an overview post, the surveys consist of Google Docs spreadsheets, one for each journal, and a form interface that academics fill in with data on their experience of submitting to that journal. The information requested includes:

  • the time taken for initial review
  • the initial verdict of the journal (acceptance, rejection, revise and resubmit, conditional acceptance, withdrawn)
  • the number of reviewers whose comments were provided
  • an assessment of the quality of the reviewers’ comments
  • the final verdict if the paper was revised
  • the time from acceptance to publication
  • an overall rating of the experience with the editors
  • some basic demographic data

This survey covers 180 journals in philosophy. The data is collated and various statistics are calculated, such as the average review time, the average acceptance-to-publication time and the average acceptance rate. Here are a couple of examples: The British Journal for the Philosophy of Science and Philosophy of Science.
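To make that collation concrete, here is a minimal sketch in Python of how such statistics could be computed from individual responses. The field names and example rows are invented for illustration; Cullison's actual spreadsheets may well be organised differently.

```python
# Hypothetical responses for one journal; field names are invented
# for this sketch, not taken from Cullison's spreadsheets.
responses = [
    {"review_months": 3, "verdict": "rejection"},
    {"review_months": 5, "verdict": "acceptance"},
    {"review_months": 2, "verdict": "revise and resubmit"},
]

# Average time taken for the initial review.
avg_review = sum(r["review_months"] for r in responses) / len(responses)

# Acceptance rate: the fraction of submissions accepted outright.
accepted = sum(1 for r in responses if r["verdict"] == "acceptance")

print(f"average initial review time: {avg_review:.1f} months")
print(f"acceptance rate: {accepted / len(responses):.0%}")
```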

Surveys like this could be a valuable resource for authors in a particular field who are trying to choose a journal. Because they are crowdsourced, they do not rely on only one or a few people to gather data. They also provide real data on how fast journals are in practice, which might differ from the statistics or promises provided on journal websites. However, they have limitations: as pointed out in comments below one of Cullison’s posts, they suffer from reporting bias. This matters all the more because many of the journals surveyed have fewer than ten responses.
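To see why small response counts matter, here is a back-of-envelope sketch, assuming the responses were a representative sample (which reporting bias makes even less likely). With two acceptances out of eight responses the nominal acceptance rate is 25%, but the 95% confidence interval runs from roughly 7% to 59%:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 2 acceptances out of 8 responses: nominally a 25% acceptance rate,
# but the interval is wide enough to be almost uninformative.
low, high = wilson_interval(2, 8)
print(f"95% interval: {low:.0%} to {high:.0%}")  # about 7% to 59%
```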

I haven’t seen any surveys like this in any other field of academia, and certainly none in biology or medicine. I would be very interested to hear if others have seen any. In biology a similar survey would probably only be useful if divided up into smaller fields, such as plant cell biology or cardiovascular medicine. Or it could focus only on the general journals that cover large areas of science, biology or medicine.

A simpler journal survey

Alternatively, or as a first step towards full surveys of journals in biomedicine, a crowdsourced survey of the information presented on journal websites could be useful. This could include information such as the promised submission-to-first-decision and acceptance-to-publication times, licensing details (copyright, Creative Commons and so on), charges, article types and length limits. This would involve only one small dataset per journal (a single row of a spreadsheet, rather than data on individual papers), so it would be more manageable than Cullison’s surveys.
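To illustrate the shape of that per-journal record, here is a minimal sketch in Python; the field names are my own invention, chosen to mirror the categories above rather than any actual spreadsheet headers.

```python
from dataclasses import dataclass

# One instance corresponds to one spreadsheet row. Field names are
# invented for this sketch, not taken from the actual spreadsheet.
@dataclass
class JournalRecord:
    name: str
    publisher: str
    licence: str            # e.g. "copyright transfer" or "CC BY option"
    charges: str            # author-facing charges, as stated on the site
    decision_time: str      # promised submission-to-first-decision time
    publication_time: str   # promised acceptance-to-publication time
    length_limits: str      # article types and length limits
```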

I have made a start on such a survey, and you can find it on Google Docs here. I have used the same set of 98 journals, derived from the UK Medical Research Council list of journals popular with MRC authors, that I used for my open access charges spreadsheet. For every journal, the spreadsheet now contains the name of the publisher, the main journal URL, the URL for the instructions for authors, whether the entire journal is open access, and whether there is an open access option. There are also columns for the following information: what the website says about acceptance-to-publication time; whether the accepted, unedited manuscript is published online; and what the website says about submission-to-first-decision time. I have filled in some of these fields but haven’t yet checked all the websites for all this information.
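One advantage of a one-row-per-journal dataset is that it is easy to analyse programmatically. Here is a minimal sketch of counting fully open access journals from a CSV export of the spreadsheet; the file name and column header are placeholders, not the real spreadsheet's.

```python
import csv

# Placeholder file name: export the Google Docs spreadsheet as CSV first.
with open("journal_survey.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# "Fully open access" is a placeholder header; match it to the real one.
fully_oa = [r for r in rows
            if r.get("Fully open access", "").strip().lower() == "yes"]
print(f"{len(fully_oa)} of {len(rows)} journals are fully open access")
```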

The spreadsheet is editable by anyone. I realise that this risks someone messing up the data or adding spam text. For the columns that I don’t want you to change, I have included a partial safeguard: these columns are pulled in from a hidden, locked sheet of the spreadsheet. Please try not to delete data in any cells – just add data in empty cells. If you have any other suggestions for how to allow information to be added but not deleted, or otherwise to avoid problems, please add a comment below.

Now it’s your turn

Would you like to contribute information to this survey? If so, please go ahead and edit the spreadsheet.

If you could publicise it, that would be great too.

And do you have any comments on this process, suggestions for improvement and so on?

Other questions

Have you used Cullison’s surveys, and did you find them useful? Have you come across any surveys like the philosophy one for other fields, or any like my simpler survey?

Comments on “Crowdsourcing information about journals”

Nick says:

I have used Cullison’s surveys. Especially for early career researchers, for whom speed of reply is important, having even a rough guideline of how long it will take to hear back from a journal is a great help. Before I found the surveys, I had unwittingly sent articles to places which took 8-9 months (or more!) to give a verdict. I haven’t found anything similar outside of philosophy, although I have looked (particularly for something in political science).
