Submission to first decision time

5th Apr 2013

Having written previously about journal acceptance to publication times, it is high time I looked at the other important time that affects publication speed: submission to first decision time. As I explained in the previous post, the time from submission to publication in a peer-reviewed journal can be split into three phases: the two discussed in that post and in this one, plus the time the authors need to revise, which the journal can’t control.

A survey of submission to first decision times

I have trawled through the instructions-to-authors pages of the journals in the MRC frequently used journal list, which I have used in several previous posts as a handy list of relatively high-impact and well-known biomedical journals. I’ve used the list as downloaded in 2012, so any journals added since then are not included. I’ve omitted the review journals, which leaves 96.

From these pages I have tried to find any indication of the actual or intended speed to first decision for each journal. For many journals, no information was provided on the journal website about average or promised submission to first decision times. For example, no Nature Publishing Group, Lancet, Springer or Oxford University Press journals in this data set provide any information.

However, 37 of these 96 journals did provide usable information, which I have put in a spreadsheet on my website.

20 promised a first decision within 28 or 30 days of submission, and 12 others promised 20–25 days. Of the rest, two are particularly fast: Circulation Research (13 days in 2012) and Cellular Microbiology (14 days). One is particularly slow: Molecular and Cellular Biology (4 to 6 weeks, though they may simply be more cautious in their promises than other journals). JAMA and Genetics are also relatively slow, at 34 and 35 days respectively. (Note that the links here are to the page that states the time, which is generally the information for authors.)

A few journals promise a particularly fast decision for selected (‘expedited’) papers, but I have only considered the speed promised for all papers here.

I conclude from this analysis that, for relatively high-impact biomedical journals, a first decision within a month of submission is the norm. Anything faster than 3 weeks is fast, and anything slower than 5 weeks is slow.

Newer journals

But what about the newer journals? PeerJ has recently been boasting on its blog about authors who are happy with their fast decision times. The decision times given on this post are 17, 18 and 19 days. These are not necessarily typical of all PeerJ authors, though, and are likely to be biased towards the shorter times, as those whose decisions took longer won’t have tweeted about it and PeerJ won’t have included them in their post.

PLOS One gives no current information on its website about decision times. However, in a comment on a PLOS One blog post in 2009, the then Publisher Pete Binfield stated that “of the 1,520 papers which received a first decision in the second quarter of 2009 (April – June), the mean time from QC completion to first decision was 33.4 days, the median was 30 days and the SD was 18.” He didn’t say how long it took from submission to ‘QC completion’, which is presumably an initial check; I expect this would be only a few days.
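Summary figures like Binfield’s are straightforward to reproduce when the raw decision times are available. Here is a minimal sketch using Python’s standard library, with invented decision times standing in for a journal’s quarterly data (these are not PLOS One’s actual figures):

```python
import statistics

# Hypothetical first-decision times in days for one quarter's
# manuscripts (invented, not PLOS One's real data).
decision_times = [12, 21, 25, 28, 30, 30, 33, 41, 47, 66]

mean = statistics.mean(decision_times)
median = statistics.median(decision_times)
sd = statistics.stdev(decision_times)  # sample standard deviation

print(f"mean {mean:.1f} days, median {median} days, SD {sd:.1f} days")
```

A mean noticeably above the median, as in Binfield’s figures, points to a tail of manuscripts that took much longer than most, which is why reporting the median and SD alongside the mean is more informative than the mean alone.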

Kent Anderson of the Scholarly Kitchen asked last year “Is PLOS ONE Slowing Down?”. This post only looked at the time between the submission and acceptance dates that are displayed on all published papers, and it included no data on decision dates, so the data tell us nothing about decision times. In a series of comments below the post, David Solomon of Michigan State University gives more data, which show that the submission to acceptance time went up only slightly between early 2010 and September 2011.

The star of journals in terms of decision time is undoubtedly Biology Open. It posts the average decision time in the previous month on its front page, and the figure currently given for February 2013 is 8 days. They say they aim to give a first decision within 10 days, and their tweets seem to bear this out: in June 2012 they tweeted that the average decision time in May 2012 had been 6 days, and similarly the time for April 2012 had been 9 days.

Other megajournals vary similarly to ordinary journals. Open Biology reports an average of 24 days, Cell Reports aims for 21 days, and G3 and Scientific Reports aim for 30 days. Springer Plus, the BMC series, the Frontiers journals, BMJ Open and FEBS Open Bio provided no information, though all boast of being fast.

What affects review speed?

If newer journals are faster, why might that be? One possible reason is that as the number of submitted papers goes up, the number of editors doesn’t always go up quickly enough, so the editors get overworked – whereas when a journal is new the number of papers to handle per editor may be lower.

It is important to remember that the speed of review is mainly down to the reviewers, as Andy Farke pointed out in a recent PLOS blog post. Editors can affect this by setting deadlines and chasing late reviewers, but they only have a limited amount of control over when reviewers send their reports.

But given this limitation, there could be reasons for variations in the average speed of review between journals. Reviewers might be excited by the prospect of reviewing for newer journals, so they are more likely to be fast. This could equally be true for the highest impact journals, of course, and also for open access journals if the reviewer is an open access fan. Enthusiastic reviewers not only mean that the reviewers who have agreed send their reports in more quickly, but also that it will be easier to get someone to agree to review in the first place. As Bob O’Hara pointed out in a comment on Andy Farke’s post, “If lots of people decline, you’re not going to have a short review time”.

A logical conclusion from this might be that the best way in which a journal could speed up its time to first decision would be to cultivate enthusiasm for the journal among the pool of potential reviewers. Building a community around the journal, using social media, conferences, mascots or even free gifts might help. PeerJ seem to be aiming to build such a community with their membership scheme, not to mention their active Twitter presence and their monkey mascot. Biology Open’s speed might be related to its sponsorship of meetings and its aim to “reduce reviewer fatigue in the community”.

Another less positive possible reason for shorter review times could be that reviewers are not being careful enough. This hypothesis was tested and refuted by the editors of Acta Neuropathologica in a 2008 editorial. (Incidentally, this journal had an average time from submission to first decision of around 17 days between 2005 and 2007, which is pretty fast.) The editorial says “Because in this journal all reviews are rated from 0 (worst) to 100 (best), we plotted speed versus quality. As reflected in Fig. 1, there is no indication that review time is related to the quality of a review.”

Your experience

I would love to find (or even do) some research into the actual submission to first decision times between different journals. Unfortunately that would mean getting the data from each publisher, and it might be difficult to persuade them to release it. (And I don’t have time to do this, alas.) Does anyone know of any research on this?

And have you experienced particularly fast or slow peer review at a particular journal? Are you a journal editor who can tell us about the actual submission to first decision times in your journal? Or do you have other theories for why some journals are quicker than others in this respect?


35 Comments on “Submission to first decision time”

Nice post! I wrote a post recently about time from submission to acceptance that covers papers where I was the ‘driving’ author and kept track. I’m really interested to see what others’ experiences are in this regard. Unfortunately I didn’t specifically track time to first response, but I could probably dig that out.

sharmanedit says:

Thanks Jason. It would be great if you (and other authors) could separate out these times into their components (to first decision, in revision, to decision after revision and after acceptance) so that the time the journal took can be separated from the time the authors took. We need a lot more data than is currently available! But your post is a start, together with this one.

I note one particularly short time in your list: 36 days from submission to acceptance for Mol BioSystems. Do you know why that one was so quick? Presumably the time to the first decision must have been less than a month for the whole process to be over in that time? Your paper in Infection and Immunity was also pretty fast.

jasonmcdermott873738820 says:

I can separate out the different times retrospectively. I’ll let you know after I update my table.

I’m really not sure why that paper turned around so quickly. First, I think it had minimal revisions requested, which probably means that it was easier for the reviewers to read and evaluate and shortened the response/acceptance time. Second, I’d guess that there’s an element of chance: we happened to get two (I think it was two) reviewers who had the appropriate experience to evaluate the manuscript, and who happened to have time to provide a quick turnaround when we submitted. This is partially attributable to having a good editor, but not completely. I’ll take a look back at the timeline on that one, but I think we did find out quickly, and then had a very quick acceptance after minimal revisions, probably just from the editor (i.e. not sent back out for reviewers’ consideration again).

The one in Infection and Immunity was a mini-review, but it did include some analysis and we had revisions to make after the review. Again, I think the revisions we sent were just examined by the editor and accepted; they weren’t sent back to the original reviewers. That probably accounts for the short turnaround time.

jasonmcdermott873738820 says:

Anna – here’s an update to my previous post that now includes time to first response (all but one example were sent to reviewers) and time from acceptance to publication, as well as time spent in review (subtracting out the time that I was working on revisions). Although it’s a narrow sample based entirely on my own papers (which may carry a significant bias), it shows that these papers had a mean turnaround time of about 2 months, longer than the journals you list report, though I don’t think any of my papers overlap with the journals on your list.

Jason Hoyt says:

Thanks for mentioning PeerJ, Anna. About 10 days ago we looked at the median time to first decision over the past two months (admittedly in an obscure tweet).

Anyway, median time is 19 days for PeerJ overall, with an aim of < 14 days. For definition, "day" is not working day, but any day of the week.

CEO & Co-founder of PeerJ

sharmanedit says:

Thanks for this data, Jason (I had actually favourited that tweet, but I must have missed it when compiling data for this post).

Could you perhaps post about this on the PeerJ website? The aim of <14 days would be really informative for potential authors, even if the actual data is still too new to post. This aim would reflect well on PeerJ, given that most journals aim for twice that time (as I have shown).

nextgenseek says:

Nice post, and a great coincidence that you wrote the post on the same day I was wondering about the submission to acceptance time 🙂

Adam Jacobs says:

I once had a manuscript rejected from PLoS One in less than 2 minutes. Is that some kind of record?

sharmanedit says:

Thanks Adam. I assume the manuscript wasn’t sent to peer reviewers in that time! This post focuses on the time taken for peer review, but looks like it would also be worth me writing a post on how quickly journals reject without review.

There’s the possibility of a big bias here if journals reject a lot of manuscripts without sending them to review. At Methods in Ecology & Evolution we do this quite a lot, which brings down our average decision time (this is not why we do it, of course, but it’s a nice side-effect). I suspect we reject a higher proportion without review than similar journals, because we get a lot of submissions that are not quite what we want. So mega-journals like PLOS One (or are they PLOS ONE now? plOs oNE?) and PeerJ will suffer because they (presumably) rarely reject without review.

I think this illustrates a problem with these sorts of statistics: it’s always more complicated than they look.

Another less positive possible reason for shorter review times could be that reviewers are not being careful enough. This hypothesis was tested and refuted by the editors of Acta Neuropathologica in a 2008 editorial.
Of course, if quality is partly judged on timeliness this refutation isn’t valid.

sharmanedit says:

Thanks Bob. I have assumed with these numbers that the times are for those manuscripts that are sent to review, but you’re right, this might not be a valid assumption. Where a time is given as an aim or an aspiration, I think this doesn’t apply, but those journals that report an average may or may not be including manuscripts rejected without review. This underlines the need for more data!
Re your second point: you’re right of course. There is no indication in the article of whether the ratings are partly based on speed or not, and I can’t find anything about this on the journal website. So perhaps this ‘study’ shouldn’t be taken too seriously.

ianmulvany says:

We also have a triage process at eLife. The goal here is to reduce the risk to an author of losing time. If they submit to us they will get an initial decision fast. The average time from initial submission to final decision is currently standing at about 15 days, when you take into account all papers. The average time for that initial triage process is currently standing at about 4 days.

We are working towards being very open with this data. Some journals do seem either to manipulate these numbers or not to understand how to measure the process, for example the Royal Society. What should matter is how much time the author has to wait from initial submission to acceptance or rejection, basically cutting down the “limbo” time of the paper.

sharmanedit says:

Thanks for the information about eLife, Ian. I’m amazed you manage such a quick decision when nearly a third of the time is taken up with the initial filtering (triage). You presumably don’t give reviewers very long? And are manuscripts ever revised by the authors after review, and re-reviewed? I can only assume that major revisions are never requested given these times.

The Royal Society example is only for one paper so I wouldn’t like to generalise from it. I’m glad eLife is being open, though. It is important that the dates given on a published paper reflect the history of the paper accurately.

Fantastic blog. I’ll be forwarding this round my department and to some colleagues who are setting up a journal at the moment.

“If lots of people decline, you’re not going to have a short review time”.

Arrggghhh, I am subject to this at the moment. I’ve had a paper in review for ~8 months. One reviewer has apparently provided their comments, but the journal can’t get another to review it.

I’ve had both ends of the spectrum though. I had an acceptance in 4 days last week (albeit a letter to the editor) and after 3 rejections from other journals (spanning an entire year of effort) had a quite large review paper reviewed and accepted ‘with revisions’ in 3 days accompanied by extensive feedback and suggestions for amendments from the reviewers.

sharmanedit says:

I’m delighted that you like my blog, James, and I hope it will be helpful to your colleagues.
I’m sorry to hear about this experience. When I was an in-house editor we occasionally had to make a decision on only one report if we couldn’t get any more in a reasonable time (which we defined as nearer 3 months than 8). This is a difficult decision to make as the editors don’t really have enough information. Sometimes they call on an(other) member of the editorial board to help decide, and sometimes they keep on trying to get another report. I hope they’re keeping you in touch with their efforts. And of course the usual timings differ between fields.

Nice post, Anna, and it would be great to have more transparency in this area. The average decision times journals post can be very misleading because they include all categories of decision. As Bob has mentioned above, if a journal rejects a lot without external review the times can look very good. My impression is that a number of journals are rejecting more at the editorial triage stage to cope with both increasing submissions and increasing demands on editorial time, e.g. running and checking CrossCheck reports to look for textual duplication and checking images for inappropriate image manipulation. So the times for any one journal can look to be improving when in reality they aren’t.

Journal policy on what warrants an ‘accept with revision’ or ‘reject + resubmission encouraged’ also affects time to first decision (and annual submission number) because the latter can get a decision very quickly (e.g. editor review or single external review). Some journals may game this to improve their decision times – that’s not ethical and can even have serious consequences for the authors in establishing priority – but many journals have specific policies in place with well-thought-out reasons and guidelines on where the decision boundary falls. In fairness to authors, a journal’s policy needs to be applied consistently across its editorial board and the decisions they make, which can be challenging, and is one of the areas where new editors need guidance.

Another thing that is probably starting to impact decision times is the introduction of manuscript-plus-reviews transfer (cascading) within groups of journals. The larger the number of decisions that are based on transfers without further review, the better the times will look. Biology Open accepts transfers from the other CoB journals and it would be interesting to know what proportion of its submissions and decisions result from those.

sharmanedit says:

Thanks Irene. You’re right that if resubmissions are included in the submissions the decision times can be artificially decreased, because these manuscripts take less time to review. I would hope that journals would separate out these two types when reporting statistics.
It is an excellent point that cascading might also affect the reported times – I hadn’t thought of that either. This could indeed be one reason for Biology Open’s speed. I’d really like to hear from them on this question.

Mary Purton says:

FEBS Open Bio had an average time to first decision of 20 days in 2012. We have recently recruited more Editors, and so hope to improve on this time for 2013.

This average does include some papers (19%) that were transferred from other FEBS journals with their reviewer reports. For these papers, the decision time is much shorter, and can be just a few days. This does skew the average slightly, but we hope to keep reviewing time under 28 days for all papers.

sharmanedit says:

Thanks very much for this information about FEBS Open Bio, Mary. It is especially helpful that you admit that this average is skewed by transferred papers.

At F1000Research, papers are put online after an initial screening step, and peer review starts at this moment. This first step has so far taken on average 6 days, including formatting for publication.

From that point, it has so far taken successful papers on average 8.6 days to reach two positive referee reports (at which point the paper is indexed in external databases).

sharmanedit says:

Thanks very much for this information about F1000Research. The process is obviously somewhat different at this journal, so it may not be strictly comparable, but the process is very fast nevertheless. I wonder if open peer review for an already published paper is more likely to attract reviewers who report quickly? Or is the speed to do with enthusiasm for the new model, as suggested in the main post?
(I should declare an interest here: I have helped F1000Research with their initial checking of submissions on a freelance basis.)

[…] Sharman wrote a couple of excellent posts about time to first response for journals and time to publication after acceptance for journals. Following up my previous post […]

sharmanedit says:

Jason McDermott has now posted a revised analysis of the timings for his manuscripts in computational/systems biology (see his third comment above). Although it’s a relatively small sample it is still informative. If other authors did the same the combined results could give us a much fuller picture than I can get from journal websites alone. Thanks Jason!
I note for reference here that the mean time to first decision in this revised analysis was 1.9 months and the standard deviation was 24 days (n=13). As Jason says, this is longer than the times reported by the journals in my spreadsheet, and the reason for that isn’t clear. It could be because these are real times rather than the promises I have collated and/or because his sample of journals is different.
So, who would like to do an equivalent analysis for their own manuscripts? Post it on your own blog and link to it here (or tweet me) so that I see it. I’ll collate any that I find and report back.

luispedro says:

I recently did a little investigation for PLoS (submission to acceptance according to what their papers say). Not quite “to first decision”, but perhaps relevant too:

But this only covers accepted papers of course.

Lola says:

I think it would be great to have journals publish this information alongside their impact factor so it could help us decide where to submit to.

I submitted a paper in February 2013 and I am still waiting for a first decision. It is annoying because a) the work was done in response to interest, and I was hoping to share the peer-reviewed work rather than a report… but it seems like, by the time the paper is published, the interest will be gone,
b) I’ll be applying for jobs soon and “accepted” looks so much better than “submitted” on a CV,
c) I’ve just submitted another paper which could really do with citing this first one, but the journal I’ve sent to doesn’t allow citation of an unpublished paper (sensibly).

I think it is a great idea to start recording these.

sharmanedit says:

Lola, six months does seem a very long time for a first decision, at least for a biomed journal. Do you know how long most papers take to be published or accepted in this journal? Is this an unusually long time for them? If it is unusually long, have the editors been in touch at all? I suggest emailing them and asking for an update, pointing out that this is unusually long and that the delay is causing you problems. It may be that they are having problems getting the reviewers to send their reports, or that the manuscript has somehow got lost if their tracking system isn’t very good (I have seen this happen occasionally). I hope you get a decision soon.
Your comment makes me wonder if a ‘journal speed index’ would be a good idea – it would have to be independently administered and would take a lot of work, but it would really help authors decide.

As part of their Journal Insights project, Elsevier is including metrics about a journal’s reviewing and publication times. You can see an example for FEBS Letters here. The review speed metric shows the time from submission to first decision and from submission to final decision for the past 5 years.

Chris McNorgan says:

This is exactly what I was looking for. I am surprised there is not more information on this. In my field (neuroscience), some journals are notoriously slow — I’ve waited over 6 weeks on two occasions for papers that didn’t even go out for review. Had I known I was in for that wait, I’d have sent it elsewhere. I was (still am) considering creating a public database where individuals can provide this information for journals that do not, and to keep honest those that do.

sharmanedit says:

Thanks for this comment, Chris. It would be great if someone (you, me, anyone) could set up such a database. Like you, I’d like to, but it would take quite a lot of work. One day maybe!

Malhar N Kumar says:

I had submitted a manuscript to a UK-based journal called “The Knee”. I received the initial reviews after a very decent 4 weeks. Both reviews were favourable and asked for very minor corrections. I submitted the corrections promptly and the manuscript was sent back to the same two reviewers. They once again responded with few comments. However, the handling editor then sent me a list of corrections to make after the second round of reviews! Nearly two months after submitting the corrections requested by the handling editor, there was still no reply from the journal regarding a decision. I decided to withdraw the paper and resubmit it elsewhere. Truant and sloppy peer reviewers are well known, but what about truant editors? Has anyone had this experience?

Upaka Upaka says:

I just want to know the signs that could indicate whether your journal is accepted or not

sharmanedit says:

Hi Upaka. Could you explain what you mean by ‘whether your journal is accepted’? Perhaps you mean ‘whether your paper has been accepted’?

FEBS Open Bio is now displaying its time to first decision on its home page.

sharmanedit says:

This is excellent news, and thanks for letting me know. Good fast times too – 3 weeks to first decision, 1 week acceptance to publication. I had noticed that some Elsevier journals were showing these statistics on their websites, and I gather this is gradually being rolled out to all. Impact metrics are also shown for some journals, though presumably not for new journals like FEBS Open Bio. This is something I wish all publishers would do.

[…] popular posts on this blog by far have been the surveys of journals with respect to their speed (of review and publication), impact metrics and charges (for open access and other things), so I will be doing […]
