# Peer-review at Quantum – analyzing the data

This is a Perspective.

By Christian Gogolin (ICFO-Institut de Ciencies Fotoniques, The Barcelona Institute of Science and Technology, 08860 Castelldefels (Barcelona), Spain).

Many of you asked: “How long does it take to get published in Quantum?” We at Quantum are equally curious, as we want to provide the community with the most efficient peer-review process that is compatible with Quantum’s standards for quality and fairness. The honest answer is “It depends.” More on the nitty-gritty details below, but for the impatient, the executive summary is as follows:

For manuscripts that are eventually accepted, it takes 138 days (median) and 2.2 peer-review rounds (mean) from submission to decision. For eventually rejected manuscripts it takes 69 days (median) and 1.4 rounds (mean). The decision ending the first round is made after 65 days (median). The distributions, however, are very broad. Most acceptance decisions are made in less than 200 days (see Figure 1) and virtually all rejections take less than 160 days (see Figure 2).

[Bar chart; x-axis: days from submission to decision, y-axis: number of cases.]

Figure 1: Number of days between submission and decision for all 49 manuscripts whose peer-review process concluded with a decision of accept.

[Bar chart; x-axis: days from submission to decision, y-axis: number of cases.]

Figure 2: Number of days between submission and decision for all 41 manuscripts whose peer-review process concluded with a decision of reject.

## The time with Authors, Editors, and Referees

How is the time shared between the different parties interacting in the peer-review process? Eventually accepted manuscripts spend a median of 40 days with the Authors, 14 days with the Editors, and 86 days with the Referees (see Table 1 for more details). Overall, rejected manuscripts spend about the same median time with Editors (13 days), but less time with Referees (43 days) and Authors (25 days for manuscripts that are resubmitted at least once), which is a consequence of them going through fewer rounds. Our Editors spend proportionally more time on manuscripts that are ultimately rejected after a resubmission, which is an indication that these are more frequently borderline cases that require additional internal discussion.

| Decision | Number | Median days with Authors | Median days with Editors | Median days with Referees |
|---|---|---|---|---|
| accept | 49 | 40 | 14 | 86 |
| reject | 41 | 0 | 13 | 43 |
| reject (resubmitted) | 14 | 25 | 22 | 56 |
| overall | 92 | 22 | 14 | 61 |

Table 1: Number of manuscripts and median number of days spent with Authors, Editors, and Referees for all manuscripts whose peer-review process concluded with a decision of accept or reject, for those that were rejected after at least one resubmission, and combined for all manuscripts whose peer-review process concluded.

The median number of days between an invitation to review and Quantum receiving a referee report is 29 (over all 323 referee reports that could be used in the same round), but the maximum was as high as 129 (see Figure 6). A rather broad distribution of refereeing times is unavoidable in a journal that serves a broad range of sub-communities and publishes different types of articles, ranging from short letter-style works to very long and technical contributions. Referees are sent automatic reminders every 14 days. While reminders are useful, and often necessary, we also aim to keep them as friendly and non-coercive as possible. The relationship between the journal and its voluntary Referees is crucial to the success of the reviewing process.

Excluding desk rejected manuscripts, our Editors invited a mean number of 3.8 Referees in the first round, but the highest number was 11. This is a main reason why the distribution of processing times is so broad. An extreme example is a manuscript for which 13 invitations to review had to be sent out and which ultimately spent 103 days with the Referees and 99 days with the Editors until it was finally accepted after countless discussions involving several Editors, 2 rounds, and 209 days. At the other end of the spectrum is a manuscript that spent just 2.3 days with our Editors and 49 days with the Referees, and could be accepted (conditional on minor revisions) after round 1.

The analysed data contains 49 manuscripts that were eventually accepted, 41 that were eventually rejected, and 2 that were withdrawn. The rejections caused 6 appeals, none of which was successful. 13 of the rejected manuscripts were desk rejected in the first round without the involvement of Referees. In total, 325 referee reports were obtained and evaluated by our Editors (2 arrived only after the current peer-review round had concluded) for which our Editors had to send out a total number of 661 invitations to referee. Of all decisions to reject, 34 were made at the end of the first round, 14 at the end of round two and 1 in round three.

## Conclusions and consequences

Directly jumping to the conclusions, based on the statistical analysis, we mainly want to improve in two ways:

1. Rejections after the second round or later should happen even less frequently.
2. Avoidable delays should be minimized.

Why are these important points and how is Quantum going to improve the peer-review experience for its Authors?

On 1: It is frustrating for Authors to go through two rounds of refereeing and then be rejected, especially if the reasons for rejection were already more or less clear after the first round. In the beginning, some decisions of “revise and resubmit” were taken out of caution and despite a very negative outlook. This was sometimes not properly communicated. In addition, Referees often do not have a good understanding of the selection criteria and level of selectivity Quantum is aiming for, which can make their reports sound more positive to the Authors than how the Editors interpreted them. We want to improve the situation without encouraging more potentially unjust and opaque rejections that would not give the Authors a chance to reply. In the future, our Editors are encouraged to clearly indicate necessary conditions for acceptance when they choose “revise and resubmit” after the first round. It is then up to the Authors to decide whether they want to take the risk of being rejected after a lengthy second round; this also leaves them the option to resubmit in case they can really improve their results or refute the criticism of the Referees, without having to resort to an appeal.

On 2: Unfortunately, several manuscripts incurred avoidable delays. To help our volunteer Editors stay on top of things, Quantum has now hired two management assistants, who monitor the progress of manuscripts and ensure that they are not “lying around”. The time at which they took up their duties roughly coincides with the end of the time frame of the dataset underlying the current analysis, so we will be able to see the impact of this measure in a follow-up study.

How big a problem are manuscripts that spent more time with our Editors than is usual? We, the Executive Board, think that about 20 days is a very good figure for the time manuscripts spend with the Editors – even for smooth peer-review processes without any unexpected complications. This is the minimum time Authors should be willing to grant our Editors, given that they are all volunteers, full-time researchers, and busy academics. In fact, we are very happy to report that this is indeed achieved for nearly 2/3 of all manuscripts (see Figure 5). There will always be exceptional cases that require internal discussions and possibly policy changes, so it is nearly unavoidable that some manuscripts take significantly longer. This is reflected in the time from submission to invitation of the first Referee (see Figure 7). Except for five unusual cases, Referees were invited within 35 days of submission, and in the vast majority of cases this happens in under two weeks. Like all people working in academia, our Editors have highly fluctuating workloads, need to meet important deadlines, and have a multitude of other responsibilities in their professional and private lives. Quantum values their expertise and prefers that they make informed and fair editorial decisions rather than rushing to conclusions. The analysis, however, also shows that no fewer than 17 of the 163 manuscripts spent over 40 days with the Editors, and 5 of these even spent over 70 days. Here, some analysis beyond mere statistics seems necessary to understand what happened.

[Bar chart; x-axis: days during first round, y-axis: number of cases.]

Figure 3: Number of days it took to complete the first round of refereeing for all 126 manuscripts whose first round concluded. The majority of Authors receive their first decision within less than 80 days.

[Bar chart; x-axis: days with Authors, y-axis: number of cases.]

Figure 4: Number of days spent with the Authors for all 92 manuscripts whose peer-review process concluded with a decision of accept or reject. Most manuscripts spend less than 60 days with the Authors.

[Bar chart; x-axis: days with Editors, y-axis: number of cases.]

Figure 5: Number of days spent with the Editors for all 92 manuscripts whose peer-review process concluded with a decision of accept or reject. Almost 2/3 of all manuscripts spent at most 20 days with our Editors, but the distribution has a fat tail. A number of manuscripts took significantly longer, spending 40 days or more with our Editors.

[Bar chart; x-axis: days until Referees submit, y-axis: number of cases.]

Figure 6: Number of days between invitation of a Referee and submission of the corresponding report for all 323 reports that were submitted within the same peer-review round. Referees are sent automatic reminders every 14 days. There appears to be a typical timescale of about 40 days on which most Referees submit, but some Referees take much longer.

[Bar chart; x-axis: days until first Referee invited, y-axis: number of cases.]

Figure 7: Number of days between submission and invitation of the first Referee for all 144 manuscripts for which at least one Referee was invited in the first round. In most cases it takes less than 10 days to find a handling Editor and invite Referees.

There are several reasons for the delay on the 17 manuscripts that took longer than usual. Many rounds, intense internal discussions to agree on selectivity criteria before decisions and after appeals, and slow Referees (see Figure 6) all played a role. Four of the five manuscripts spending over 70 days with the Editors also spent between 100 and 133 days with the Referees, and in three cases 7, 11, and 13 invitations to review had to be sent out to reach a decision. In several cases it took longer than normal to find an appropriate Editor. In at least four of the cases, a further delay was incurred because the Editors discussed among themselves in great detail whether the manuscript meets Quantum’s selectivity criteria (something the Steering Board explicitly encourages and which will remain important for guaranteeing a fair application of the selectivity criteria). It may be at least a small consolation for the patient Authors that all five manuscripts that spent so much time with our Editors were ultimately accepted.

In at least 11 of the 17 manuscripts that spent more than 40 days with the Editors, however, the Editors also carry responsibility for the slow peer-review process. In each of these cases at least one, and often several, of the following factors played a role: too few Referees were invited (due to a lack of experience with the fraction of Referees who can typically be expected to eventually submit), it took too much time to find a handling Editor, or the Editor did not manage to invite additional Referees in a timely manner after some of those initially invited had declined. The Executive Board is confident that the measures described above and the growing experience of our Editors will lead to a sustained improvement of the refereeing process at Quantum, as seen from the Authors’ perspective.

If you now wonder why Quantum had to send out 49 rejection letters to reject just 41 manuscripts, of which just 6 had to be rejected twice because of appeals, then you are ready to enter the rabbit hole of peer-review process analysis. In that case, please read on…

## Methodology: analysis of the peer-review process

Counting times during the peer-review process and splitting responsibilities for the different time intervals between the involved parties is a surprisingly non-trivial task. The following is an attempt to come up with an objective, procedural, and reasonable scheme of measuring the time a manuscript has spent with Editors, Referees, and Authors that does not require an in-depth analysis of the content of the communication exchanged between these parties. This scheme underlies the whole analysis presented here.

### Background and terminology

Let us start by recapitulating some background information and establishing a solid terminology: At Quantum, the peer-review process is organized in rounds that start with the (re)submission of a manuscript or an appeal and conclude with a decision by the Editor. In each round, the Editor must choose between three possible decisions: accept, reject, and revise and resubmit. Manuscripts whose last round has ended with a decision of accept/reject count as accepted/rejected and we consider their peer-review process as concluded. All other manuscripts are considered under consideration. At least in the first round, the Editor should base their decision on at least two referee reports, but they also have the option to desk reject without involving any Referees.

Referees are invited to review a manuscript and are supposed to accept or decline this invitation, but they can, of course, also simply ignore it. Editors can further choose to revoke an invitation at any point in time. Authors can choose to withdraw their manuscript (this ends the current round and the peer-review process as a whole), and to appeal against an editorial decision. The appeal is then reviewed and, if it is granted, the Authors can resubmit (a revised version of) their manuscript.

A specialty of Quantum is its Coordinating Editors, who initially receive all submitted manuscripts and whose main task is to pick a suitable handling Editor (one who is knowledgeable in the area and free of conflicts of interest), but who can also decide to handle a manuscript themselves. The Coordinating Editors also initiate the majority of desk rejections, but it is the norm, rather than the exception, that multiple Editors are involved in all kinds of editorial decisions.

In reality, every peer-review process is unique. For example, one report can turn out to be of such low quality that the Editor will seek to obtain a third report after two reports have already been submitted. In another case, to avoid a long delay, the Editor may instead choose to move on to the next round with just a single report, even if a second report is still outstanding. In subsequent rounds, often a single report is sufficient and hence only one Referee is invited. All these possibilities must be taken into account. The multitude of possible histories of a manuscript is what makes measuring times difficult.

The situation is further complicated by the fact that it is technically possible for Authors to resubmit previously rejected manuscripts without an appeal (this makes sense for rejections on formal grounds, such as the manuscript not being on the arXiv at the time of submission), and that decisions by Editors can be reverted (for example, in case they were made accidentally).

### Time with Authors and Referees

The time spent with the Authors is easy to count. It is the time between rounds, i.e., from decision to resubmission or submission of an appeal. In the case of acceptance, it additionally includes the time between the final decision and the day the final version, ready for publication, is uploaded to the arXiv. This very slightly underestimates the actual time spent with the Authors, as they also have to write an email to notify Quantum of the upload, which does not always happen on the same date. This analysis does not count the time between upload and publication; unlike at other journals, where many weeks can pass between submission of the final page proofs and the actual publication, Quantum typically publishes articles within four days of being notified of the availability of the final version.

The time spent with the Referees is also easy to count. It is taken to be the sum of all times during which at least one Referee was active.
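This union-of-intervals rule can be implemented with a standard interval merge: overlapping stretches during which several Referees were active are counted only once. The following is a minimal sketch under that reading (the function name and day-based intervals are illustrative, not part of Quantum's actual tooling):

```python
def days_with_referees(intervals):
    """Sum the union of (start, end) day intervals during which
    at least one Referee was active; overlaps are counted once."""
    total = 0.0
    current_start = current_end = None
    for start, end in sorted(intervals):
        if current_end is None or start > current_end:
            # Disjoint interval: close the previous run, start a new one
            if current_end is not None:
                total += current_end - current_start
            current_start, current_end = start, end
        else:
            # Overlapping interval: extend the current run
            current_end = max(current_end, end)
    if current_end is not None:
        total += current_end - current_start
    return total

# Two overlapping Referees (days 0-30 and 20-50) plus a later one (60-70):
# the overlap is counted once, giving 50 + 10 = 60 days with the Referees.
print(days_with_referees([(0, 30), (20, 50), (60, 70)]))  # prints 60.0
```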

### Time with Editors

Measuring the time spent with the Editors is much more complex. The time-measuring scheme proposed and employed here is based on the notion of active and inactive Referees: A Referee becomes active when they are invited to review a manuscript. They become inactive when their invitation is revoked, when they decline, when they submit, or after 30 days (timeout) if they have neither accepted in the meantime nor later submit a report in the same round. If they accept after the timeout, they become active again.

To illustrate the concept of active and inactive Referees, consider the fictitious, example timeline in Table 2:

| Event | Active Referees | Submitted Reports | Time after event counts towards time with |
|---|---|---|---|
| Submission | 0 | 0 | Editor |
| Coordinating Editor assigns Editor | 0 | 0 | Editor |
| Referee A invited | 1 | 0 | Editor and Referees |
| Referee B invited | 2 | 0 | Referees |
| Referee A accepts | 2 | 0 | Referees |
| Referee B declines | 1 | 0 | Editor and Referees |
| Referee C invited | 2 | 0 | Referees |
| No response from Referee C after 30 days | 1 | 0 | Editor and Referees |
| Referee A submits | 0 | 1 | Editor and Referees |
| Referee D invited | 1 | 1 | Referees |
| Referee C accepts | 2 | 1 | Referees |
| Referee D declines | 1 | 1 | Referees |
| Referee C submits | 0 | 2 | Editor |
| Decision of revise and resubmit | 0 | 2 | Authors |
| Resubmission | 0 | 0 | Editor |

Table 2: Events from a fictitious example history of a manuscript, the number of active Referees and submitted reports, as well as the parties towards which the following time interval is counted.
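The activation rules above can be read as a small state machine. The sketch below replays a timeline like that of Table 2 and reproduces its "Active Referees" column; the event names are illustrative, and timeouts are given as explicit events rather than computed from timestamps:

```python
def replay(events):
    """Replay (event_type, referee) pairs and return the number of active
    Referees after each event, per the rules above: an invitation (or a
    late acceptance after timeout) activates a Referee; declining,
    revocation, submission, or timeout deactivates them. Events without
    a Referee (e.g. submission) leave the count unchanged."""
    active = set()
    counts = []
    for event, ref in events:
        if event in ("invited", "accepts"):
            active.add(ref)          # activation (add is idempotent)
        elif event in ("declined", "revoked", "submitted", "timeout"):
            active.discard(ref)      # deactivation
        counts.append(len(active))
    return counts

# The Referee events of Table 2 (timeout of C made explicit)
timeline = [
    ("invited", "A"), ("invited", "B"), ("accepts", "A"),
    ("declined", "B"), ("invited", "C"), ("timeout", "C"),
    ("submitted", "A"), ("invited", "D"), ("accepts", "C"),
    ("declined", "D"), ("submitted", "C"),
]
print(replay(timeline))  # prints [1, 2, 2, 1, 2, 1, 0, 1, 2, 1, 0]
```

Note that Referee C's late acceptance after the timeout re-adds them to the active set, exactly as in the corresponding row of Table 2.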

The time spent with the Editors should be the combined time during which further action is needed to ensure that sufficiently many reports can be expected to arrive to make a decision, plus the time between the receipt of sufficient reports and the decision. Here we try to capture this time as follows (appeals are treated later):

If at any point more than one Referee was active, we assume that the Editor wanted to obtain at least two reports. We hence count towards the time with the Editor all times during which the number of active Referees plus the number of submitted reports is less than two, plus the time between the last Referee becoming inactive and the decision. In an orthodox peer-review process, this includes, for example, the time between (re)submission and invitation of the second Referee and, if all promised reports are actually submitted, the time between the submission of the last report and the decision. In less orthodox cases, it also includes, for example, the time between one of initially two Referees declining and the invitation of a replacement, as well as the time between the submission of the last report and the decision, even if another Referee had an outstanding invitation from more than one month before but failed to respond. If, however, this slow Referee accepts after the 30-day timeout and the submission of the first report, they become active again, and the time waiting for their report after acceptance of the invitation is not counted towards the time spent with the Editor. In the time before the Referee becomes active again, the number of submitted reports plus the number of active Referees was less than two, and this time is hence counted towards that of the Editor.

If at no point more than one Referee was active, it is assumed that the Editor thought a single report would be sufficient to conclude the round. In this case, all times with no active Referee are counted towards the time of the Editor. This is obviously a fair way of counting in the orthodox case of one Referee being invited, accepting, and submitting a report, with a subsequent decision by the Editor. A non-orthodox case could be the invitation of a second Referee after the first report was obtained (for example, because the first report was not conclusive enough). Here, the described scheme puts the blame for the delay on the Referee and only counts the times without an active Referee towards the time spent with the Editor. If a second Referee is invited while the first is still active, we fall back to the case of at least two active Referees, and the blame for the delay until the second invitation is then put on the Editor.
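The attribution rule just described can be sketched in simplified form. Here the round is assumed to be pre-segmented into stretches of constant Referee activity, and the tail between the last Referee becoming inactive and the decision is passed in separately; all names are illustrative, not Quantum's actual tooling:

```python
def editor_time(segments, tail_days, two_reports_sought=True):
    """Days counted towards the Editor for one round.

    segments: list of (duration_days, n_active_referees, n_submitted_reports)
              covering the round up to the last Referee becoming inactive
    tail_days: days from the last Referee becoming inactive to the decision
    two_reports_sought: True if more than one Referee was ever active
    """
    # Two-report rounds: stretches where active + submitted < 2 lie with
    # the Editor. Single-report rounds: only stretches with no active Referee.
    threshold = 2 if two_reports_sought else 1
    waiting = sum(days for days, active, submitted in segments
                  if active + submitted < threshold)
    # The time from the last report (or last Referee going inactive)
    # until the decision always lies with the Editor.
    return waiting + tail_days

# Orthodox two-report round: 3 days until both Referees are invited,
# 40 days of refereeing, 5 days from the last report to the decision.
print(editor_time([(3, 0, 0), (40, 2, 0)], tail_days=5))  # prints 8
```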

In the case of a successful appeal, the time it took the Editors to evaluate the appeal should be counted towards the time of the Editor. We do not have a reliable timestamp for when the notification of a granted appeal is sent to the Authors, as this has sometimes been sent outside the automated system of Scholastica. Luckily this does not matter much, as all time from the submission of the appeal (which is recorded in Scholastica and which starts a new round) until the resubmission is in any case counted towards the time of the Editors by the standard scheme. We are just slightly overestimating the true time spent with the Editors, as there might be a short gap between the granting of an appeal and the resubmission by the Authors (this time should rather be counted towards the time of the Authors), and there can be a 1-2 day gap between the submission of an appeal and the notification of the Editors by the Steering Board (which initially receives the appeal and supervises its correct handling). The data we are analysing here does not actually contain any successful appeals.

In the case of unsuccessful appeals, we do count the time until the last recorded action on the manuscript (such as any discussion between the Editors or between the handling Editor and the Authors) towards the time of the Editors.

Note that by counting times in this way, the sum of the times spent with the Authors, Editors, and Referees is usually larger than the time between the initial submission and the final decision, because some times are counted towards multiple parties and the time with the Authors also includes the time until the upload of the final version. In rare cases, the opposite can also happen (see the Section Caveats below).

## The analysed data

The analysis presented here is based on 5548 events with timestamps generated by the peer-review process from the beginning of the journal until 2018-01-25 16:13:19 UTC. This has been supplemented with data on the times at which Authors uploaded the final version of accepted manuscripts to the arXiv. From this data trove, events corresponding to fake manuscripts that were used to test the Scholastica system were removed. Further, the events of three manuscripts were discarded: in two cases because the manuscript was erroneously submitted several times by the Authors, and in one case because the initial invitation of one of the Referees was not correctly recorded, making it impossible to calculate the times spent with the Editors and Referees. Rejected manuscripts for which no appeal was received by 2018-01-25 16:13:19 UTC are considered as not having caused an appeal.

### Detailed statistics

Due to the broad distributions of the times manuscripts spend with the various parties, ranging from essentially zero days for desk-rejected manuscripts (thanks to our generally very fast Coordinating Editors) to over 70 days with the Editors for the extreme cases discussed above, median values are more informative than means. Median and mean values for various times of several interesting subsets of manuscripts are provided in Table 3 and Table 4:

| Decision | Number | Rounds | Authors | Editors | Referees | Total |
|---|---|---|---|---|---|---|
| accept | 49 | 2.2 | 50.51±43.79 | 26.1±25.98 | 88.76±38.21 | 143.17±56.82 |
| reject | 41 | 1.37 | 10.08±21.17 | 20.36±19.07 | 45.43±41.98 | 74.68±60.62 |
| reject (no appeal) | 35 | 1.23 | 8.45±21.05 | 18.04±18.94 | 43.33±40.64 | 68.41±56.75 |
| reject (no desk) | 28 | 1.43 | 11.19±19.38 | 24.47±20.53 | 63.5±36.04 | 97.54±49.74 |
| reject (resubmitted) | 14 | 2.07 | 29.53±27.57 | 25.15±17.06 | 66.54±45.82 | 121.15±62.56 |
| overall | 92 | 1.82 | 31.44±40.36 | 23.05±23.08 | 67.52±45.91 | 109.65±68.66 |

Table 3: Number of manuscripts, mean number of rounds in the peer-review process, and the mean and sample standard deviation (not the standard error of the mean) of the number of days spent with Authors, Editors, and Referees, as well as between submission and decision, for manuscripts whose peer-review process concluded with the stated decision of accept or reject (all rejections, those that did not cause an appeal, those that were never desk rejected, and those that were resubmitted at least once), as well as for all manuscripts. The large standard deviations make it obvious that most of the mean values have only limited informative value.

| Decision | Number | Rounds | Authors | Editors | Referees | Total |
|---|---|---|---|---|---|---|
| accept | 49 | 2.2 | 39.89 | 14.42 | 85.92 | 138.01 |
| reject | 41 | 1.37 | 0 | 12.88 | 43.21 | 69.06 |
| reject (no appeal) | 35 | 1.23 | 0 | 12.26 | 39.08 | 63.72 |
| reject (no desk) | 28 | 1.43 | 0 | 15.07 | 55.5 | 87.15 |
| reject (resubmitted) | 14 | 2.07 | 25.03 | 22.27 | 55.94 | 121.65 |
| overall | 92 | 1.82 | 22.18 | 13.7 | 61.05 | 109.62 |

Table 4: Number of manuscripts and median number of days spent with Authors, Editors, and Referees, as well as between submission and decision, for manuscripts whose peer-review process concluded with the stated decision of accept or reject (all rejections, those that did not cause an appeal, those that were never desk rejected, and those that were resubmitted at least once), as well as for all manuscripts.
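The preference for medians over means with such broad, fat-tailed distributions is easy to demonstrate with Python's standard library. The numbers below are hypothetical, not drawn from the dataset:

```python
import statistics

# Hypothetical per-manuscript days with the Editors: mostly fast,
# with a fat tail of a few extreme cases (compare Figure 5)
days = [5, 8, 10, 12, 14, 15, 18, 20, 70, 133]

print(statistics.median(days))  # prints 14.5 (robust to the tail)
print(statistics.mean(days))    # prints 30.5 (dragged up by two outliers)
```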

## Caveats

There are a number of caveats that should be taken into account when interpreting the data discussed above:

• Decisions of “revise and resubmit” with a very negative outlook, which the Authors took as a rejection, are still counted as “revise and resubmit”. This makes us underestimate the true rejection rate and the number of manuscripts whose process has concluded. This can only be rectified in an upcoming analysis of data over a longer time frame, as we have to wait for Authors to resubmit, which can take up to 32 weeks (see Figure 4). In a follow-up study, all manuscripts that have not been resubmitted a certain number of days after a decision of revise and resubmit can be counted as rejected.
• After a rejection, Authors can choose to appeal after an, in principle, unbounded amount of time. It is thus possible that some manuscripts whose peer-review process we consider as concluded in this analysis will actually undergo another round of peer review after a future appeal. The number of cases in which this will happen, however, will be so small that it will not significantly influence the conclusions drawn.
• The number of rejections also includes those against which an appeal was later submitted. A rejected manuscript with an unsuccessful appeal thus generates two rejection events in subsequent rounds. A manuscript that was first rejected on formal grounds, then after review, and finally, again, after an unsuccessful appeal can even cause three rejections.
• Sometimes a review invitation directly “bounces” because of an incorrect email address. We count these as not sent out at all, because the Editor is notified in a timely manner and should correct the mistake or invite another Referee essentially immediately.
• The sum of the times of the Authors, Editors, and Referees, which is usually greater than the total time from submission to final decision, can, in rare cases, also be shorter. A scenario in which this happens is that in which an Editor already has two reports, but then decides that a third report is needed (e.g., because the two reports are of insufficient quality or otherwise inconclusive) and consequently invites a third Referee and waits for this third report before making a decision. In this case, the time between the submission of the second report and the invitation of the third Referee is counted neither towards that of the Editor (already two reports were obtained and the time between submission of the last report and the decision does not cover this time interval) nor the Referees (no Referee is active at that point). In fact, this does seem to be a fair way of counting. The Editor could not know in advance that the two initial reports would be inconclusive, but the Referees can also not be blamed for the delay.
• Some more general discussions on editorial policies, which were sparked by specific manuscripts and had to be resolved before decisions could be taken, are not contained in the analysed data, as they happened through different parts of the Scholastica platform. These discussion periods may thus unjustly appear as periods in which the Editors were seemingly inactive.

## Acknowledgements

C.G. would like to thank Lídia del Rio, Marcus Huber, Lukas Schalleck, and Manuela Gogolin, as well as the whole Editorial Board (Guido Burkard, Steven Flammia, Aram Harrow, Christian Kurtsiefer, Matthew Leifer, Chiara Macchiavello, Antonio Acín, Carlo Beenakker, Agata Branczyk, Nicolas Brunner, Daniel Burgarth, Eric Cavalcanti, Gabriele De Chiara, Ivette Fuentes, Sevag Gharibian, Khabat Heshami, Chris Heunen, Stacey Jeffery, Ashley Montanaro, Milan Mosonyi, Ahsan Nazir, Román Orús, Saverio Pascazio, Marco Piani, Sven Ramelow, Joseph M. Renes, Jörg Schmiedmayer, Volkher Scholz, Ujjwal Sen, Jens Siewert, John A Smolin, Aephraim M. Steinberg, Marco Tomamichel, Thomas Vidick, Francesca Vidotto, Michael Walter, Alexander Wilce, Andrew White, Witlef Wieczorek, Karol Życzkowski) and Steering Board (Anne Broadbent, Harry Buhrman, Jens Eisert, Debbie Leung, Chaoyang Lu, Ana Maria Rey, Anna Sanpera, Urbasi Sinha, Robert W. Spekkens, Reinhard Werner, Birgitta Whaley, Andreas Winter) for corrections as well as comments and feedback on this analysis (special thanks to Debbie Leung, Anna Sanpera, and Thomas Vidick). This analysis would not have been possible without the help of Anna LeSuer and the Scholastica technical team, who provided the raw data and gave us the permission to perform and publish this analysis.

## Raw data

Below each bar chart, the plotted data is contained in the HTML code of this page inside a div element with class bar-chart-data. Below this paragraph, the HTML source code of this document contains a div element with class raw-data that contains an HTML-escaped, JSON-encoded array representation of an anonymized version of the raw data analyzed above. The array contains, for each manuscript (sorted in random order), an array for each round of the peer-review process of that manuscript, which in turn contains an array of events with timestamps (in seconds since the beginning of the first round), the event type, and, if applicable, what was done in that event (for example accept for a decision event) and by whom (a salted PBKDF2/SHA256 hash of the identifier of that person that is unique to the peer-review process of the given manuscript, but which cannot be used to identify persons across processes of different manuscripts or to associate them with their real-life identities). Feel free to analyze it yourself!
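For orientation, a record in the described export could be walked as follows. This is a hedged sketch: the exact per-event field order (assumed here to be [seconds since first round, event type, detail, actor hash]) should be verified against the actual raw-data div before use:

```python
import json

def summarize(manuscripts):
    """For each manuscript (a list of rounds, each a list of events),
    return (number of rounds, type of its last event, days until it).
    Assumes each event starts with [seconds, event_type, ...]."""
    out = []
    for rounds in manuscripts:
        # The chronologically last event of the last round
        last_event = max(rounds[-1], key=lambda e: e[0])
        out.append((len(rounds), last_event[1], last_event[0] / 86400))
    return out

# Tiny hypothetical record in the assumed format: one manuscript,
# one round, accepted 65 days (5,616,000 s) after submission
raw = '[[[[0, "submission", null, "h1"], [5616000, "decision", "accept", "h2"]]]]'
print(summarize(json.loads(raw)))  # prints [(1, 'decision', 65.0)]
```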

### Cited by

On Crossref's cited-by service no data on citing works was found (last attempt 2021-04-21 02:00:23). On SAO/NASA ADS no data on citing works was found (last attempt 2021-04-21 02:00:24).