Science Criticism

eLife: Faster Is Not Always Better

Scientists, as everyone knows, publish their findings in scientific journals. Some of the most venerable, such as Science and Nature, have been around for over 140 years and are frequently seen as the Himalayas of scientific publishing. For a new journal to ascend to these heights takes time, proper guidance, and a lot of luck. A new publication called eLife is challenging these titans right out of the gate by inviting the best biomedical research to be published in its pages.

I attended a symposium organized by eLife in Montreal on November 8, 2013, to find out what was so different about this journal, which had then been around for a year.

What is eLife’s main claim? That its review process is faster and more streamlined. Indeed, peer review can be notoriously long. After an author submits a manuscript to a journal, an editor has to send it to two or three scientists in the field and wait for them to carve non-existent time out of their schedules to read and comment on it. These comments are then sent back to the author, who must either make the requested changes (which may involve performing additional experiments) or justify why a particular reviewer’s comment is irrelevant.

This is often the beginning of a back-and-forth dance between the corresponding author and the reviewers, arbitrated by the editor. eLife’s peer-review process is claimed to be faster, with an initial decision made within three days. Following an initial “yes”, reviewers examine the manuscript and their comments are condensed into a single set of instructions for the submitting author. The entire process takes 79 to 83 days on average.

Fantastic, no? Actually, this desire for speed may lead eLife to cut corners and accept poor studies for publication. The fact that, unlike print journals, eLife faces no restriction on the number of papers it can publish each month only compounds the temptation to publish anything in order to establish itself as a publishing titan. Indeed, eLife’s current initial acceptance rate is 25%, meaning that a quarter of the manuscripts submitted are passed on to reviewers and, most likely, eventually published. The prestigious journal Science, by comparison, accepts fewer than 7% of its submissions [1]. I do not believe that more is better: a quick look at the state of the scientific literature and its overall lack of reproducibility is enough to make one skeptical of the benefit of a 25% acceptance rate.

The question of how eLife can afford to be so quick while other journals lag behind in tackling manuscript submissions is answered by its funding model. eLife was created and is financed by three major research foundations: the Howard Hughes Medical Institute, the Max Planck Society for the Advancement of Science, and the Wellcome Trust. The senior editors of eLife, who inspect the manuscripts and coordinate the peer-review process, are paid by the journal to dedicate a certain amount of time each month to dutifully wading through the manuscript pile. The problem I foresee is that these editors are also full-time researchers and lecturers; some are practicing physicians; others are CEOs of companies or directors of research institutes. Paying someone to do additional work does not conjure more hours in the day out of thin air. I fail to see how the editors at eLife have enough time to dedicate to an accelerated review system. This is a disaster waiting to happen.

To be sure, there are benefits to publishing in eLife. The journal is completely open access (no subscription fee to read its publications); its philosophy is so open that it even encourages authors to self-publish their manuscripts while they are being formatted, to speed up the sharing of discoveries. The display of information on the site is fairly intuitive; then again, most top-tier journals also publish single-column HTML versions of their articles, so eLife is no pioneer there. For now, there are no fees to publish and none to read. Articles have no length limit (virtual pages cost very little). Videos can be embedded within an article. And, for the public, eLife provides a plain-language summary of roughly 300 words for each paper it publishes.

But a major point of contention arises with eLife’s stance against the impact factor. The impact factor is an old, oft-debated metric that assigns a measure of importance to a journal based on how often its publications are cited. This flawed value has often been wrongfully applied to individual publications or individual researchers in a bid to assess their worth. Grant funding and academic tenure often hinge on the impact factor attached to a researcher’s work, even though most scientists know it is hogwash. eLife currently refuses to play the game, instead using an aggregate of metrics, such as the number of times an article has been tweeted or written about in a newspaper, to evaluate its importance post-publication.

As one skeptical researcher in attendance put it during the Q&A, if an article is cited 24 times by studies that refute its conclusions, those citations will still boost the article’s apparent worth. Talking about a paper does not mean endorsing it. A well-known local researcher, Michel Tremblay, expressed doubt that the fast-tracked peer-review process can be maintained in the long term given limited resources and an expected increase in submissions. eLife’s head of marketing and communications countered that the publisher received 120 submissions last month and that its editors are not yet at capacity.

When the 25% acceptance rate was brought to her attention, she commented that the rate is of little concern to eLife, as they trust their editors’ judgement in choosing to publish the very best biomedical and health science findings. But metrics exist for a reason. If you rely on your expert editors to decide what is good and bad while ignoring the discrepancy between your acceptance rate and that of the journals you are trying to emulate, you risk deluding yourself.

Will eLife change the way in which quality papers are published and accessed? I doubt it. If anything, it will become a quick gateway for the publication of more irreproducible research. While I admire what the three funding agencies behind eLife are trying to accomplish, the problem hindering quality publishing is systemic and speeding up the approval process is a step in the wrong direction.


(Feature picture is the eLife logo)


1. Science. “The Science Contributors FAQ”. Accessed November 8, 2013.


17 thoughts on “eLife: Faster Is Not Always Better”

    • Not at all. I have never worked for any scientific journal, nor was this article paid for by one (or by anyone). For the record, I am for open access. What I wrote stems from what I believe are legitimate concerns with the particularities of eLife. These concerns were shared by quite a few scientists in the room, judging by the questions they were asking. If you read my article on the fake-paper stunt recently conducted on open-access publications, you will see that I was critical of the way traditional journals, like Nature, were never part of the equation. I want the most accurate, reproducible research to be published, period.

    • FYI, eLife’s average time from submission to acceptance is three months, exactly the same as other high-quality open-access journals like PLOS Biology or BMC Biology. So your concerns about the potentially low quality of their review seem unsubstantiated to me. I expect that eLife’s impact factor will also fall in the range of 6–12, which is a reasonable one (not like the inflated ones of overhyped “top tier” magazines).

  1. Hi Jonathan,
    The acceptance-rate issue is largely irrelevant as presented, and it needs more careful assessment if any useful point is to be made from it.
    It is trivial to see that one *would* expect eLife to have a higher acceptance rate than Science or Nature, without this necessarily compromising quality. Just consider the number of submissions versus publishing capacity:
    1) Say eLife can publish 15 papers per month (everything being online…) and Science can publish 10 papers per month.
    2) Now, consider that eLife might receive only 20 submissions per month (because it hasn’t built history, good reputation and popularity yet), whereas Science might receive 500.
    3) Assume the proportion of “good” vs “bad” papers is uniform; e.g., half of all submitted papers are bad and half are good, regardless of the target journal. This means that 10 bad papers and 10 good papers are submitted to eLife in a month (in this example), whereas the numbers are 250 good and 250 bad for Science.
    4) Given the high volume of good submissions to Science, the journal will end up rejecting all the bad papers (presumably) and most of the good ones as well. It can publish only 10 of the 500 submissions it received, so its acceptance rate is 2%. When 250 papers are good but you have to choose 10, the decision of which to publish usually has more to do with whim, politics, marketing, and popularity trends than with the actual quality of the research.
    5) On the other hand, eLife has 15 slots. If it got only 10 good papers out of 20, it will publish ALL the good ones, reject the 10 bad ones (hopefully), and publish fewer papers than it has room for (i.e., it is not running at full capacity, as the editors say). Yet its acceptance rate would be 50%!!
    Does this mean that the 10 papers that eLife vouched for are worse than the 10 that Science decided to publish? Not at all.
    Does the fact that the journal is not running at “full capacity” imply that it will publish any garbage that is submitted *just* to publish more? Hardly (that would be a very bad strategy).

    If we start from the premise that any given journal (or most; or in this case Science and eLife) will be as honest as possible and is run by competent scientists and editors who carefully scrutinize the papers they publish, a “low acceptance rate” most likely means one of two things (or an interplay of both): 1) the journal has a VERY high number of submissions, or 2) the journal has a very small publishing capacity.
    Often, it’s just the ratio between the two: The journal has a very high number of submissions relative to its publishing capacity. This, of course, says NOTHING about the quality of the submitted or published research. It could easily be the same problem as with popular restaurants that are actually pretty bad (a subjective assessment anyway) but are still crowded just because they’re “popular”, a status that could have been attained merely because of historical reasons and perpetuated through the years by the bad habits and the psychological inertia of the crowds.
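    The toy arithmetic above can be sketched in a few lines of Python. All the numbers are the hypothetical ones from the example, not real submission data:

```python
# Toy model of the argument above: acceptance rate is driven by
# submission volume relative to publishing capacity, not by quality.

def acceptance_rate(good, bad, capacity):
    """Accept as many good papers as capacity allows; reject everything else."""
    accepted = min(good, capacity)
    return accepted / (good + bad)

# Hypothetical eLife: 20 submissions (10 good, 10 bad), room for 15 papers
print(acceptance_rate(good=10, bad=10, capacity=15))    # 0.5 -> 50%

# Hypothetical Science: 500 submissions (250 good, 250 bad), room for 10
print(acceptance_rate(good=250, bad=250, capacity=10))  # 0.02 -> 2%
```

    In this model both journals publish only good papers, yet their acceptance rates differ 25-fold; the rate alone says nothing about the quality of what gets printed.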

    • You make a very valid point, thank you. Indeed, a new journal with ample capacity and relatively few submissions will naturally show a higher acceptance rate, and that needs to be taken into account. That is fair. The assumption of “competent scientists and editors who carefully scrutinize the papers to publish”, however, is one I am reluctant to grant. We would like to think that scientific journals do a good job of evaluating submissions; given the rampant lack of reproducibility of biomedical discoveries (see Bayer’s and Amgen’s papers and the work of Dr. Ioannidis and his team) and the recent “Dr. Cobange” fake-paper hustle, we may need to revisit that belief. I am a proponent of good science and am aware of the systemic problems plaguing the contemporary scientific enterprise, which should by no means be dumped at the feet of eLife for trying to improve the part they play in publishing discoveries. I must nevertheless remain skeptical of their claim to be critical yet faster.

  2. Yes; the claim that papers are carefully scrutinized might be dubious, but it is as dubious for Science and Nature as it is for eLife.

    How eLife might be able to process papers faster and maintain quality, I’m not sure. Have they provided a detailed explanation of their methods? In any case, not knowing how they operate internally doesn’t discredit them a priori. Innovations in efficiency (and differences in efficiency across individual businesses) happen all the time in any given industry.

    Also, what about Alexey’s last remark: “FYI, eLife average time from submission to acceptance is 3 month, exactly the same as other high quality open access journals, like PLOS Biology or BMC Biology”?

    Leaving quality claims aside, I love it that at eLife the comments from reviewers (and the authors’ responses) are made public. Is this true for any other journal?
    This alone might have a significant impact on the speed of publication and the quality of the published research, as it strongly discourages bullying (which generates unnecessary disputes and can lead to multiple rounds of revisions and resubmissions, and sometimes even to recruiting additional reviewers to “moderate” disagreements). It also strongly encourages reviewers to do a better job, for fear of public and/or internal embarrassment.

    • The main explanation for their efficiency seemed to be that they are paying reviewers to take time out of their schedules to review these papers. However, when I looked up who the reviewers were, they were all scientists in their own right, some also practicing physicians or directors of institutes, as I noted in the article. Having worked for such people in the past, I don’t see how paying them will suddenly create more time in their schedules to be diligent.

      The transparency of the review process you mention is indeed quite interesting and I thank you for bringing it to my attention. I think shining a light on this process could help promote objectivity and good scientific civility. I’d like to thank you, by the way, for your very thoughtful and civil comments.

  3. I’d like to point out a few things that I think are incorrect assumptions:
    1. Faster review process means lower quality. Plenty of journals are just caught up in editorial and administrative red tape, and it’s not because their reviews are of higher quality. For example, Nature sat on my paper for 10 days before telling me they weren’t even going to review it. That’s entirely unnecessary. eLife will be pulling from the exact same reviewer pool as any other top journal, so there’s no reason to think the quality would be any different. Now, sure, perhaps they will be forced to slow down once the volume of submissions increases dramatically, but how would the quality of reviews be different? It’s the exact same process but with more openness (as someone mentioned, reviews and response to reviews are public).
    As you mentioned, it’s true that paying reviewers might not lead to more efficiency, but it’s certainly fairer, don’t you think? What might lead to more efficiency: having to coordinate with the other reviewers to reach a consensus (one particularly nice and unique aspect of eLife reviews). This would probably keep reviewers accountable for their time; they can’t just bail out and ask for extra time (as many faculty do with other journals).
    Let’s face it: giving someone four weeks to review a paper instead of three will not change the amount of time they spend on it. They will wait until two or three days before the review is due to maybe start looking at the paper. This seems pretty universal in my experience of academia.

    2. Higher acceptance rate means lower standards. Again, someone previously did an excellent job of pointing this out, but plenty of high-quality papers are turned down by Science and Nature because they aren’t “sexy” enough. Science and Nature also accept plenty of papers that are genuinely terrible but make flashy claims. If anything, Science and Nature’s decisions are based heavily on what they think will get the most attention, not necessarily on the quality of the work. In fact, the rate of retraction is much higher in the “best” journals (source).

    Also, consider this: the three funding institutions are heavily encouraging their own faculty to submit to eLife. In case you’re not familiar, faculty directly funded by HHMI are among the top researchers in the U.S., and I’m guessing similar rules apply for Wellcome and Planck, though I’m less familiar with those institutions. So, while maybe 25% are accepted (by the way, assuming reviewed = accepted is just wrong; plenty of papers are rejected after review), consider that many submissions come from these top labs, all of which have been encouraged to submit to the journal. It’s pulling from a high-quality pool.

  4. I would also add that the same people “short of time” do the reviews at traditional journals. And peer-reviewed journals usually have a pool of researchers known for great expertise in their areas, whom they use on a regular basis as editors. So, technically, eLife shouldn’t necessarily compromise quality with the decision scheme it uses. Meanwhile, the merits of the public discussion between reviewers and authors mentioned above are obvious; this is an improvement akin to some journals adopting double-blind peer review, where the reviewer does not see the authors’ names so that personal bias or country-of-origin bias is eliminated.

  5. The whole post is nonsense. First, the main reason for long review periods is that reviewers ask for too many experiments without explaining why demanding two more years of work is acceptable. What they ask for would typically fit in a follow-up paper; that’s why so many papers end up with 30 supplementary figures. eLife cuts that reviewer abuse: accept it or reject it, but don’t ask for 15 things.

    Second, your major point, that eLife reviewers are busy, is wrong. Do you know why? Because reviewers for other major journals are busy too, and may in fact be the same people who review for eLife. I’m sure eLife editor Randy Schekman is often asked to review for Nature, Science, or Cell.

    Third, your respect for impact factors and status shows you’re a conformist if there ever was one. That’s fine for you, but don’t try to impose your values on others. Good scientists read the papers and make up their minds independently of other people’s opinions. Mediocre ones rely on impact factors.

  6. For different reasons, I agree, Faster Is Not Always Better.
    We once submitted to eLife and went through the full peer review. Eventually, the paper was rejected. Reading the reviewers’ comments, I could tell they hadn’t read the manuscript carefully, as they mentioned drugs and molecules we neither used nor mentioned in the paper. Maybe the pressure to speed up the review process is taking a toll on both the reviewers and the journal, and they can’t keep up!


  11. I have many doubts about how this post was written and, if I may, about the reason for writing it (it appears to me that “just sharing” is not the sole reason). Two of my postdoc colleagues recently published their work in eLife, and I heard good feedback about how the peer-review process went (it’s not often you hear good words about peer review for high-caliber studies), not because their papers were initially accepted, but because the editors handled the process in close collaboration with the reviewers. Not to mention that, on average, it took less than a month for a decision after the revisions were made.

  12. Hey Jonathan,
    I know this is an old article now, but following your link to Science: they only reject 80% of initial submissions, meaning 20% go on to review.
    Isn’t that more comparable to the 25% of eLife submissions that go to review? As at Science, if the groups that submit to eLife cannot accomplish the reviewers’ recommendations, then their paper is not accepted. So it’s wrong to state that 25% of eLife papers are eventually accepted. The gap may not be as big as you suggest.
