On writing a review

Apr 26 2016 Published by under Conduct of Science

  • MorganPhD says:

    I thought the point was to give the illusion that you were more productive than you really were during your last grant cycle.

  • dr24 says:

    Sort of. A lot of people mix up reviews and multi-study meta-analyses.

    Review the data, but also review the claims and report which claims remain justifiable in the context of broader information.

  • becca says:

    The point of a review is to summarize the data and *distinguish* them from the claims.

  • DJMH says:

    Thought it was all about axe-grinding, myself.

  • AcademicLurker says:

    I thought review articles were all about the self citations.

  • dsks says:

    Nah, reviews are for throwing out all sorts of ideas you've no intention of pursuing so that you can claim credit when somebody else successfully does instead.

  • Mikka says:

    The reason reviews outnumber primary literature in some fields is that they are a citation cow for journals and authors. I think that a review should at least try to synthesize what the field thinks about something, or to come up with novel insights. Otherwise, why even write it? Since the data are already in the literature, if summarizing them is the purpose, wouldn't a literature roundup, which anyone can do with a quick pubmed search, be enough?

    This is what Asimov thought of data summaries:
    "Your bunch here is a perfect example of what's been wrong with the entire Galaxy for thousands of years. What kind of science is it to be stuck out here for centuries classifying the work of scientists of the last millennium? Have you ever thought of working onward, extending their knowledge and improving upon it? No! You're quite happy to stagnate. The whole Galaxy is, and has been for space knows how long."

  • Pinko Punko says:

    Reviews summarize lit in superior fashion to pubmed search. They are most useful if they give insight into paper conclusions, but can still be useful just based on collation.

  • Jessica Tollkuhn says:

    I don't know what is up with some people working on hormones and behavior, but I see many reviews that include one or two figures of new data, which are then cited in the next review...

    I don't see this in other fields I follow.

  • drugmonkey says:

    You mean sneaking new data into a "review"? I see this now and again in my field. Frankly, I keep chiding myself that we ought to do this more regularly with orphan figures.

  • Jessica Tollkuhn says:

    I am surprised you are on board with these shenanigans, DM. IME, it's not so much "orphan" data, as it is "sketchy". Why do journals allow some data to bypass peer review?

  • banditokat says:

    Book chapters are where all my stray figures go.

  • Namnezia says:

    It's a chance to synthesize various studies and provide an intellectual framework for a given area. Or a chance to take various papers one has published over the years and present the entire narrative. Much like you would during a seminar. But on paper.

  • Dusanbe says:

    I like including original figures in reviews (nothing too controversial) because sometimes I can't deal with the copyright/permission bureaucracy needed to use published figs.

  • Newbie(ish) says:

    Orphan data is a curious phenomenon; I am trying to sneak some into a review right now, myself 🙂 It's a very simple figure and the methods are super straightforward (and are also described in published work). Reviews do go out for peer review, and any reviewer ought to treat a piece of orphan data with as critical of an eye as they would treat a primary research article: are the methods sufficiently described, and are the claims made on the data supported? I'm biased here, but the data we want to include is simple but novel; it's just one more piece of info to support claims we make in the review. I'm figuring it's a step up from tossing it in a book chapter.

    To the original question: a review should organize information and there are many ways a review article can be useful or interesting. Summarizing data all in one place is useful/interesting. Synthesizing old data to generate new future research paths is useful/interesting. Providing broad opinion and perspective on the field, backed by your research and others, is also useful/interesting.

  • drugmonkey says:

    JT-

    1) reviews are peer reviewed.
    2) we haven't done it yet so I can't say where I actually stand on the practice.
    3) in the ones I think of most quickly, I am glad the data appeared in the review. As always, I can assess the value of the data myself.

  • Jessica Tollkuhn says:

    thanks DM. I wasn't sure if there were different standards for reviewing primary data within the context of a review article. It's good to know that this is a viable way of getting results to readers.

  • Grumble says:

    Does nobody here write reviews to PROPOSE A NEW FUCKING HYPOTHESIS?????

  • drugmonkey says:

    JT- I think there is certainly an argument to be made about the standards that a given reviewer may apply. And an argument to be made about the near-inevitable acceptance of a solicited review compared with an unsolicited primary research article. However, my thought is that given the diversity of review standards for primary research articles, that's not likely to be an area in which to draw categorical conclusions.

  • Jonathan Badger says:

    @Grumble
    That would count as original research and be against the criteria of most reviews (often in sending a review out for review the journal specifically asks if any original research is included which would disqualify the manuscript as a review).

  • Dr Becca says:

    "Does nobody here write reviews to PROPOSE A NEW FUCKING HYPOTHESIS?????"

    This is the main reason I write reviews - to set up readers for my Big New Idea that I am ABOUT to show experimentally.
