Your Grant In Review: The "No Previous Experience With the Assay" StockCritique™

ScienceWoman discussed one of my favorite* StockCritiques™ of grant proposal review in a recent post. The StockCritique™ in question is the observation that since the investigators on the grant proposal have no prior publications which include a scientific technique which is central to the present proposal, this diminishes the overall scientific merit of the proposal. Said critique is levied most often at younger, less established investigators and many of us have seen this one a time or two. Others of us fear this StockCritique™ to the point of letting it dictate our proposals a little too much. I have some thoughts including my usual defense of highly annoying reviewer behavior after the jump.


As with several StockCritiques™, the "no prior experience" one is a Catch-22. How is the new Investigator supposed to get experience in a new technique without the money? Isn't this what Exploratory grants are for? Why are the reviewers doing this, are they Out to Get me? AAAAUUUGGGGHHH!
Quite.
I've received versions of this before, as have many of my peers and colleagues. We all have been in situations where we are anticipating the appearance of this particular StockCritique™ and are scratching our heads trying to figure out how to head it off. Unfortunately, you really can't. And so you are left with the usual strategy which is to just suck it up, hope for the best, and if this arises take it on in the "Reply to Critique" section of the application revision. A commenter over at Sciencewomen suggested the strategy of

simply addressing your lack of expertise in those techniques - you recognize you have x, y, and z to learn, here is your plan for learning them, here is why you will succeed given your experience in learning to deploy methods a, b, and c under similarly challenging circumstances - could suffice for the next submission round. But yes, a supportive letter from a colleague, or pulling someone else in as a sub-contractor or co-pi, would be another way to address it.

This is a fantastic way to put it. While preliminary data would be great, it is not always possible to acquire it by the time of the revision. In any case, preliminary data may be entirely beside the point. How so?
Well, sometimes when the reviewer includes this "lacks experience with the technique" comment, what they really mean is a bit more involved. As in "Yes, I know this is a commonly utilized and published technique that you are proposing to use. As we all know, however, scientific papers sometimes do not really communicate the difficulty of setting up, validating and deploying this method in a scientifically useful fashion. I have used this technique in my lab, have helped others set this up, heck I've even consulted formally with people on this and, oh, the stories I could tell you. It is not easy even if it appears to be so from the papers. If you have not used this technique you are going to be mired in methods development for 2 years, given the staff and time you have devoted to this project. So when your timeline estimates it will take you 3 months to get this going I am thinking that there is no way that this is going to happen. The kind of productivity that you describe using this technique is a pleasant fantasy but is likewise never going to happen. The fact that you don't seem to recognize this possibility concerns me because it seems as though you have pulled this out of a methods section and have not really done your homework in figuring out what this technique is going to take by asking relevant peers. If you do not add more people, effort or expertise this project is going to end up with no publications resulting."
Okay. Obviously this does not fit all scenarios but it does fit what I would describe as one of the few legitimate uses of the "lacks experience" StockCritique™. And sometimes, despite this legitimacy, it may be the case that other people (including the reviewer) have trouble setting up the technique because they are not the BrilliantzAndEnergetic scientist that you are. Fair enough. It is my hope, however, that considering that the above is what the reviewer may be trying to communicate to you will help you in your revision. It is my experience that a lot of stepwise description of how you are going to work through the methods and validate the measures, combined with a perhaps more conservative timeline than you actually anticipate, can go a long way toward convincing the reviewer that you can make a go of it. As the comment I quoted from Sciencewoman's post indicates, it also helps to point to other tricky methods you've established in the past as evidence for your careful approach to methods development. Which, btw, is one reason to waste your time on a cheapie "Methods" paper now and again instead of just resting on your primary data papers that employ the method.
__
[*in the hair-pulling, loud-ranting, reviewer-parentage-questioning, physioproffian-expression-of-outrage sense]

13 responses so far

  • PhysioProf says:

    So when your timeline estimates it will take you 3 months to get this going I am thinking that there is no way that this is going to happen.

    What kind of fucking moron generates a timeline that describes any sort of progress in any temporal increment smaller than one year?

  • NM says:

When we branch out in clinically oriented science and need to use a new technique we tend to pull in somebody with an uber-publication record in that technique as an Associate Investigator. The grant indicates that they will either do or supervise the new technique work (i.e. teach us how to do it in an expedited manner).
    Why do you not do this in bench-based science?

  • Lab Lemming says:

    DM says:
    `Well, sometimes when the reviewer includes this "lacks experience with the technique" comment, what they really mean is a bit more involved. As in...'
    And sometimes what they mean is:
    "This is my turf, so get outta my yard. You haven't sucked up to me at enough conferences for me to consider you worthy..."

  • Bill says:

    "simply addressing your lack of expertise in those techniques - you recognize you have x, y, and z to learn, here is your plan for learning them, here is why you will succeed given your experience in learning to deploy methods a, b, and c under similarly challenging circumstances - could suffice for the next submission round."
PIs don't have "expertise" in techniques. I've seen only the newest of the newly minted PIs even attempt lab work. Makes the "StockCritique" even more of a joke.

  • PhysioProf says:

PIs don't have "expertise" in techniques. I've seen only the newest of the newly minted PIs even attempt lab work. Makes the "StockCritique" even more of a joke.

    You're misunderstanding. "You" and "your" is shorthand for "your lab".

  • venkat says:

    will "puleeez...just give me money this one time!!" (along with LOLcat pics) work?

  • Bill says:

    "You're misunderstanding. "You" and "your" is shorthand for "your lab"."
    Oh, sorry, my bad.

  • Idea: Summary statements written like a blog post.
    For example, the text reads in underlined blue, "lacks experience with the technique." Applicant clicks on hyperlink and receives:
    "Yes, I know this is a commonly utilized and published technique that you are proposing to use. As we all know, however, scientific papers sometimes do not really communicate the difficulty of setting up, validating and deploying this method in a scientifically useful fashion. I have used this technique in my lab, have helped others set this up, heck I've even consulted formally with people on this and, oh, the stories I could tell you. It is not easy even if it appears to be so from the papers. If you have not used this technique you are going to be mired in methods development for 2 years, given the staff and time you have devoted to this project. So when your timeline estimates it will take you 3 months to get this going I am thinking that there is no way that this is going to happen. The kind of productivity that you describe using this technique is a pleasant fantasy but is likewise never going to happen. The fact that you don't seem to recognize this possibility concerns me because it seems as though you have pulled this out of a methods section and have not really done your homework in figuring out what this technique is going to take by asking relevant peers. If you do not add more people, effort or expertise this project is going to end up with no publications resulting."
    Hi there readers from NIH Program and CSR - you can thank me later.

  • DrugMonkey says:

    What kind of fucking moron generates a timeline that describes any sort of progress in any temporal increment smaller than one year?
    always the literalist, PP. even if you describe a technique plus results in a one year increment someone can still think that you are not going to be able to pull it off. but to answer your question, this 'moron' describes planned progress in increments of less than a year as do many of the applicants who submit grants that end up reviewed in the study section I serve. fwiw.
    Why do you not do this in bench-based science?
    oh, we do. this is part of the reply-to-critique strategy for sure. but it is not a magic wand if the reviewer wants evidence in the application itself that the PI has considered the pitfalls seriously. if the expert is listed but the application still seems optimistic you can end up with "there is no evidence the PI has actually consulted with the listed expert..."

  • neurolover says:

    Yes, translations of all the stock critiques would be greatly appreciated.
I like this discussion because it shows how the stock critique might really be saying something important ("the technique is more difficult than you seem to think it is, and you have shown no evidence that you'll know how to surmount the difficulties"), but also in our alternative translation ("get off my turf"). And, I guess in responding, one should assume the positive and respond to it (since the turf objection is unanswerable).

  • drdrA says:

Yes- sometimes reviewers go over the top with this though. I can see these kinds of comments ('observation that since the investigators on the grant proposal have no prior publications which include a scientific technique which is central to the present proposal, this diminishes the overall scientific merit of the proposal') are reasonable when the proposed technique is complex and the applicant really does have no experience with it.
    However, there are times when these critiques can be leveled (and sometimes are, speaking from personal experience), over trivial issues. Like- for example- applicant has never worked with cell line A- when it is clear from the application that the applicant has worked with multiple equivalent and more fussy cell lines for all of that person's training. I doubt very much that a more senior applicant would ever hear such a criticism...
Furthermore, this is frustrating in a more general sense. One is supposed to go about this business of hypothesis-driven science, in my humble opinion, by asking a question and then using whatever the appropriate techniques are to get an unambiguous answer. This kind of criticism (you don't have experience with X technique) works against the risk taking involved in trying the new techniques that are necessary to answer the question.

  • NM says:

    DM
    "there is no evidence the PI has actually consulted with the listed expert..."
    Isn't this pretty much an accusation along the lines of "you are fucking lying". Ouch.

  • DrugMonkey says:

    NM, sortof. Consultants are listed as helping with the project if funded. No obligation they help with the application. But reviewers can see it differently...
