ScienceWoman discussed one of my favorite* StockCritiques™ of grant proposal review in a recent post. The StockCritique™ in question is the observation that since the investigators on the grant proposal have no prior publications involving a scientific technique central to the present proposal, the overall scientific merit of the proposal is diminished. Said critique is levied most often at younger, less established investigators, and many of us have seen this one a time or two. Others of us fear this StockCritique™ to the point of letting it dictate our proposals a little too much. I have some thoughts, including my usual defense of highly annoying reviewer behavior, after the jump.
As with several StockCritiques™, the "no prior experience" one is a Catch-22. How is the new investigator supposed to get experience with a new technique without the money? Isn't this what Exploratory grants are for? Why are the reviewers doing this? Are they Out to Get me? AAAAUUUGGGGHHH!
I've received versions of this before, as have many of my peers and colleagues. We have all been in situations where we anticipate the appearance of this particular StockCritique™ and scratch our heads trying to figure out how to head it off. Unfortunately, you really can't. And so you are left with the usual strategy: just suck it up, hope for the best, and if the critique arises, take it on in the "Reply to Critique" section of the application revision. A commenter over at ScienceWoman's suggested the strategy of
simply addressing your lack of expertise in those techniques - you recognize you have x, y, and z to learn, here is your plan for learning them, here is why you will succeed given your experience in learning to deploy methods a, b, and c under similarly challenging circumstances - could suffice for the next submission round. But yes, a supportive letter from a colleague, or pulling someone else in as a sub-contractor or co-pi, would be another way to address it.
This is a fantastic way to put it. While preliminary data would be great, it is not always possible to acquire them by the time of the revision. In any case, preliminary data may be entirely beside the point. How so?
Well, sometimes when the reviewer includes this "lacks experience with the technique" comment, what they really mean is a bit more involved. As in "Yes, I know this is a commonly utilized and published technique that you are proposing to use. As we all know, however, scientific papers sometimes do not really communicate the difficulty of setting up, validating and deploying this method in a scientifically useful fashion. I have used this technique in my lab, have helped others set this up, heck I've even consulted formally with people on this and, oh, the stories I could tell you. It is not easy even if it appears to be so from the papers. If you have not used this technique you are going to be mired in methods development for 2 years, given the staff and time you have devoted to this project. So when your timeline estimates it will take you 3 months to get this going I am thinking that there is no way that this is going to happen. The kind of productivity that you describe using this technique is a pleasant fantasy but is likewise never going to happen. The fact that you don't seem to recognize this possibility concerns me because it seems as though you have pulled this out of a methods section and have not really done your homework in figuring out what this technique is going to take by asking relevant peers. If you do not add more people, effort or expertise this project is going to end up with no publications resulting."
Okay. Obviously this does not fit all scenarios, but it does fit what I would describe as one of the few legitimate uses of the "lacks experience" StockCritique™. And sometimes, despite this legitimacy, it may be the case that other people (including the reviewer) have trouble setting up the technique because they are not the BrilliantzAndEnergetic scientist that you are. Fair enough. It is my hope, however, that considering that this may be what the reviewer is trying to communicate will help you in your revision. It is my experience that a good deal of stepwise description of how you are going to work through the methods and validate the measures, combined with a perhaps more conservative timeline than you actually anticipate, can go a long way toward convincing the reviewer that you can make a go of it. As the comment I quoted from ScienceWoman's post indicates, it also helps to point to other tricky methods you've established in the past as evidence of your careful approach to methods development. Which, btw, is one reason to waste your time on a cheapie "Methods" paper now and again instead of just resting on your primary data papers that employ the method.
[*in the hair-pulling, loud-ranting, reviewer-parentage-questioning, physioproffian-expression-of-outrage sense]