This is absolutely BRILLIANT!
It's a tool designed to address the File Drawer Problem as it pertains to psychological research: the distortion of the scientific literature that results from the failure to publish non-replications. Most journals (especially high-impact journals that specialize in publishing surprising findings with low prior odds of being correct) are rarely willing to publish even carefully conducted non-replications that question the validity of a finding they have published. Often the only people who learn about non-replications are those who happen to be "plugged in" to social networks that circulate this information in a fragmentary and inefficient way. Even textbook authors are rarely well informed about the replicability of the results they report, and may often rely on results that are known to be dubious by those working in the area.
What a great idea. One of the justifications I recently offered for the LPU approach to publishing is all the hoarded not-enough-for-publication data out there that might save someone else a whole hell of a lot of time. Chasing after a supposedly established published finding as your control or launching point for new studies can land you in one of those little potholes. Wouldn't it be nice to see a half dozen (or more) attempts to replicate an effect, if only to learn which conditions are essential and which can be manipulated for your purposes?
Other fields should try something like this.
Disclaimer: I'm professionally acquainted with one or more of the people apparently involved in this effort.