Earlier this week I published the first product of my open peer review experiment, Student feedback using Google+. So far (after 3 days) the manuscript has been downloaded 74 times (latest figures here). This post follows up on my earlier reflection and ties up a few loose ends concerning my experience of the open peer review process.
Given that this was the first time I undertook the process, in addition to posting the manuscript here for review, I also emailed a number of people I considered qualified to review it and pointed out that the process was under way. Those invitations gave rise to some discussions about "inviting friends to review your work" and to the question of whether this was valid peer review or not. In my opinion it was - open is open to all, friends and foes. However, considering the possible introduction of bias into the review process, when I repeat it in future (damn, given the game away now ;-) I will not issue invitations, only post the manuscript here and publicize the post through the normal channels. If that means the number of reviewers is lower, I will extend the review period until an acceptable number of reviewers have commented.
I intended to publish the final version a week or more ago, which would have made the interval between publishing the preprint and publishing the finished product less than 21 days. However, personal circumstances and my current teaching load extended this to 23 days - still a highly acceptable result in comparison with commercial publishers. Although I consider myself fortunate that most of the reviewers concurred on desirable additions, incorporating the comments of 7 referees is definitely hard work - assuming you attract enough reviews, open peer review is definitely not an easy ride! As hoped, incorporating the reviewers' comments improved the quality of the publication considerably.
The final published version of this manuscript contains no acknowledgements - this work was a solo effort with no external funding. Since I was not sure about the etiquette of thanking reviewers, I did not include them. I hope no-one is offended by that. The other thing I forgot was to add copyright information (CC-BY) to the manuscript itself. I was thinking that this would be covered by the repository page, but I now realize that a) this does not have CC status as I assumed, and b) it is easily divorced from the manuscript, so the information must be embedded there. Rookie mistake.
The big question for me is: is this model scalable? If I routinely asked for reviews in this way, would fatigue set in, or would my ‘mates’ become an echo chamber? It is too soon to say, but my concern is that the process I have piloted may not be sustainable because the reviewer ecosystem may not be able to circumvent the Tragedy of the Commons. The only way to find out is further experimentation. Watch this space.