
Monday, January 16, 2012

It's academic publishing, Jim, but not as we know it

I have a manuscript currently in press with an academic journal which describes work that we performed three years ago. In part, the fault for the delay in publication lies at my door, but the original version of the manuscript now in press was written 18 months ago and first submitted for publication over a year ago. There followed a catalog of errors, some due to me, others due to editors and journals. The current incarnation of the paper was submitted six months ago to the journal where it will shortly appear. It is still not published. I should feel lucky - others have had worse experiences than this:
"Because of the work described in the paper had already been talked about in public forums and included in grant applications, and because publication was important for moving forward with our grant applications, job applications and other papers, we felt we could not spend another year in the review process. The very essence of the scientific process is to challenge paradigms and share the experimental details with other scientists who can then reproduce or refute the findings. Publication is key for this process. We needed to publish."
I have recently been depositing my papers in our institutional repository (How to fix academic publishing again already), but now it's time to move up to the next level: post-publication peer review.

I invite reviews of the following original manuscript:


An efficient and effective system for interactive student feedback using Google+ to enhance an institutional virtual learning environment (PDF download via Dropbox) Update: Final version now published
Abstract:
Whether or not you take a constructivist view of education, feedback on performance is inevitably seen as a crucial component of the process. However, experience shows that students (and academic staff) often struggle with feedback, which all too often fails to translate into feed-forward actions leading to educational gains. Problems get worse as student cohort sizes increase. By building on the well-established principle of separating marks from feedback and by using a social network approach to amplify peer discussion of assessed tasks, this paper describes an efficient system for interactive student feedback. Although the majority of students remain passive recipients in this system, they are still exposed to deeper reflection on assessed tasks than in traditional one-to-one feedback processes.

How it works:
  1. Please read the manuscript then leave your review as a comment on this blog post. Please use page and paragraph numbers to refer to specific sections of the manuscript.
  2. Reviews may be named or anonymous as you wish.
  3. To expedite the publication process, this manuscript will be open for review for 14 days from today.
  4. Following the review period, all substantive reviews will be taken into account and the manuscript revised accordingly. (My best estimate from blog stats is that between 1,000 and 2,000 unique visitors view the content on this site. If 1% of visitors take the trouble to leave a substantive review, that's 10-20 substantive reviews - a much more rigorous review process than any academic journal I am aware of.)
  5. If the majority view is generally positive, the revised manuscript (including reviews and author responses) will be published on the Leicester Research Archive.
I think this is as efficient and transparent as I am able to make the academic publishing process, but if you have any comments or suggestions, I welcome them. Most of all, I would welcome your review of the manuscript as a comment here. I cannot offer you any payment or other inducement beyond the knowledge that you will be helping to fix the broken model of academic publishing. And of course, given the opportunity, I will be happy to reciprocate by reviewing any papers I feel competent to comment on, should you wish to participate in a similar process.



Notes:
Other options considered for sharing the provisional PDF were Slideshare and Google Docs. These were rejected due to problems with PDFs being reformatted, and Dropbox was selected as the best general-purpose solution, but potentially any site which allows free PDF downloads would be suitable. If this blog had been hosted on WordPress, that would have been a suitable choice, but Blogger does not allow PDF uploads.



A.J. Cann

10 comments:

  1. Review #1 from Dave Bridges (before comments fixed)
    https://docs.google.com/document/pub?id=1gqaKvhcXJih3kzX9XYS6AX1lZJsaarAB7CbXwvuSD-w


    Major Comments
    Since these may change over time, I would suggest providing a more detailed description of the current privacy capabilities and limitations of G+
    I cannot clearly read Figure 1 in this PDF. Can a higher-resolution image be provided as an example (with the understanding that student comments may need to be redacted)? Because of this, it is unclear what the specific nature of the comment discussions was. Whether they were productive and whether they served to enhance or extend the current material is quite relevant.
    Were there inter-comment discussions (replies to a comment)? At what rate were students interacting with each other vs interacting with the staff?
    Is there a better method to determine the number of passive viewers? For future studies, a post-course survey may be warranted.
    What was the comment distribution (were there few long ones and many short ones, or vice versa)? A histogram may be helpful here.
    Among those who did not engage with feedback, what were their criticisms?
    The easy +1 or “like” system is a benefit for students to give quick, low-barrier feedback. A further discussion of this point, and of the number of +1's, may be helpful.
    Minor Comments
    Pressurized -> Pressured (p2 pgph 1)
    Next sentence: Few -> Few staff
    Provide a reference for separation of online academic and social activity
    I am unclear about why G+ being separate from facebook is relevant
    Define MCQ before it is used
    Results first pgph used -> accustomed
    Need “A” and “B” labels on the panels of Figure 2. The y-axis on Figure 2B is misleading and ought to be set to zero.

  2. Paper review (comments on process in separate comment):
    Link to constructivism - is feedback really associated strongly with constructivism? I would have thought it features highly in all learning theories? Behaviourism is very feedback-intensive, for instance. So I wasn't sure about making the link specifically to this learning theory.
    You may need to explain the headings in Table 1 more clearly (what is "+1").
    Pg 5 - "circumstantial evidence (word of mouth) shows that the majority of students passively participated by reading the feedback threads on Google+." - do you have any analytics which might help corroborate this?
    Fig 3 - do pretty word clouds actually tell us anything? Apart from showing that during a discussion about maths they used the word 'maths' a lot. There are other semantic analysis tools which might tell you the level of writing or argumentation, for instance.
    What would be most useful is to have a comparison of this activity with that of a previous cohort using only the VLE.
    I don't think you can justify the claim "Once set up, this pattern of assessment and feedback using online quizzes and Google+ takes comparatively little staff time since peer interactions amplify staff input." - it could be that all the students were saying was 'I thought this quiz was rubbish, and the answer to 7 is 42'. That doesn't demonstrate reflective engagement with the feedback. I'd like to see some more detailed analysis of the type of interaction to back this claim up. It may well be true, but from the evidence presented I don't think you can say that.
    See for example, the work Denise Whitelock has done on analysing e-assessment feedback (eg http://cloudworks.ac.uk/cloud/view/5306) and Anna De Liddo on discourse analysis http://oro.open.ac.uk/25829/1/DeLiddo-LAK2011.pdf

  3. Comment on the process:

    I tried to put my official reviewer hat on and review it as if I was doing a standard (blind) peer review. It may be that this is an inappropriate transfer of process, and instead I should adopt a different style for open, informal review. But we fall back on what we know.

    My review may be a bit harsh, but I was conscious that 'asking your mates to review' isn't really comparable to anonymous peer review at all. I might be far less likely to criticise a friend. My colleague Gill Kirkup maintains that anonymity in the peer review process is essential because it protects the reviewer, particularly a young reviewer who is reviewing a paper by someone eminent in the field. Of course, it also allows people to be ruder than they would be otherwise, and often to make incorrect judgements because there is no debate or comeback.
    So this may be a good way to get feedback on a paper, but would it equate to peer review? I don't think so, but then maybe it's a sufficient filter to allow publication and then post-review.
    It's also quite a brave thing to do and I suspect many colleagues might be reluctant to go this route. If you write a crap paper that gets rejected by a journal, only a handful of people have seen it - if you do it this way, potentially hundreds will.

  4. A quick summary of my initial comments from reading this (OK, as soon as you start writing things, more things come to mind...). I didn't look at the other reviews here so as not to prejudice the review. Comments:
    - Introduction: you spend time emphasising the importance of providing an opportunity for a feedback dialogue between students and tutors and illustrate how a simple social tool can provide an efficient means of supporting this. However, this left me with a couple of unresolved questions: First, is creating an opportunity for dialogue enough (for instance, effective feedback practice also relates to providing opportunities for the students to act on the feedback given, not just discuss it)? And second, is this assessment/feedback design suited to particular assessment tasks (the shared feedback relates to class-level issues, rather than individual ones)? A broader discussion of the scope of this design/intervention would have been interesting and useful.
    - Background: see above. For clarity, it would be useful to state that the results being presented refer to the group feedback threads and not to the personal feedback provided. It does become clear later on, but it would be useful to have it stated explicitly.
    - Results:
    -- If I were conducting a formal review for this paper, I would have significant concerns about the claim surrounding 'passive participation'. If you can't provide any specific evidence to support this claim, I think it would be better to remove it. You could still speculate that this occurs, of course (I'm sure it does).
    -- Not sure of the value of comparing the length of four feedback threads: quoting the range (5105-5599) is fine, but as we (readers) don't know anything about the content of these threads the comparison isn't particularly relevant. The comment which is provided (about the maths quiz) only provides a possible explanation for FB04, but FB01 is only 3 words longer (and has more thread participants, hence shorter average comments).
    -- It would be useful to have a comment on the fall-off between FB1 and FB2, 3 and 4 in the number of thread participants and +1s, and between FB1/2 and FB3/4 in terms of reshares. As a reader I can see these patterns just by looking at the data, so I expect the author to comment on them, even if it is only to say 'this is just noise' or whatever.
    -- ... Average comment length was 77 words ... I suppose if I was being thorough, I would want to know the median/mode etc. - are there a few long comments and lots of 'I agree's'? Also, this is averaged over the four feedback threads - see the comment above.
    -- 'Overall, most of the comments received from students who actively engaged with feedback were positive' sets alarm bells ringing with me: I'd been assuming that the discussions occurring in this thread were about the feedback content, but this implies that at least some comments actually referred to the process. Or does this statement refer to comments collected separately? It would be good to clarify.
    - Discussion: it would be useful to acknowledge some of the questions which this study wasn't able to address, and perhaps suggest a design for further study. For instance, textual analysis to see what is happening in these feedback threads (do they become lengthy discussions about specific issues, do they extend beyond the original content, is there any evidence for them leading to new learning - e.g. students suggesting resources etc.), or a comparative study with 'in class' feedback sessions (which in a class of 250 would necessarily be unidirectional tutor-student).

    Of course, you come to the end of a review and it all feels very negative, as I have listed the problems I had with it, and I was invited to be critical. I should say that I think the paper is interesting and well written, and certainly the results presented are valuable. And I applaud the open publishing with peer review approach you've taken.

  5. Overall I think this is an intriguing paper, which has opened my eyes to new ways of doing things - and as someone now NOT teaching undergrads, I should shy away from some of the pedagogical issues... However, I concur with another comment that word clouds prove very little, apart from looking pretty. They are great for slideshows where you can go for the "oh, wow!" response, but mean little in this context. Possibly replace them with a table with some more substantive analysis of the student responses?

    Otherwise, great idea!

  6. The idea of using Google+ to enable students to participate in an asynchronous discussion of generic feedback on assessment tasks looks promising as a learning opportunity.

    I am less clear about quantities and qualities. How many students in the cohort were active rather than passive? How much staff time was spent in administering the threads (reading, prompting)? How might we get an idea of how such online writing contributes to learning (beyond being 'a good thing in itself')?

  7. I found the paper really stimulating from a personal point of view - it got me thinking about the larger conceptual contexts, and about my own practice - good abstract and introduction.

  8. I really like the focus of this paper - assessment in today's digital context is a key issue. We need to align what we are trying to do pedagogically with the power of new social media. Google+ is a good environment to use for this form of peer feedback. Minor points: when you talk about reflection, you might like to reference Dewey. For the cycle bit, add Kolb. pg 2: Dialogue, not dialog. Did you get any quotes from students? If you did, it would be good to include something on this.

  9. Hi Alan, I love what you're doing here. I certainly found feedback on my blog useful for my thesis! (see Appendix 2 at http://neverendingthesis.com)

    p.3 - What if students *have* established a social identity on Google+? Longer-term, perhaps the Facebook/G+ distinction is not a useful one? (for example, on p.4 you mention that students are 'used to commenting via G+')

    p.3 - The acronym MCQ is not introduced, nor is a rationale for their use given.

    p.6 - I haven't got a reference for you (sorry) but I'm fairly sure there's research around giving students both a mark and formative feedback which shows that mixing the two leads to them just focusing on the score? It would be good to reference that.

    A useful paper overall that would be great if tightened up using the feedback(!) above. I think a 'how-to' style blog post to accompany it would also be handy. :-)

  10. I forgot to add this earlier. From my reading only two problems stood out.

    Background. How do students form a cohort on Google+? Is the tutor a central point, or do students connect directly to each other too? If so, how is this achieved? Because circle sharing isn't going to be obvious to people who don't use Google.

    Figure 1. I confused the running title with the Fig 1 label. This will probably not be an issue in a printed form of the document.

    Is the world a better place with this paper in it? Yes.

    Is it taking space that could be used for a more worthy paper? No.
