Tuesday, October 29, 2013

The network as proxy

A new paper in BJET discusses the use of social network analysis as a proxy for academic performance. Unfortunately, it doesn't refer to our earlier work in this area (Badge, J. L., Saunders, N. F., & Cann, A. J. (2012). Beyond marks: new tools to visualise student engagement via social networks. Research in Learning Technology, 20).

Reviewing the differences in size, composition and structure between the personal networks of high- and low-performing students. Br J Educ Technol (1 November 2013), doi:10.1111/bjet.12110
Abstract: An interesting aspect in the current literature about learning networks is the shift of focus from the understanding of the “whole network” of a course to the examination of the “personal networks” of individual students. This line of research is relatively new, based on small-scale studies and diverse analysis techniques, which demands more empirical research in order to contextualize the findings and to meta-analyze the research methods. The main objective of this paper is to review two research questions posed by a previous British Journal of Educational Technology contribution by Shane Dawson in order to know whether the differences in personal network composition impact on the performance of students. The two questions were defined by Dawson as follows: (1) Are there significant differences in personal network composition between high- and low-performing students? and (2) Do high-performing students have larger personal networks than their low-performing peers? In addition, the “clustered graphs” method used in this study allows the inclusion of the structural analysis of personal networks. In doing so, a new research question is addressed: (3) Are there significant differences in personal network structure between high- and low-performing students? This paper tries to answer these questions in the context of two undergraduate, inter-university and fully online courses, and two different technology-enhanced learning environments (a virtual learning environment and a personal learning environment) where interactions took place indirectly through shared resources. The results show that the network behaviors of high- and low-performing students are strongly correlated, and that high-performing students developed larger personal networks than low performers.
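The comparison the abstract describes - measuring the size of each student's personal ("ego") network and contrasting high- and low-performing groups - can be sketched in a few lines of plain Python. Everything below is invented for illustration: the names, marks and edges are not from the paper, and the median split is just one plausible way to divide the cohort.

```python
# Hypothetical interaction network: an edge means two students touched
# the same shared resource. All names, edges and marks are invented.
edges = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("dave", "erin"), ("erin", "frank"),
]

# Build an adjacency map from the edge list.
neighbours = {}
for a, b in edges:
    neighbours.setdefault(a, set()).add(b)
    neighbours.setdefault(b, set()).add(a)

# Hypothetical final marks, used to split the cohort at the median.
marks = {"alice": 82, "bob": 75, "carol": 68, "dave": 49, "erin": 44, "frank": 51}
median = sorted(marks.values())[len(marks) // 2]

def ego_size(student):
    # Personal (ego) network size: the student plus their direct contacts.
    return 1 + len(neighbours.get(student, set()))

high = [ego_size(s) for s, m in marks.items() if m >= median]
low = [ego_size(s) for s, m in marks.items() if m < median]
print("high performers, mean ego network size:", sum(high) / len(high))
print("low performers, mean ego network size:", sum(low) / len(low))
```

On this toy data the high-performing half ends up with the larger mean ego network, which is the pattern the paper reports - but with real course data the interesting work is in how "interaction" is defined, since here it happens indirectly through shared resources.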

The analysis in this paper (wisely) stops at correlation. But correlation is enough to get you into trouble.
"I spent 24 hours last weekend at the Isle of Wight literary festival, only in its second year but already thriving. I went to a session on spies with the writers Roger Hermiston and Professor Richard Aldrich. Aldrich has written a fascinating book about GCHQ. He said that so much of what we do now is on the electronic record that we live in a world that offers neither secrecy nor privacy. You might imagine we still have a secret ballot. Yet analysis of our buying patterns in the supermarket can, these days, reveal how we vote to a likelihood of 87%-90%." Guardian
I think it's fairly clear where learning analytics is heading: from the descriptive to the presumptive. At a time when I'm considering choosing what to buy from Sainsbury's based on a random number generator, I'm not sure what I think about these developments, except that I know I'm not sanguine about them. I believe this sort of data has real societal value, whether it be catching terrorists or supporting student learning, but the situation we are slipping into is unacceptable.

Yesterday I spent an hour interrogating Blackboard to extract data about submissions to pre-lab quizzes. I was interested in how far in advance of the practical classes students had submitted their answers, but the data buried in Blackboard tells me much more than this - not only facts such as IP addresses but far more subtle patterns. Even a simple visual inspection without software tools reveals that the majority of (but not all) students submit their answers to the quizzes multiple times. Why? Because having submitted once and seen the feedback, they resubmit (sometimes repeatedly) to ensure the mark recorded by Blackboard is as high as possible.

We could infer all sorts of things from observing this behavior (such as the fact that students are engaging with the assessment part of this exercise rather than the feedback). This is the pattern followed by most students, but what about those students who submit only once, then move on - how should we respond to them? And most importantly, are these students aware of the information that their online activity patterns are revealing? Exactly how evil are the NSA, GCHQ and Blackboard?
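The kind of log interrogation described above can be sketched as follows. The records, the deadline and the field layout are all hypothetical - Blackboard's real exports look different - but they show how a handful of timestamps is already enough to separate the resubmitters from the single-attempt students and to measure how far ahead of the class each one started.

```python
from datetime import datetime

# Hypothetical practical class start time and submission log.
# Each record: (student, submission time, mark). All data is invented.
deadline = datetime(2013, 10, 28, 9, 0)
log = [
    ("s1", datetime(2013, 10, 27, 20, 0), 6),
    ("s1", datetime(2013, 10, 27, 20, 15), 8),
    ("s1", datetime(2013, 10, 27, 20, 30), 10),
    ("s2", datetime(2013, 10, 26, 14, 0), 7),
    ("s3", datetime(2013, 10, 28, 8, 50), 5),
]

# Group attempts by student.
attempts = {}
for student, when, mark in log:
    attempts.setdefault(student, []).append((when, mark))

for student, tries in sorted(attempts.items()):
    first = min(t for t, _ in tries)       # first attempt
    best = max(m for _, m in tries)        # mark Blackboard records
    lead = deadline - first                # how far ahead of the class
    style = "resubmitted" if len(tries) > 1 else "single attempt"
    print(student, len(tries), "attempts, best mark", best,
          "started", lead, "before class -", style)
```

Even this trivial pass distinguishes the student who grinds the quiz up to full marks the night before from the one who answers once, two days early, and moves on - exactly the behavioural inference the post is worried about.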

There is a way to square the circle: transparency. We need to make people much more aware of what data is being collected about them and what its implications are. That's where the NSA went wrong. We in higher education should not follow the same path.
