Tuesday, October 16, 2012

#DarkSocial - The Results Are In

At the weekend The Atlantic published an article which has caused a bit of a stir: Dark Social: We Have the Whole History of the Web Wrong. This was timely for me, as it runs head-on into the draft of a post I was planning for this week.

For the past three years I (or we, little w) have been using social networks with students, starting with Friendfeed and moving on to Google+ last year. We have always "forced" students to use the network (alongside face-to-face and email support) by means of assessment, even if that only amounted to a small proportion of their overall marks. I've never been happy about doing this, but the argument was that it was for the greater good (by generating a network/cohort effect). I've never been convinced I was doing the right thing, that the ends justified the means, and faced with growing evidence to the contrary (e.g. What Students Want), this year I stopped.

Students were given a choice of signing up to Google+ to receive support, or relying on face-to-face help sessions and/or email support. I considered using my old personal Facebook page, which I've never really found a use for, as an alternative channel, but in the end I didn't, because I couldn't afford to spread my time too thinly across too many different channels (adding email to Google+ and f2f was already a considerable increase in workload for me). And now the results are in. You want numbers? I got numbers.

29% of students have signed up for Google+ so far (cf. ~99% under the "compulsion" of assessment). Doesn't sound too bad? I haven't got to the bad news yet. Of those, only 0.7% of the cohort have become active contributors. That's pretty much in line with the conversion rate to long-term active users from previous years, and also with the Nielsen participation inequality ratio. We know that 5-10% of students are Visitor lurkers, never actively contributing but occasionally drawing information from the network - again, in line with the Nielsen ratio.
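To make the comparison with Nielsen's 90-9-1 participation inequality model concrete, here is a back-of-the-envelope sketch. The cohort size of 200 is purely hypothetical (the post doesn't give one); only the percentages come from the figures above.

```python
# Illustrative comparison of observed participation with Nielsen's
# 90-9-1 model (1% creators, 9% intermittent contributors, 90% lurkers).
# NOTE: the cohort size is a hypothetical figure, not from the post.
cohort = 200

signed_up = round(cohort * 0.29)      # 29% joined Google+ voluntarily
active = round(cohort * 0.007)        # 0.7% became active contributors
lurkers_low = round(cohort * 0.05)    # 5-10% observed as Visitor lurkers
lurkers_high = round(cohort * 0.10)

nielsen_creators = round(cohort * 0.01)  # ~1% creators in Nielsen's model

print(f"Signed up: {signed_up} of {cohort}")
print(f"Active contributors: {active} (Nielsen predicts ~{nielsen_creators})")
print(f"Lurkers observed: {lurkers_low}-{lurkers_high}")
```

In other words, even the voluntary sign-ups dwarf the handful of students who actually contribute, which is the point of the paragraph above.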

Does this mean that social is working? I say no. DarkSocial (mostly email) outperforms social by anything from 3:1 to 10:1, depending on what I measure and how I measure it. What does "perform" mean? It means actively engage - ask questions, contribute information - interactions other than passive consumption. For me, that's a pretty damning indictment of social technologies. Who cares? I do. This inefficient DarkSocial technology does not scale in the same way that networks do. There is no cohort (peer) effect. And students become passive information consumers rather than producers.

There's no doubt that DarkSocial "works", in the way that Web 1.0 "worked" as a filing cabinet or a noticeboard, in the way that a VLE "works". My @leBioscience project is doing great, performing exactly as I want it to, where I have promoted email as the primary distribution channel (82% of subscribers) - lots of passive readers (but no interaction). That lack of engagement is not what I want when I teach.

Why is DarkSocial so tenacious? Bring Your Own Device (BYOD) has not really impacted on us as much as I expected it to this year, but mobiles have - which is problematic since many apps are inadequate for academic use, lacking functionality (the Google+ app being a case in point). I believe I am seeing the influence of inadequate, underpowered mobile clients, although that is not the complete reason for the failure of social. Email certainly works well on mobile these days - the CrackBerry Effect.

Does my experience of #cfhe12 back up the idea of DarkSocial? Maybe; it's certainly been an underwhelming experience so far, and not what I was hoping for from a cMOOC (other explanations are equally possible ;-). Why does any of this matter? I need to know where I'm going, what to invest in. Social has been oversold. Is it time to invest now, or to pull out?

See also: GigaOM: Dark social: Why measuring user engagement is even harder than you think


  1. Our first year biology students have access to a Blackboard discussion forum. This is used for academic and admin discussions, with considerable input from the Head of First Year.

    Reading the discussion forum or posting in it is not compulsory and no assessment is linked to the forum.

    Analysis of last year's data shows that all students accessed at least some of the discussion posts.

    And the active participation rates were high:

    90% of the postings were contributed by 20% of the participants - rather than 1% in Nielsen's model.

    63% of students posted at least once to the forum, so the percentage of non-active participants was 37%, rather than 90% in Nielsen's model.

    What are we doing right?

  2. I wish I knew. I've heard similar stories from other people, but my own experience has been to the contrary, and I've also heard lots of stories where VLE discussion boards are a desert. Clearly local circumstances affect behavior considerably.

  3. I think before declaring that social is not working we have to think about when social does work. Is it working for you or me? Why and how? It works for us because what we are learning feels quite different to what students are often learning. That's my guess.

  4. Social is working for me, most of the time - i.e. except when I'm trying to communicate with my colleagues who do not use social media. But the data shows that it is clearly not working for most (99%) of my students. I'd say the difference is at least partly due to Visitor and Resident behavior: but it's also due to accepting a culture of openness. (Or not.)