Week 9 summary

At the beginning of the week I focused on looking around to get some basics about Learning Analytics (LA), adding links to the HEA and Jisc, for example. Overall, I got the sense that the advantages of LA for the learner were that the institution could provide support and guidance. The proof of this seemed to be increased retention. However, I then explored many examples of algorithms gone wrong and the human impact of this.

I have been an LA sceptic without having in-depth knowledge of the topic, and I was quite surprised that during the tutorial many shared my cynicism and highlighted the need for more qualitative and contextualised analysis.

As education has higher and higher student numbers and fewer teachers, LA is presented as yet another way to solve this problem – I tried to show this in my image – but I don’t believe that it will. In every conversation I have ever had with others, LA is framed in terms of surveillance: essentially, ‘we need to track our content to prove the student didn’t engage’. Every time, I ask why. ‘Proving’ that a student has clicked on content is absolutely no proof of engagement. Only a human making contact, face to face or virtually, can deduce this and know whether the student is having problems or is simply working at their own pace and timetable. I added my own annotated LARC report to help me frame this.

In addition, there is the issue of ethics around the collection and analysis of all this ‘big data’, and I tried to highlight this by including the blog post by Lorna Campbell, who tried to question a software company on its data collection policies.

I ended the week by including a Storify and a TAGS Explorer map of the very informative and enjoyable Tweetorial, trying to visualise the conversation, as this helps me see past the numbers.

One Reply to “Week 9 summary”

  1. ‘the advantages of LA for the learner were that the institution could provide support and guidance. The proof of this seemed to be increased retention.’

    Yes, that does seem to be the way analytics is framed in a lot of cases. How much of this is driven by institutions wanting to improve retention figures, though, rather than actually improving learning experiences for their students?

    ‘As education has higher and higher student numbers and fewer teachers, LA is presented as yet another way to solve this problem’

    Yes, it does seem quite difficult for learning analytics projects to escape an underlying interest in efficiency. Indeed, in one of the keynotes at the LAK conference (http://lak17.solaresearch.org) I attended last week, the speaker pretty much concluded that analytics is only needed in large classes where the teacher has little contact with students. You make a good point about the face-to-face here; however, I wonder if it is too easy to say, well, just get more teachers? Isn’t analytics providing a solution to an unfortunate but very real problem?

    I really liked your Storify presentation of the Tweetorial, and you might want to compare this, and the TAGS one, with the Tweetarchivist presentation this week. What do you ‘get’ from each of these visualisations: what kind of ‘community’ are they conveying, and what are you ‘supposed’ to understand from them? How do they compare with your own memory of experiencing the week?
