Week 9 – Weekly Synthesis

Week 9 already! Wow!

This week’s Lifestream activity has been dominated by the group ‘Tweetorial’, in which we investigated some of the topics and issues highlighted in the recommended viewings and readings. In summarising my Tweetorial activity, I would note that I contributed to discussion threads around the following key themes concerning Big Data and Learning Analytics (LA):

  • Ethical considerations
  • Social media’s influence on algorithmic culture
  • Big Data’s influence over students
  • Algorithmic pattern identification
  • Dependence on analytics

I felt it essential to explore the vastness of Big Data and to consider the implications of identifying patterns when it is analysed. This week’s recommended material seemed to focus either on how data is gathered and analysed or on the resulting consequences for students. I therefore became increasingly interested in the gap between Big Data and hypotheses, and in what new knowledge we can discover from the space in between. My ‘Analyzing and modeling complex and big data’ post attempted to address this issue.
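To illustrate the kind of hypothesis-free pattern finding I have in mind, the sketch below is a minimal, entirely hypothetical example: the dataset, the activity measures and the number of clusters are all invented, and a real Learning Analytics service would of course work very differently. It simply lets a clustering algorithm surface groupings in simulated student activity data that nobody asked it to look for.

# A minimal, hypothetical sketch of algorithmic pattern identification:
# clustering simulated weekly student activity without starting from a hypothesis.
# All figures and feature names below are invented for illustration only.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=42)

# Invented measures per student: [VLE logins, forum posts, minutes of video watched]
activity = np.vstack([
    rng.normal(loc=[30, 10, 120], scale=[5, 3, 20], size=(50, 3)),  # more active profile
    rng.normal(loc=[10, 1, 30], scale=[4, 1, 10], size=(50, 3)),    # less active profile
])

# Standardise the measures so no single one dominates the distance calculation
scaled = StandardScaler().fit_transform(activity)

# Ask the algorithm for two groupings; we have not hypothesised what they might mean
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaled)

# Inspect the average raw activity in each discovered cluster to interpret the pattern
for label in np.unique(kmeans.labels_):
    centre = activity[kmeans.labels_ == label].mean(axis=0)
    print(f"Cluster {label}: logins={centre[0]:.1f}, posts={centre[1]:.1f}, video_mins={centre[2]:.1f}")

Even this toy example shows why the ‘space in between’ interests me: the clusters emerge from the data itself, and it is only afterwards that a human has to decide what, if anything, they mean for real students.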

Following on from the ‘Tweetorial’, I was motivated to explore some of the issues raised and to put them into a relevant context. My ‘Learning Analytics – A code of practice’ post summarised my investigation into a JISC-funded LA project in which the project team addressed many (if not all) of my concerns around ethics and student intervention. In hindsight, I had only really considered LA from the perspective of the institution and the learner – not of the individual as a person.

It was another enjoyable week and I’d like to thank my tutors and peers for a very engaging Tweetorial.


Learning Analytics – A code of practice

This week’s Tweetorial highlighted areas of Learning Analytics (LA) that I was interested in investigating further – in particular ethics and student intervention.

Until recently I had only a vague awareness of a JISC-funded project aimed at developing a Learning Analytics service for UK colleges and universities (JISC, 2015). I decided to delve into the project’s Code of Practice to gain a clearer understanding of how the education sector currently addresses some of the issues that we have been discussing this week.

During the Tweetorial, James Lamb asked the #mscedc group:

[James’ tweet]

I responded by tweeting:

[Stuart’s tweet]

Therefore, I was relieved to read that JISC acknowledge that “Institutions recognise that analytics can never give a complete picture of an individual’s learning and may sometimes ignore personal circumstances” (JISC, 2015).

What I also found particularly interesting when reviewing the Code of Practice were the guidelines relating to student access to analytical data. JISC stress that “If an institution considers that the analytics may have a harmful impact on the student’s academic progress or wellbeing it may withhold the analytics from the student, subject to clearly defined and explained policies” (JISC, 2015).

I found this fascinating, as we have been considering the potential consequences for students of comparing analytical output against an institution’s performance benchmarks. What I hadn’t considered was how a student’s performance might be affected by viewing their own analytical data.


References

JISC (2015). Code of practice for learning analytics. Retrieved 18 March 2017, from https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf