Lifestream Blog – Final Summary

My lifestream blog contains a blend of sourced and composed resources that reflect the key themes of Education and Digital Cultures. To fully explore each theme I conducted a series of practical exercises to gain insight from both an institutional and individual perspective. The content of my blog highlights many different points of view on each theme and is reinforced by experimentation that ultimately allowed me to construct knowledge of each topic through experience.

I was intrigued by cybercultures and the concept of posthumanism. It would appear that the human race is no longer satisfied with colonising digital territories and now seeks to infuse our minds and bodies with technology. I learned of an ethos that digital is better and that mechanical intervention will inevitably lead to progress, whilst acknowledging the antithesis and realising that this may not always be the case.

The political and economic factors (Lister et al., 2009) influencing digital education also intrigued me. This was most evident in my micro-ethnography, where economic gain was the driving force of the MOOC in which I participated. My micro-ethnography would suggest that there are indeed limitations within an LMS that contribute to the perception of online community cultures, but that they only exaggerate circumstances that often originate outwith digital spaces.

As with most scenarios where the physical and digital worlds intersect, there are inevitably ethical considerations to acknowledge. I noticed that ethics was a recurring theme throughout each block of the course, be it the ethics surrounding cyborgs, online communities, or analytics and big data. I learned that no matter how great and efficient digital cultures make us, we are still human beings with qualities and principles that cannot be expressed digitally – ethics and responsibility being the two most relevant to the course.

Throughout the course I have questioned if, as human beings, we are supposed to benefit as individuals from digitisation – particularly when studying algorithmic cultures. In studying my own performance and analytical data from an online learning activity, I gained experience of the impact that exposure to learning statistics has on students. I realised that whilst big data and analytics support the notion that digital is better, within education this may only ring true for the institution and not the individual. This was an invaluable experience in connecting my understanding of the course themes to the content of my lifestream blog.

My lifestream blog shows the ubiquity of digital cultures in business, politics, education and everyday life. Our internet browsing trends, shopping habits, and social media interactions are being shaped and influenced by digital trends set by computer interpretation of our behaviours and actions. Education is merely another strand of life that is being made more efficient, accessible and available by digital intervention.

In conclusion, one could also observe a shift in digital culture over time. In the early stages, the purpose of digitisation was to assist humans with basic tasks. This gradually evolved into machines performing complex tasks and exceeding the limitations of the human form. In the present, we are using technology as an alternative form of intelligence and as a tool for efficiency and predicting the future. Certainly, if transhumanism and cyberpunk ideologies come to pass, then the human form will play a lesser role in both education and wider society.


References

Lister, M., Dovey, J., Giddings, S., & Kelly, K. (2009). Networks, users and economics. In M. Lister (Ed.), New media: a critical introduction (pp. 163-236). London: Routledge.

Week 11 – Weekly Synthesis

This week I have been reading through my Lifestream blog and tidying up my tags and metadata.

In a way, I have been analysing some fairly low-key learning analytics on myself. After adding and editing the tags on all of my posts, I was able to quickly identify the key themes that I have studied over the duration of this course. Until recently, I had considered basing my assignment on algorithmic cultures and their influence in shaping digital education. However, after reviewing my tags I noticed that over the last 11 weeks I have shown the strongest interest in community cultures – particularly within MOOCs. As a student, it was the first time I have been able to use data to make an informed and proactive decision about my studies with no influence from my tutors or the University as an institution.
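This kind of low-key tag analysis amounts to little more than counting tag frequencies across posts. As a rough sketch of the idea (the tag lists below are invented examples, not an actual export of my blog):

```python
from collections import Counter

# Hypothetical per-post tag lists, as might be exported from a blog
post_tags = [
    ["community-cultures", "mooc"],
    ["algorithmic-cultures", "big-data"],
    ["community-cultures", "mooc", "ethnography"],
    ["cyberculture", "posthumanism"],
    ["community-cultures", "mooc", "metaphor"],
]

# Flatten the lists and count how often each tag occurs
tag_counts = Counter(tag for tags in post_tags for tag in tags)

# The most frequent tags point to the dominant themes
for tag, count in tag_counts.most_common(3):
    print(f"{tag}: {count}")
```

With these invented tags, ‘community-cultures’ and ‘mooc’ surface as the leading themes – mirroring the decision-making process described above.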

I used the rest of this week to dabble in some of the less popular themes within my blog, such as the use of music in education. My tweet referencing a blog that draws metaphoric comparisons between MOOCs and music albums killed two birds with one stone. I was able to revisit the music themes whilst exploring the use of metaphors with community culture.

I have also visited the blogs of my peers to see what technologies have been used over the course to create artefacts and ethnographies, in the hope of finding some inspiration for presenting my digital essay.


Week 10 – Weekly Synthesis

It was good to catch up with the group during this week’s Google Hangout. I always enjoy discussing recent tasks and themes with my peers, as I always find new and interesting points to consider as a result.

One example of such a point was considering the difference between text-based communications on Twitter and those within a MOOC. My hand-drawn diagram within the ‘#mscedc Digital Cacophony – Tweetorial vs MOOC’ post suggests that, despite Twitter not being purpose-built for education, I found it to be a more suitable forum for group discussion.

Following the Tweetorial I further investigated the need for analytics and big data within education. In the post entitled ‘Big Data, the Science of Learning, Analytics, and Transformation of Education’, Candace Thille noted that online environments encourage students to collaboratively move towards set goals whilst being able to synthesise knowledge to apply in new contexts. It is that ability that held my interest throughout the week and became a consideration that I took into my critical analysis of the Tweetorial.

For my critical analysis I examined the analytical data from the Tweetorial. I found myself comparing my performance to that of my fellow students and documenting my thoughts from both an individual and a collaborative perspective. The data would indicate that I made a lower than average contribution, which on initial observation could be interpreted negatively. However, I felt that I both contributed and received useful information throughout the activity and constructed new knowledge as a result.

I felt that this week afforded me the opportunity to gain first-hand experience of the topics and themes that I have been studying.


Stuart Milligan the Tweetorial participant vs Stuart Milligan the student – A critical analysis

Introduction

Week 9 of Education and Digital Cultures was my first experience of a ‘Tweetorial’. It was a very public way for our group to explore the topic of ‘Learning Analytics and Calculating Academics’. The openness was certainly consistent with the ethos of the course as a whole. The activity encouraged the group to engage with each other (and indeed the wider Twitter community) to discuss a range of topics that were explored throughout the previous few weeks. The benefit of using Twitter to facilitate the activity was the ability to gather data and analytics using the #mscedc hashtag and some Twitter-related data-archiving tools.

I had mixed feelings about participating in such an expanded forum. A combination of fears – exposing my learning to a huge and unfamiliar mass of people, time constraints, and the 140-character message limit – contributed to my less-than-average participation throughout the activity. Overall, however, I felt that I had made a decent contribution to the Tweetorial.


Summary

The Tweet Archivist data added much-needed context to a seemingly fathomless digital abyss. One surprising statistic was that around 700 tweets (at the time of writing) were posted during a 19-day period. In my self-defined role as a ‘small contributor/big lurker’, at no point during the Tweetorial did I feel aware of the high volume of activity going on around me; it is only on reflection that I can believe this statistic. I find it interesting that the total number of text-based contributions during the Tweetorial mirrors that of an average discussion forum that I observed within the ‘Internet of Things’ MOOC. Despite this similarity, I cannot say that I was aware of the same “digital cacophony” (Milligan, 2017) that I experienced whilst conducting the micro-ethnography on the IoT MOOC.

The Tweetorial can be considered a success when comparing the final analysis with the objectives identified prior to the start of the activity. The aim of the Tweetorial was to conduct “some intensive tweeting around the ideas raised in weeks 8 and 9 of the course”. The top word analysis successfully identified and summarised the key words and discussion topics that emerged throughout the preceding eight weeks of the course.


Analysis

Some of the final statistics had a sobering effect on me when I contrasted them with my own evaluation of my contribution to the Tweetorial – most notably the top user and user mention statistics. Prior to reading the final analysis, I was content with my contribution and felt that I had taken part in most discussion threads and had a decent input to the Tweetorial. However, after realising I was ranked 18th (out of 25) in the top user table and that I did not feature in the user mention rankings at all, I felt somewhat deflated. Based on this, I felt relatively insignificant to both the activity and the wider Twitter community, whilst also feeling slightly embarrassed and disappointed in myself. As Kohn (1999) suggests, exposing students to ranking systems turns education into a competitive process rather than a learning one.

Seeking solace, I investigated the analytics associated with my own Twitter account. I was uplifted to read that during the same 19-day period my own tweets:

  • had 3600 impressions
  • received 39 likes (avg 2 per day)
  • received 15 replies (avg 1 per day)

From an individual perspective, I was generally happy with these statistics and was relieved when I compared them with the same metrics for the group. I was therefore afforded the opportunity to appreciate that general analysis of big data often neglects the circumstances and performance of the individual. Though my performance was considerably lower than that of my peers, I certainly felt that I constructed knowledge and made a contribution to the Tweetorial with which I am happy.
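For what it is worth, the per-day figures quoted above are simple averages over the 19-day window; a trivial sanity check, using the totals from the list above:

```python
# Tweetorial analytics over the 19-day window (totals as listed above)
days = 19
impressions, likes, replies = 3600, 39, 15

likes_per_day = likes / days              # 39 / 19 ≈ 2.05, i.e. roughly 2 per day
replies_per_day = replies / days          # 15 / 19 ≈ 0.79, i.e. roughly 1 per day
impressions_per_day = impressions / days  # 3600 / 19 ≈ 189 per day

print(round(likes_per_day), round(replies_per_day), round(impressions_per_day))
```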

Personal analytics


Conclusion

In conclusion, as a learner I feel that there was little educational value in having access to analytic data on my performance within the Tweetorial. If anything, reviewing the data made me feel apprehensive and worried about my performance in comparison to my peers, whereas my individual analysis proved quite pleasing. I felt that I had contributed enough both to learn from and to contribute to the activity; the only doubts I had were a direct result of comparing myself with others.

Due to the nature of the activity, I felt very limited by having no opportunity to revisit the Tweetorial and make additional contributions to alleviate my concerns. However, I do wonder if further learning could have been achieved had I had the opportunity to make more contributions. I could potentially have fallen into the trap of tweeting for the sake of tweeting, just to improve my statistics, which would have had little or no benefit for either the group or myself.


References

Kohn, A. (1999). From Degrading to De-Grading. Retrieved 24 March 2017. http://www.alfiekohn.org/article/degrading-de-grading/

Milligan, S. (2017). ‘The Internet of Things MOOC’ – First Impressions. Retrieved 24 March 2017. http://edc17.education.ed.ac.uk/smilligan/2017/02/12/the-internet-of-things-mooc-first-impressions/

Liked on YouTube: Big Data, the Science of Learning, Analytics, and Transformation of Education

From the mediaX conference “Platforms for Collaboration and Productivity”, Candace Thille of the Stanford Graduate School of Education highlights the power of platform tools and technologies to transform observation and data collection. This process enables researchers from industry and academia to know their users better – as consumers, as producers, and as learners.
via YouTube https://youtu.be/cYqs0Ei2tFo

Week 9 – Weekly Synthesis

Week 9 already! Wow!

This week’s Lifestream activity has been dominated by the group ‘Tweetorial’ in which we investigated some topics and issues highlighted in the recommended viewings and readings. In summarising my Tweetorial activity, I would note that I contributed to discussion threads surrounding the following key themes concerning Big Data and Learning Analytics (LA):

  • Ethical considerations
  • Social media influence on algorithmic culture
  • Big data influence over students
  • Algorithmic pattern identification
  • Dependence on analytics

I felt it essential to explore the vastness of Big Data and to consider the implications of identifying patterns when it is analysed. I felt that this week’s recommended material focused on either how data was gathered/analysed or the resulting consequences for students. Therefore, I became increasingly interested in the gap between big data and hypotheses and what new knowledge we can discover from the space in between. My ‘Analyzing and modeling complex and big data’ post attempted to address this issue.

Following on from the ‘Tweetorial’ I was motivated to explore some of the issues raised to put them into a relevant context. My ‘Learning Analytics – A code of practice’ post summarised my investigation into a JISC funded LA project in which the project team addressed many (if not all) of my concerns around ethics and student intervention. In hindsight, I had only really considered LA from the perspective of the institution and the learner – not of the individual as a person.

It was another enjoyable week and I’d like to thank my tutors and peers for a very engaging Tweetorial.


Learning Analytics – A code of practice

This week’s Tweetorial highlighted areas of Learning Analytics (LA) that I was interested in investigating further – in particular ethics and student intervention.

Until recently I had only a vague awareness of a JISC-funded project aimed at developing a learning analytics service for UK colleges and universities (JISC, 2015). I decided to delve into the project’s Code of Practice to gain a clearer understanding of how the education sector currently addresses some of the issues that we have been discussing this week.

During the Tweetorial, James Lamb asked the #mscedc group:

James’ Tweet

I responded by tweeting:

Stuart’s Tweet

Therefore, I was relieved to read that JISC acknowledge that “Institutions recognise that analytics can never give a complete picture of an individual’s learning and may sometimes ignore personal circumstances”.

What I also found to be of high interest when reviewing the Code of Practice were the guidelines relating to student access to analytical data. JISC stress that “If an institution considers that the analytics may have a harmful impact on the student’s academic progress or wellbeing it may withhold the analytics from the student, subject to clearly defined and explained policies.”

I found this fascinating as we have been considering the potential consequences for students based on the comparison between analytical output and an institution’s performance benchmarks. What I hadn’t considered is how a student’s performance may be affected by viewing their own analytical data.


References

JISC. (2015). Code of practice for learning analytics. Retrieved: 18 March 2017. https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf