I started my week trying to find out what exactly algorithms are. I had a vague understanding that they are part of the code that looks for patterns and then changes the functionality of certain online spaces, usually to do with shopping and social media. I’ve mostly come across them through social media feeds, where influencers are usually advocating for you to turn on notifications about their posts. What surprised me was that when I started looking for information about how algorithms work, information on how to manipulate them popped up almost as often.
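To test my own understanding, I sketched out a toy version of what I imagine a feed-ranking algorithm might do: look at a few engagement signals, score each post, and reorder the feed. This is purely my own hypothetical illustration (the account names and weights are made up), not how any real platform actually works.

```python
# A toy sketch of feed ranking: score posts from engagement signals and
# from whether I follow the author, then reorder the feed by that score.
# Hypothetical illustration only; not any real platform's algorithm.

posts = [
    {"author": "influencer_a", "likes": 950, "comments": 120},
    {"author": "friend_b", "likes": 14, "comments": 3},
    {"author": "shop_c", "likes": 300, "comments": 45},
]

followed = {"friend_b"}               # accounts I have chosen to follow
notifications_on = {"influencer_a"}   # accounts I turned notifications on for

def score(post):
    # Engagement signals: weight comments more heavily than likes.
    s = post["likes"] + 5 * post["comments"]
    # Boost content from accounts the user has actively opted into.
    if post["author"] in followed:
        s *= 2
    if post["author"] in notifications_on:
        s *= 3
    return s

# The "algorithm" simply reorders my feed by these scores.
for post in sorted(posts, key=score, reverse=True):
    print(post["author"], score(post))
```

Even this crude sketch made it clearer to me why influencers push the notifications button: opting in is a signal the ranking can latch onto.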
I was trying to think about how algorithms might influence education and where they might fall short when I stumbled upon the amazing Joy Buolamwini. She highlighted the real consequences of a lack of diversity in programming and how it can impact technology in ways we do not expect. It was evident from her experience that technology rendered her invisible by failing to read her features. I wonder how many other invisibilities are not yet evident.
We met for our weekly Skype group, and some of the bigger themes emerging from that conversation were about how algorithms are used for control and surveillance. We wondered whether this might cause students from certain ethnic or socio-economic backgrounds to be marginalised.
Kevin Slavin’s TED talk, ‘How algorithms shape our world’, was really insightful about how algorithms interconnect. The ‘ecosystem’ metaphor Slavin used echoed ‘Active algorithms: sociomaterial spaces in the E-learning and Digital Cultures MOOC’ (Knox 2014).
It was in this vein that I found Hack Education’s article ‘The Algorithmic Future of Education’. Watters highlights the marketization of education and how important ‘care’ is when dealing with students.
I rounded the week off working with Stuart by comparing how algorithms work in different online spaces.
Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1
‘when I started looking for information about how algorithms work, almost as often information on how to manipulate them popped up’
That’s a really interesting insight, and says quite a bit about ‘where we are’ with algorithms presently – people are perhaps becoming more aware of what algorithms are and how they work, but just who those people are and what their motivations are for doing it would be revealing. One ends up with quite a strange situation when people’s activity online is in response to what they think the algorithm is doing, rather than just ‘using’ social media…
Joy Buolamwini’s work is great, isn’t it? Really like this, such a superb example of algorithmic bias.
Your collaborative work with Stuart is absolutely brilliant – so much so, I’m thinking about something like this for future iterations of the course! The experimental nature of comparing two different individuals’ ‘algorithmic play’ is great, and really shows potential differences, which aren’t always evident to the individual. Fantastic work, you two!