This is a quick mind-map made after reading Jeremy Knox's paper 'Active Algorithms' (2015).
The article was a really useful and clear link between the community cultures we have been studying on ECD and the algorithmic cultures we are beginning to look at now, demonstrating how the technical (algorithmic), the social and the material come together to constitute situations in which agency becomes blurred and impossible to locate.
I explored the concept of sociomateriality in my last MSc DE module, focusing on it as an approach for IT personnel to examine the ways in which technologies are designed, supported and used. Knox's paper has added an extra dimension to this study of sociomateriality by thinking of it in spatial terms – the contingent and complex enactment of a learning space as enabled by the social and the material.
There is a tendency to think of coding and algorithms as non-human agentic forces, forgetting the very human intention that has gone into their compilation. Staying in the non-human camp, though, and after my experience of IFTTT, I would add breakdown and the intermittent loss of connectivity and functionality as threads in the entanglement. These happenings are a very real and affective part of our experience of technology, and they are often due to emphatically material failure.
I have just started reading the Gillespie article, which makes me want to investigate our creeping acceptance of algorithmic control. We acquiesce in Google's algorithms because we find it such a useful search engine; we make ourselves marketing targets because online retailers are so convenient. We submit to the narrowing echo chambers of our social media sites, threaded with popular news items (even fake ones), because it is great to keep up with our friends.
It is interesting that the government carefully researches 'nudging' us into making 'better choices for ourselves' (the Behavioural Insights Team) whilst watching global corporations use every trick in the book to relieve us of our cash.