Week 8 Lifestream Summary

It’s been an interesting week experimenting with algorithms. I’ve enjoyed trying to ‘reverse engineer’ the Amazon recommendation algorithm and, ultimately, going some way toward disproving my own hypotheses.
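For readers unfamiliar with how such recommendations can work under the hood, here is a minimal sketch of item-to-item collaborative filtering, the general style of technique Amazon has publicly described for its recommendations. The purchase data and item names are invented purely for illustration; real systems are far more elaborate.

```python
# Toy item-to-item collaborative filtering: recommend items that are
# frequently bought together with a given item.
from collections import defaultdict
from itertools import combinations

# Each entry is the set of items one (hypothetical) customer bought.
baskets = [
    {"kettle", "teapot", "mug"},
    {"kettle", "mug"},
    {"teapot", "tea"},
    {"kettle", "teapot"},
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(item, k=2):
    """Return up to k items most often bought alongside `item`."""
    ranked = sorted(co_counts[item].items(), key=lambda kv: -kv[1])
    return [name for name, _ in ranked[:k]]

print(recommend("kettle"))
```

Even this crude co-occurrence count shows why recommendations can feel both uncanny and banal: the system knows nothing about the items themselves, only about patterns of behaviour.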

Reflecting on the cultural aspects of algorithms, I see dichotomies in the views people hold about them that are similar to those documented in the literature on cyberculture and community culture. To me this is clearly a linking theme, and I see possibilities in exploring it in my final assignment for this course.

As with many other topics, the views people hold are likely to be heavily influenced by the media and, just as all things ‘cyber’ are often painted as worthy of suspicion, this does seem to be a ‘go to’ stance for copywriters and producers when taking a position on algorithms. The banking crisis is probably the biggest worldwide event that has contributed to this, and other stories, such as reliability issues with self-driving cars or the inaccuracy of ‘precision munitions’, add to the general feeling of unease around the use and purpose of algorithms. I chose the latter two examples deliberately, as there is a moral as well as a technical aspect to both.

So the stories about algorithms that help people control prosthetic limbs more effectively, or ‘see’ with retinal implants, or even driverless cars travelling tens of thousands of miles without incident, can be lost amongst more sensationalist stories of those same cars deciding ‘whose life is worth more’ when an accident is unavoidable.

As a result, I wonder how much knowledge the general public has of the algorithms that make their day-to-day life a little easier: better predicting the weather, helping traffic junctions cope with rush-hour traffic, or simply helping people select a movie they’re likely to enjoy.

One could argue that this underlying distrust of algorithms is no bad thing, particularly if it can lead to unbiased critical appraisal of their use in a particular field, as highlighted by Knox (2015) with regard to their use in education:

“Critical research, sometimes associated with the burgeoning field of Software Studies, has sought to examine and question such algorithms as guarantors of objectivity, authority and efficiency. This work is focused on highlighting the assumptions and rules already encoded into algorithmic operation, such that they are considered always political and always biased.”

This week has made me a little more uneasy about the way “algorithms not only censor educational content, but also work to construct learning subjects, academic practices, and institutional strategies” (Knox, 2015). In my professional practice we do not have systems sophisticated enough for this to be a concern, but our learners are exposed to, and learn from, other systems, and their apprehensions about how we might use their data will no doubt be coloured by their view of ‘big data’. With that in mind, this is clearly a subject I should keep on my radar.

Apologies for writing double the word limit for this summary and including new content rather than summarising; it’s one of those subjects where, once you start writing, it’s difficult to stop!


Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1


2 thoughts on “Week 8 Lifestream Summary”

  1. Useful and detailed summary here Nigel – do try to stay within the 250-word limit for these, but yes, sometimes one needs to elaborate. It might be worth trying to extend some of these ideas in a separate post.

    ‘ To me this is clearly a linking theme and I see possibilities in exploring this in my final assignment for this course.’

    Sounds good – so this would be oppositions between ‘pure’ humans and ‘distinct’ technologies? Or dystopian and utopian views of technology? I agree that these are strong themes that run through the blocks. It would be important to see the different ways this dualism comes about in the cyber-, community, and algorithmic themes, though.

    I think you’re right to question some of the dystopian visions of algorithms, and I think we do need to trace the social systems they are embedded in. Some of your examples here are good in this respect: an algorithm that controls a prosthetic limb doesn’t make decisions that ‘plug in’ to the same social conditions as, say, a social media news feed. They are both ‘algorithms’, but the ‘algorithmic system’ involved is more complex?

    1. Yes, I was thinking specifically of the dystopian and utopian views of technology, and I think I can construct an education-based narrative from the available texts that would include all three of our themes. That said, Hand’s ‘Narratives of promise and threat’ sets the benchmark very high!
      > an algorithm that controls a prosthetic limb doesn’t make decisions that ‘plug in’ to the same social conditions as, say, a social media news feed.
      No, not directly. The point I was trying to make is that education (or any other field, for that matter) does not exist in a bubble. Public opinions, and therefore cultural shifts, are often based on more generalised views. In this respect, whether an algorithm decides what I see in my recommended shopping list or stops a plane from falling out of the sky is almost irrelevant. In my opinion, it’s the overall ‘good’ or ‘bad’ reputation a type of technology has (largely created by the popular media) that results in a dystopian or utopian view. In this respect, ‘algorithm’ becomes shorthand for something marvellous or malevolent, depending on your opinion.
