It’s been an interesting week experimenting with algorithms. I’ve enjoyed trying to ‘reverse engineer’ the Amazon recommendation algorithm and, ultimately, going some way toward disproving my own hypotheses.
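For readers curious what a recommendation algorithm actually does under the hood, here is a deliberately tiny sketch of item-to-item collaborative filtering, the approach Amazon has described publicly (Linden et al., 2003). The purchase data and item names are invented for illustration; a real system would use weighted similarity scores over millions of items rather than raw co-purchase counts.

```python
# Toy item-to-item collaborative filtering: recommend items that are
# frequently bought together with a given item. Illustrative only.
from collections import defaultdict
from itertools import combinations

# Hypothetical purchase histories: user -> set of items bought.
purchases = {
    "alice": {"book_a", "book_b", "book_c"},
    "bob":   {"book_b", "book_c"},
    "carol": {"book_a", "book_c", "book_d"},
}

# Count how often each ordered pair of items appears in the same basket.
co_counts = defaultdict(int)
for items in purchases.values():
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Return up to k items most often co-purchased with `item`."""
    scored = [(other, n) for (i, other), n in co_counts.items() if i == item]
    scored.sort(key=lambda pair: (-pair[1], pair[0]))  # most frequent first
    return [other for other, _ in scored[:k]]

print(recommend("book_b"))  # book_c co-occurs twice, book_a once
```

Even this toy version shows why such systems feel opaque: the "reasoning" behind a recommendation is nothing more than patterns in other people's behaviour, encoded as counts.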
Reflecting on the cultural aspects of algorithms, the views people hold about them seem to split along dichotomies similar to those we saw documented in the literature on cyberculture and community culture. To me this is clearly a linking theme, and I see possibilities in exploring it in my final assignment for this course.
As with many other topics, the views people hold are likely to be heavily influenced by the media and, just as all things ‘cyber’ are often painted as worthy of suspicion, suspicion does seem to be the ‘go to’ stance for copywriters and producers when taking a position on algorithms. The banking crisis is probably the biggest worldwide event to have contributed to this, and other stories, such as reliability issues with self-driving cars or the inaccuracy of ‘precision munitions’, add to the general feeling of unease around the use and purpose of algorithms. I chose the latter two examples deliberately, as both have a moral as well as a technical aspect.
So the stories about algorithms that help people control prosthetic limbs more effectively, or ‘see’ with retinal implants, or even driverless cars travelling tens of thousands of miles without incident, can be lost amongst more sensationalist stories of those same cars deciding ‘whose life is worth more’ when an accident is unavoidable.
As a result, I wonder how much knowledge the general public has of the algorithms that make their day-to-day lives a little easier: better predicting the weather, helping traffic junctions cope with rush-hour traffic, or simply suggesting a movie they’re likely to enjoy.
One could argue that this underlying distrust of algorithms is no bad thing, particularly if it leads to unbiased critical appraisal of their use in a particular field, as Knox (2015) highlights with regard to their use in education:
“Critical research, sometimes associated with the burgeoning field of Software Studies, has sought to examine and question such algorithms as guarantors of objectivity, authority and efficiency. This work is focused on highlighting the assumptions and rules already encoded into algorithmic operation, such that they are considered always political and always biased.”
This week has made me a little more uneasy about the way “algorithms not only censor educational content, but also work to construct learning subjects, academic practices, and institutional strategies” (Knox 2015). In my professional practice we do not yet have systems sophisticated enough for this to be a concern, but our learners are exposed to, and learn from, other systems, and their apprehensions about how we might use their data will no doubt be coloured by their view of ‘big data’. With that in mind, this is clearly a subject I should keep on my radar.
Apologies for writing double the word limit for this summary and for including new content rather than summarising; it’s one of those subjects where, once you start writing, it’s difficult to stop!
Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1