I stumbled upon this blog by Audrey Watters and what a find!
This particular blog post was on the Algorithmic Future of Education, framed around three A’s: austerity, automation, and algorithms.
Similar to Boyd & Crawford (2012) and Selwyn (2014), Watters describes the data collected in education as administrative: a way to record and analyse assessment, outcomes, standardisation, and the monitoring and control of labour.
When discussing Artificial Intelligence (AI) and the machine as a tutor, she asks what view of “intelligence” and “learning” is built into machines, and how that might be extrapolated to humans.
Similar to my experience on a Massive Open Online Course (MOOC), she highlights the fact that AI often just marks the answers to multiple-choice questions, which doesn’t require a particularly complicated algorithm. She goes on to discuss the need for personalisation and individualisation of instruction and assessment, mediated through technology, if we are going to learn outside the typical classroom.
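To underline just how uncomplicated that kind of marking is, here is a minimal, purely illustrative Python sketch (my own invention, not any platform’s actual code): multiple-choice auto-marking is little more than a lookup against an answer key.

```python
# Marking multiple-choice answers needs no sophisticated AI:
# just compare each chosen option against a stored answer key.

answer_key = {"q1": "b", "q2": "d", "q3": "a"}

def mark(responses):
    """Return (number correct, total questions)."""
    correct = sum(1 for q, a in responses.items() if answer_key.get(q) == a)
    return correct, len(answer_key)

score, total = mark({"q1": "b", "q2": "c", "q3": "a"})
print(f"{score}/{total}")  # → 2/3
```

Everything Watters lists in the quote below — modelling misconceptions, adapting pedagogy, understanding questions — is exactly what this kind of lookup cannot do.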
“They must be able to account for what students’ misconceptions mean – why does the student choose the wrong answer. Robot tutors need to assess how the student works to solve a problem, not simply assess whether they have the correct answer. They need to provide feedback along the way. They have to be able to customize educational materials for different student populations – that means they have to be able to have a model for understanding what “different student populations” might look like and how their knowledge and their learning might differ from others. Robot tutors have to not just understand the subject at hand but they have to understand how to teach it too; they have to have a model for “good pedagogy” and be able to adjust that to suit individual students’ preferences and aptitudes. If a student asked a question, a robot would have to understand that and provide an appropriate response. All this (and more) has to be packaged in a user interface that is comprehensible and that doesn’t itself function as a roadblock to a student’s progress through the lesson.” (Watters, 2015)
Williamson’s paper looks at how ‘machine learning’ can be used to predict actions, behaviour and attitudes. Big Data, algorithms and learning analytics try to anticipate and predict how people act, governing education in a way that makes learners amenable to pedagogic intervention (Williamson, 2014, p. 97).
As I’ve stated in previous posts, there is a high expectation of technology because of sci-fi. Technology and algorithms may be intelligent, but they do not have a consciousness or an understanding of human tendencies that can transform certain information into knowledge. The fear should not lie in them killing us, becoming superior, or making us redundant and taking our jobs. As Audrey Watters implies, “it’s that they could limit the possibilities for, the necessities of care and curiosity.”
Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662-679. DOI: 10.1080/1369118X.2012.678878
Selwyn, N. (2014). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64-82. DOI: 10.1080/17439884.2014.921628
Watters, A. (2015, October 22). The Algorithmic Future of Education. Retrieved from http://hackeducation.com/2015/10/22/robot-tutors
Williamson, B. (2014). Governing software: Networks, databases and algorithmic power in the digital governance of public education. Learning, Media and Technology, 40(1), 83-105. DOI: 10.1080/17439884.2014.924527
Above are a few screenshots of my Fabletics account after they combined my frequent searches, size and favourite purchased styles into a personalised swimwear collection. Now, I am an individual who lives in active wear and I purchase a lot online, BUT I feel their algorithm may have missed the part that informs them that I live in Scotland. In Scotland, beachwear consists of wellies and a Canada Goose jacket!!
from Flickr http://flic.kr/p/SdMyQs
from Flickr http://flic.kr/p/Sd5U9E
from Flickr http://flic.kr/p/Sd5L2h
As a Dance Educationalist I do not get the luxury of wearing smart dresses or outfits to work. The majority of my work week is spent in active wear, which carries on into my gym sessions and extra-curricular activities with my daughter, dog and the horses. I am, therefore, FOREVER in active wear. I enjoy clothes, so I like to shop online (I mean, who has the time these days to go shopping in person?) for smart outfits despite my informal day-to-day appearance. Fabletics is a website I’ve used for a while and it conveniently caters to my taste, size and lifestyle. The first thing I was required to complete was a ‘pop quiz’ where I answered numerous questions on my activity, my shape and size, and my colour and style preferences. Each month I am sent e-mails and updates of co-ordinated outfits and personal recommendations.

At first, I thought this was wonderful and I felt as if I had an online personal shopper. As it continues, my bank balance suffers and I have enough capri pants to open my own store! Algorithms are not just for the client; they are definitely for the convenience of the company. I now have no need to buy any active wear for a few years. The algorithms at play managed to alter choice by sorting, ranking and creating outfits that I could order. Why buy a top when you can buy an outfit? The algorithm decides what should be visible to me when I open my account, or takes it a step further and sends an e-mail. They create ‘truths’ around my choice, taste and lifestyle (Beer, 2016). If I’ve bought it, then you bet I am wearing it to get my money’s worth!!
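As a purely hypothetical sketch of the sorting and ranking I describe (the item names, attributes and scoring here are my own invention, not Fabletics’ actual system), personalisation can be as simple as scoring each item against a stored ‘pop quiz’ profile and sorting, so the shop decides what I see first:

```python
# Hypothetical preference-based ranking: score each item by how many
# attributes it matches in the stored profile, then sort descending,
# so the best-matching items are what the customer sees first.

profile = {"style": "active wear", "size": "M", "colour": "black"}

items = [
    {"name": "capri pants", "style": "active wear", "size": "M", "colour": "black"},
    {"name": "smart dress", "style": "formal", "size": "M", "colour": "red"},
    {"name": "sports top", "style": "active wear", "size": "M", "colour": "blue"},
]

def score(item):
    # one point per profile attribute the item matches
    return sum(1 for k, v in profile.items() if item.get(k) == v)

recommended = sorted(items, key=score, reverse=True)
print([i["name"] for i in recommended])
# → ['capri pants', 'sports top', 'smart dress']
```

Even something this crude produces the ‘truths’ Beer describes: the ordering itself tells me what my taste supposedly is.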
Beer, D. (2016). The social power of algorithms. Information, Communication & Society, 20(1), 1-13. DOI: 10.1080/1369118X.2016.1216147
from Flickr http://flic.kr/p/Sd5JBU
These TED talks helped put the use of algorithms into perspective and challenged my thinking on how they can influence individuals, communities, work environments and education. What interested me was the connection between how we understand and perceive something and our understanding of knowledge. The MOOC ‘The Brain and Space’ from my mini-ethnography last block covered how our senses and motor systems construct space using vision, hearing, touch, body position, movement and balance. We can see that algorithms take form in a multitude of ways, but the experience and understanding of the data is what makes it accurate. Sometimes people are poor at making decisions and we fall back on algorithms to make choices for us. People will turn to the algorithm to decide outcomes, make decisions, choose who to marry or what to study, or even use the Google search engine to find out facts. However, there are flaws in the algorithm.

Andreas Ekström speaks of us taking multiple facts from Google, then using our critical thought to debate and make our own understanding of the information, which will in turn transfer to our knowledge of a subject. Fei-Fei Li takes us through how teaching computers to understand pictures is in some ways similar to teaching a toddler to process images. Like the ‘Brain and Space’ MOOC, she insists ‘Vision begins with the eyes but truly takes place in the brain’. No one teaches a child to see; they learn through experience. Her approach is therefore to train the machine on images of both quality and quantity. However, the machine cannot appreciate the extra qualities associated with an image, or the chemistry that an algorithm may miss on a dating site. Amy Webb talks about using an algorithm to support her eventful dating experience. What is apparent is the human connection that happens over time in incremental steps, experiencing facts, images or each other, whether through a virtual environment or face to face, no matter the pace.
from Flickr http://flic.kr/p/SfYhrn
from Flickr http://flic.kr/p/SkR8tm