I stumbled upon this blog by Audrey Watters and what a find!
The theme of this particular post was the Algorithmic Future of Education, framed around three A’s: austerity, automation, and algorithms.
Like Boyd & Crawford (2012) and Selwyn (2014), Watters describes the data collected in education as an administrative means of recording and analysing: assessment, outcomes, standardisation, and the monitoring and control of labour.
When discussing Artificial Intelligence (AI) and the machine as a tutor, she asks what view of “intelligence” and “learning” these machines embody, and how that might be extrapolated to humans.
Echoing my experience on a Massive Open Online Course (MOOC), she highlights that the AI assesses answers to multiple-choice questions, which doesn’t require a particularly complicated algorithm. She goes on to discuss the need for personalisation and individualisation of instruction and assessment, mediated through technology, if we are going to learn outside the typical classroom.
“They must be able to account for what students’ misconceptions mean – why does the student choose the wrong answer. Robot tutors need to assess how the student works to solve a problem, not simply assess whether they have the correct answer. They need to provide feedback along the way. They have to be able to customize educational materials for different student populations – that means they have to be able to have a model for understanding what “different student populations” might look like and how their knowledge and their learning might differ from others. Robot tutors have to not just understand the subject at hand but they have to understand how to teach it too; they have to have a model for “good pedagogy” and be able to adjust that to suit individual students’ preferences and aptitudes. If a student asked a question, a robot would have to understand that and provide an appropriate response. All this (and more) has to be packaged in a user interface that is comprehensible and that doesn’t itself function as a roadblock to a student’s progress through the lesson.” (Watters, 2015)
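To underline just how uncomplicated the multiple-choice marking she mentions is, compared with everything a robot tutor would need to do in the passage above, here is a minimal, purely hypothetical Python sketch; the question IDs, answer key, and student submission are all invented for illustration:

```python
# Hypothetical answer key: question ID -> correct option (invented for illustration).
ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a"}

def grade_multiple_choice(responses):
    """Return the fraction of multiple-choice questions answered correctly."""
    correct = sum(1 for qid, choice in responses.items()
                  if ANSWER_KEY.get(qid) == choice)
    return correct / len(ANSWER_KEY)

# A made-up student submission: two of the three answers match the key.
student_responses = {"q1": "b", "q2": "c", "q3": "a"}
print("Score: {:.0%}".format(grade_multiple_choice(student_responses)))  # Score: 67%
```

A dictionary lookup and a count is all it takes, which is exactly why Watters argues this kind of “AI” falls so far short of the tutoring she describes.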
Williamson’s paper looks at how ‘machine learning’ can be used to predict actions, behaviour, and attitudes. Big Data, algorithms, and Learning Analytics try to anticipate and predict how people will act, governing education in a way that makes learners amenable to pedagogic intervention (Williamson, 2014, p. 97).
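To make that point concrete, here is a toy sketch of the kind of predictive learning-analytics model being described. It is a generic logistic-regression illustration, not anything from Williamson’s paper: the features, data, and labels are entirely invented.

```python
# A toy "learning analytics" predictor: estimate which students might be
# flagged for a pedagogic intervention from a handful of activity features.
# The feature names, data, and labels are invented purely for illustration.
from sklearn.linear_model import LogisticRegression

# Features per student: [logins_per_week, videos_watched, quiz_average]
X_train = [
    [1, 2, 0.40],
    [5, 9, 0.85],
    [2, 3, 0.55],
    [6, 10, 0.90],
]
# Label: 1 = was given an intervention, 0 = was not (historical records).
y_train = [1, 0, 1, 0]

model = LogisticRegression().fit(X_train, y_train)

# Predict the intervention "risk" for a new student.
new_student = [[3, 4, 0.60]]
risk = model.predict_proba(new_student)[0][1]
print("Predicted probability of needing intervention: {:.2f}".format(risk))
```

The model does nothing more than extrapolate from past records, which is precisely why this kind of anticipation shapes, rather than simply observes, how learners are treated.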
As I’ve stated in previous posts, there are high expectations of technology because of sci-fi. Technology and the algorithm may be intelligent, but they do not have consciousness or an understanding of the human tendencies that can transform certain information into knowledge. The fear should not lie in them killing us, becoming superior, making us redundant, or taking our jobs. As Audrey Watters implies, “it’s that they could limit the possibilities for, the necessities of care and curiosity.”
References:
Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679. DOI: 10.1080/1369118X.2012.678878
Selwyn, N. (2014). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64–82. DOI: 10.1080/17439884.2014.921628
Watters, A. (2015, October 22). The algorithmic future of education. Retrieved from http://hackeducation.com/2015/10/22/robot-tutors
Williamson, B. (2014). Governing software: Networks, databases and algorithmic power in the digital governance of public education. Learning, Media and Technology, 40(1), 83–105. DOI: 10.1080/17439884.2014.924527