This week's theme has been algorithms, algorithms and algorithms. I must admit that I didn't entirely understand how to explain an algorithm, and this was highlighted on my weekly Skype chat with Dirk, Chenée, Stuart and Anne. I mean, I knew it was a process that involved a sequence of actions to perform calculations, reasoning or data processing, but in regards to computer science I couldn't put it into words. I spent the week looking at videos, a blog, recommended blogs and articles discussing algorithms. I even watched BBC Bitesize to listen to the description in its simplest form. The week involved a video that highlights the importance of female role models in children's books, a bookmark using Medium, a TED talk podcast, an article on students' public and private distribution of identity when involved in social media projects, and how algorithms affect the workplace and/or may influence educational acquirement and attainment. I met with James Lamb to discuss dance in regards to assessment, I played around with an emojicon experiment, and I acknowledged that my Fabletics account may not be so personalised and that they may need to rethink my beachwear options. My YouTube account was overloaded with 360-degree videos because of last week's viewings, and my Facebook account made me laugh as it combined my love of dance and humour in this recommended video. The video that went viral of a BBC broadcast got me to appreciate life behind the scenes, and I was left amazed that scientific research and technology allow us to capture footage of a 20-week foetus through algorithms. The use of algorithms in social media makes me question the need to expand our interactions, and I finish the week with the conclusion that humans have a sense of awareness that can surpass technology and the algorithm.
This week, I have been the main contact for my Higher Dance and HNC pupils. With one week until the external examination they are feeling the pressure and are relying on me, as their teacher, for guidance and pastoral support. I have worked with the pupils over a period of time, which gives great insight and an understanding of the class as a group of individuals with a variety of personalities and learning needs. If I were to record and document their progress using an app such as ClassDojo, certain pupils might well come across as similar in the statistics, but the learning process, performance qualities and personal skills would not transpire. I find it very difficult to accept that algorithms and technology may be responsible for the opportunities available to the future academic generation. Humans have an ability to read each other and process information that algorithms may miss. Eynon (2013) describes Big Data as a 'technical fix' rather than a tool to empower and support. If we were to use Learning Analytics to understand behaviour through a holistic approach, then the patterns would increase in value.
Eynon, R. (2013). The rise of big data: What does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237-240. DOI: 10.1080/17439884.2013.771783
March 12, 2017 at 08:28AM
Posted by Telegraph Science and Tech on Thursday, 9 February 2017
Scientists have developed an algorithm to create the world's most detailed pregnancy scan to date. It really is incredible… This popped up on my Facebook (FB) feed through my browser history and the recommended videos driven by the activity of FB 'friends'. It's like when you want a specific car: you suddenly spot that make and model everywhere. I feel like everyone around me is pregnant, and the algorithms are showing me babies; their activity is influencing my recommendations of videos associated with pregnancy. Arrrggghhh, I feel pressurised into having another baby. Can algorithms influence peer pressure???
Credit: SWNS #mscedc
When live TV goes wrong…This BBC guest's children become the stars of the show.
Posted by BBC News on Friday, 10 March 2017
When live TV goes wrong… or should we say right? This moment, for me, was magical! It brought a smile to my face as the Professor tried to remain composed and professional while his cooler-than-cool toddler swaggers into a live broadcast. Whether on a conference call, taking a virtual class or involved in a study-group Skype chat, our lives can interrupt the moment. Our reaction is what makes us human. This video went viral in a matter of hours, and although it brought a lot of laughter and joy, I couldn't help but feel sad and disheartened at how cruel and judgemental people can be on social media. Online comments were full of vile accusations and assumptions, offering advice on how the individual involved should have handled the situation. Technology allows individuals to work from the comfort of their home and may even capture a moment of their 'home' life, which for me is endearing. We should value the advantages it brings rather than scrutinise, just because we can replay and dissect one's actions.
I stumbled upon this blog by Audrey Watters and what a find!
This particular blog post's theme was the Algorithmic Future of Education, framed around three A's: austerity, automation, and algorithms.
Similar to Boyd & Crawford (2012) and Selwyn (2014), Watters describes the data collected in education as an administrative means to record and analyse: assessment, outcomes, standardisation, and the monitoring and control of labour.
When discussing Artificial Intelligence (AI) and the machine as a tutor, she asks what "intelligence" and "learning" mean for machines, and how that might be extrapolated to humans.
Similar to my experience on a Massive Open Online Course (MOOC), she highlights that AI assesses the answer(s) to multiple-choice question(s), and that this doesn't require a particularly complicated algorithm. She goes on to discuss the need for personalisation and an individualisation of instruction and assessment, mediated through technology, if we are going to learn outside the typical classroom.
“They must be able to account for what students’ misconceptions mean – why does the student choose the wrong answer. Robot tutors need to assess how the student works to solve a problem, not simply assess whether they have the correct answer. They need to provide feedback along the way. They have to be able to customize educational materials for different student populations – that means they have to be able to have a model for understanding what “different student populations” might look like and how their knowledge and their learning might differ from others. Robot tutors have to not just understand the subject at hand but they have to understand how to teach it too; they have to have a model for “good pedagogy” and be able to adjust that to suit individual students’ preferences and aptitudes. If a student asked a question, a robot would have to understand that and provide an appropriate response. All this (and more) has to be packaged in a user interface that is comprehensible and that doesn’t itself function as a roadblock to a student’s progress through the lesson.” (Watters, 2015)
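Watters' contrast is worth making concrete: everything the quote asks of a robot tutor is hard, whereas grading a multiple-choice answer really does need almost no algorithm at all. A minimal sketch in Python (with hypothetical question IDs and answers, not drawn from any real MOOC platform) shows how little is involved:

```python
# A hedged sketch: automated multiple-choice grading is just a
# key lookup and a comparison -- no model of the learner required.
# (Hypothetical question IDs and answers, for illustration only.)

def grade_mcq(answer_key, submission):
    """Return the fraction of questions answered correctly."""
    correct = sum(
        1 for question, right_answer in answer_key.items()
        if submission.get(question) == right_answer
    )
    return correct / len(answer_key)

answer_key = {"q1": "b", "q2": "d", "q3": "a"}
submission = {"q1": "b", "q2": "c", "q3": "a"}

print(grade_mcq(answer_key, submission))  # 2 of 3 correct
```

Notice what the sketch cannot do: it has no idea *why* "q2" was answered wrongly, no model of the pupil, and no feedback beyond a score, which is exactly the gap Watters describes.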
Williamson's paper looks at how 'machine learning' can be used to predict actions, behaviour and attitude: Big Data, algorithms and Learning Analytics try to anticipate and predict how people act, in order to govern education in a way that makes learners amenable to pedagogic intervention (Williamson, 2014, p. 97).
As I've stated in previous posts, there is a high expectation of technology because of sci-fi. Technology and the algorithm may be intelligent, but they do not have a consciousness or an understanding of human tendencies, which can transfer certain information into knowledge. The fear should not lie in them killing us, becoming superior, or making us redundant by taking our jobs. As Audrey Watters implies, "it's that they could limit the possibilities for, the necessities of care and curiosity."
Boyd, D., & Crawford, K. (2012). Critical questions for Big Data. Information, Communication & Society, 15(5), 662-679. DOI: 10.1080/1369118X.2012.678878
Selwyn, N. (2014). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64-82. DOI: 10.1080/17439884.2014.921628
Watters, A. (2015, October 22). The Algorithmic future of education. Retrieved from http://hackeducation.com/2015/10/22/robot-tutors
Williamson, B. (2014). Governing software: Networks, databases and algorithmic power in the digital governance of public education. Learning, Media and Technology, 40(1), 83-105. DOI: 10.1080/17439884.2014.924527