This was in a presentation I saw today. I wish the speaker had expanded on this and explained how all these people are linked to algorithms. #mscedc March 22, 2017 at 04:43PM
I managed to look into this a little after the conference. Something that struck me while looking at the speaker's slide of pictures was that they were all of very old white men, except for Al-Khwarizmi, the Persian polymath. I found a nice timeline with a representation of the history of algorithms. At least here they mentioned Ada Lovelace.
In one of the seminars I attended this week I heard the term physiolytics being bandied about. This is the practice, from what I understand (I haven't been able to find much information on it), of measuring and making sense of data extracted from wearable technology. It is the information from your staff ID card, Fitbit or smartwatch that will be used for this kind of analysis.
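From what I can gather, the analysis itself need not be sophisticated. Here is a minimal sketch in Python of the kind of summary an employer might pull from wearable data; the numbers, the 5,000-step threshold and the idea of flagging 'inactive' days are entirely my own invention, not any real physiolytics product.

```python
# Toy example of "physiolytics": summarising made-up step counts from a
# wearable device. All values and thresholds are invented for illustration.
from statistics import mean

step_log = {"Mon": 8200, "Tue": 3100, "Wed": 9400, "Thu": 2800, "Fri": 7600}

average_steps = mean(step_log.values())
inactive_days = [day for day, steps in step_log.items() if steps < 5000]

print(f"Average daily steps: {average_steps:.0f}")
print(f"Days under the arbitrary 5,000-step mark: {inactive_days}")
```

The point is less the arithmetic than how trivially this kind of profile can be built once the data leaves the device.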
I started this week still stuck on how algorithms work and how they might be seen to influence education, which led me to send out my first tweet asking whether database results could shape research. It was in this vein of thought that I went looking for academic papers that could support what I suspected. There were a lot about bias, but I found an example which I saved on Diigo. This article focused on some of the issues around systematic reviews with regard to database searches. It prompted my thinking on how research could be adversely affected by search results but, more importantly, it highlighted the human element: how important information literacy is to scholarly processes.
It was only during the tweetathon that I finally felt I had joined the party with regard to how data and learning analytics play a role in shaping education, but it was quite difficult making sense of what was going on. I felt I was more engaged than my visible activity suggested.
I pinned a graphic from Pinterest promising that data mining and learning analytics enhance education, which was reminiscent of the instrumentalist discourse (Bayne 2014) we encountered in Block 1.
The TED talk presented how big data can be used to build better schools and transform education by showing governments where to spend their money. It made me realise that, when looked at quite broadly, data can revolutionise education.
Finally, I reflected on the traffic light systems that track and rate students, something I'd like to explore further. Ironically, on the first day of Week 10, while I was playing catch-up in Week 9, I attended some staff training on learning analytics, 'Utilizing Moodle logs and recorded interactions', where I was shown how to analyse quantitative data to monitor students' use and engagement.
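To make that concrete, below is a rough Python sketch of the sort of traffic light rating I have in mind. It assumes a CSV export of Moodle logs with one row per logged interaction and a 'user' column; the column name and the green/amber/red thresholds are my own guesses, not Moodle's actual format.

```python
# Sketch: assign each student a crude engagement "traffic light" from a
# hypothetical CSV export of Moodle logs (one row per interaction).
import csv
from collections import Counter

def traffic_light(interaction_count):
    """Map an activity count to a colour; thresholds are arbitrary."""
    if interaction_count >= 50:
        return "green"
    if interaction_count >= 20:
        return "amber"
    return "red"

def rate_students(log_path):
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["user"]] += 1
    return {user: traffic_light(n) for user, n in counts.items()}

# Usage (hypothetical file): rate_students("moodle_log_export.csv")
# -> {"student_a": "green", "student_b": "red", ...}
```

Seeing how little is behind the colours makes me more, not less, cautious about what they are taken to mean.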
Bayne, S. (2014). What's the matter with 'Technology Enhanced Learning'? Learning, Media and Technology, 40(1), pp. 5-20.
I have been wondering about how algorithms may affect education. At the moment much of my everyday work overlaps with 'Information Literacy' and how to prepare students to engage critically with different kinds of texts. I know from my own studies that finding sources is tedious and requires patience, and if students aren't able to analyse texts critically their work becomes very difficult. It is arduous work finding both primary and secondary texts to support an argument. It has got me thinking about how academic databases present information to those looking for it. I think the consequences of this are perhaps less apparent for those looking for information in subjects based within the social sciences, but could potentially be very harmful for those doing degrees in Medicine.
As an example, I refer back to the late 1990s and Dr Andrew Wakefield's claims that the Measles, Mumps and Rubella vaccine was linked to autism. His theory was discredited, and his assertions and the way in which he conducted his research saw him struck off the medical register. His article has been retracted but is still available to read, so others can learn from it. The consequences of his claims have proved far reaching: there has been an increase in measles cases as many parents chose not to inoculate their children after Wakefield's claims. (I noticed on my last trip to the University of Edinburgh that there was an outbreak recently.) The article in question, Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children, has been cited nearly two and a half thousand times according to Google Scholar. Although I haven't looked at all of the texts citing Wakefield et al (1998), I would wager that many of them disprove his theory and demonstrate what not to do when carrying out medical research. My interest in this particular example is that not all academic articles will be as contentious, and certainly not all subjects will have hard data to assist in providing a critical lens.
The reason I mention this example is that I think it demonstrates how databases only present information; we cannot trust algorithms to sort that information for us. We may not be aware of why certain sources have been cited as often as they have, and those which have been cited a lot haven't always been cited because they are good. The Wakefield example is extreme. There were almost seventy papers published this week in the British Medical Journal alone. It would be impossible to expect doctors to read all new research, just as it is impossible for academics to read everything in their fields. Databases are a key tool in higher education, but it is not often explicit how information is displayed. By relevance? By popularity? By date? By number of citations? Is it clear how this information is being presented? Could the way in which information in databases is being prioritised (Knox 2015) be affecting the way in which research is carried out?
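To illustrate how much the ordering alone matters, here is a toy Python sketch. The records are invented (apart from the roughly 2,500 citations for Wakefield et al. mentioned above), but they show how a retracted paper can float to the top of a results list simply because the database sorts by citation count.

```python
# Toy illustration: the same invented records, ordered two different ways.
papers = [
    {"title": "Wakefield et al. (1998) [retracted]", "citations": 2500, "year": 1998},
    {"title": "Large cohort study finding no MMR-autism link", "citations": 900, "year": 2002},
    {"title": "Recent systematic review", "citations": 40, "year": 2016},
]

by_citations = sorted(papers, key=lambda p: p["citations"], reverse=True)
by_date = sorted(papers, key=lambda p: p["year"], reverse=True)

print([p["title"] for p in by_citations])  # retracted paper comes first
print([p["title"] for p in by_date])       # newest comes first
```

Neither ordering is wrong in itself; the problem is when the reader cannot tell which one they are looking at.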
Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1
Wakefield, A. J., Murch, S. H., Anthony, A., Linnell, J., Casson, D. M., Malik, M., Berelowitz, M., Dhillon, A. P., Thomson, M. A., Harvey, P., Valentine, A., Davies, S. E., Walker-Smith, J. A. (1998). Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet, 351(9103), pp. 637-641.
This paper provided a tangible example of how bias can affect research. This is more clearly evident in the biological sciences than in the social sciences.
I started my week trying to find out what exactly algorithms are. I had a vague understanding that they were part of the code that looks for patterns and then changes the functionality of certain online spaces, usually to do with shopping and social media. I've mostly come across them through social media feeds, where influencers are usually advocating for you to turn on notifications about their posts. What surprised me when I started looking for information about how algorithms work was that, almost as often, information on how to manipulate them popped up.
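For my own benefit I tried to sketch the simplest version of that pattern-spotting in Python. Everything here is invented (the interaction counts, the notification bonus, the weights), but it shows in miniature why turning notifications on changes what a feed decides to show you.

```python
# Minimal, made-up feed "algorithm": score posts by how often you have
# interacted with each account, with a bonus when notifications are on.
posts = [
    {"author": "influencer_a", "text": "New post!"},
    {"author": "friend_b", "text": "Holiday photos"},
    {"author": "brand_c", "text": "Buy our thing"},
]
past_interactions = {"influencer_a": 12, "friend_b": 30, "brand_c": 1}
notifications_on = {"influencer_a"}

def score(post):
    bonus = 5 if post["author"] in notifications_on else 0
    return past_interactions.get(post["author"], 0) + bonus

feed = sorted(posts, key=score, reverse=True)
print([p["author"] for p in feed])  # ['friend_b', 'influencer_a', 'brand_c']
```

Real platforms obviously use far more signals than this, but even this toy version makes it clear why so many of the search results I found were about gaming the system.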
I was trying to think about how algorithms may influence education and where they might fall short when I stumbled upon the amazing Joy Buolamwini. She highlighted the real consequences of a lack of diversity in programming and how it can impact technology in ways we do not expect. It was evident from her experience that facial-recognition technology rendered her invisible by failing to read her features. I wonder how many other invisibilities are not yet evident.
We met for our weekly Skype group, and some of the bigger themes emerging from that conversation were about how algorithms are used for control and surveillance. We wondered if this might cause students from certain ethnic or socio-economic backgrounds to be marginalised.
The TED talk on How Algorithms Shape Our World was really insightful about how algorithms interconnect. The 'ecosystem' metaphor Slavin used echoed Active algorithms: Sociomaterial spaces in the E-learning and digital cultures MOOC (Knox 2014).
It was in this vein that I found Hack Education's article about the Algorithmic Future of Education. Watters highlights the marketization of education and how important 'care' is when dealing with students.
I rounded the week off working with Stuart by comparing how algorithms work in different online spaces.
Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1