Instagram: The history of algorithms.

How are all these people linked to algorithms?

This was in a presentation I saw today. I wish the speaker had expanded on this and explained how all these people are linked to algorithms. #mscedc March 22, 2017 at 04:43PM

via IFTTT

I managed to look into this a little after the conference. Something that struck me while looking at the speaker's slides was that all the pictures were of very old white men, except for Al-Khwarizmi, who was Arabic. I found a nice timeline representing the history of algorithms. At least here they mentioned Ada Lovelace.


In one of the seminars I attended this week I heard the term physio-lytics being bandied about. This is the practice, from what I understand (I haven't been able to find much information on it), of measuring and making sense of data extracted from wearable technology. It is the information from your staff ID card, Fitbit or smartwatch that would be used for this kind of analysis.
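As a way of picturing what that might involve, here is a tiny, purely hypothetical sketch of the kind of aggregation I imagine physio-lytics doing, using invented records of the sort a staff ID card or fitness tracker could produce:

```python
from collections import defaultdict

# Invented wearable/ID-card records: (person, metric, value)
records = [
    ("staff_01", "steps", 4200),
    ("staff_01", "steps", 6100),
    ("staff_02", "door_swipes", 14),
    ("staff_02", "steps", 2300),
]

# The 'making sense' step: aggregate each person's metrics
totals = defaultdict(lambda: defaultdict(int))
for person, metric, value in records:
    totals[person][metric] += value

for person, metrics in totals.items():
    print(person, dict(metrics))
```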

Red, amber, green: Learning analytics. Week 9

Is there any benefit to rating students’ success?

I started this week still stuck on how algorithms work and how they might be seen to influence education, which led me to send out my first tweet asking whether database results could shape research. It was in this vein of thought that I went looking for academic papers that could support what I suspected. There was a lot about bias, but I found an example which I saved on Diigo. This article focused on some of the issues around systematic reviews with regard to database searches. It prompted my thinking on how research could be adversely affected by search results, but more importantly it highlighted the human element: how important information literacy is for scholarly processes.

It was only during the tweetathon that I finally felt I had joined the party with regard to how data and learning analytics play a role in shaping education, but it was quite difficult to make sense of what was going on. I felt I was more active than I demonstrated.

I pinned a graphic from Pinterest promising that data mining and learning analytics enhance education, which was reminiscent of the instrumentalist discourse (Bayne 2014) we looked at in Block 1.

The TED talk presented how big data can be used to build better schools and transform education by showing governments where to spend their education budgets. It made me realise that, viewed broadly enough, data can revolutionise education.

Finally, I reflected on the traffic light systems that track and rate students, something I'd like to explore further. Ironically, on the first day of Week 10, while I was still playing catch-up on Week 9, I attended some staff training on learning analytics, 'Utilizing Moodle logs and recorded interactions', where I was shown how to analyse quantitative data to monitor students' use and engagement.
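To help me think through what such a system actually does with the numbers, here is a minimal sketch (my own invention, with made-up thresholds, not the tool from the training session) of how raw Moodle log counts might be binned into a red/amber/green flag:

```python
# Hypothetical weekly Moodle log counts per student (invented numbers)
activity = {"student_a": 42, "student_b": 11, "student_c": 3}

def traffic_light(log_count, amber_threshold=10, green_threshold=30):
    """Bin a raw engagement count into a red/amber/green rating."""
    if log_count >= green_threshold:
        return "green"
    if log_count >= amber_threshold:
        return "amber"
    return "red"

for student, count in activity.items():
    print(student, traffic_light(count))
```

Seeing how much gets thrown away when a student is reduced to a single colour is part of what makes me want to explore these systems further.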


Bayne, S. (2014). What's the matter with 'Technology Enhanced Learning'? Learning, Media and Technology, 40(1): pp. 5-20.

Can the way in which databases present information affect research?

I have been wondering about how algorithms may affect education. At the moment much of my everyday work overlaps with 'information literacy' and how to prepare students to critically assess and engage with different texts. I know from my own studies that finding sources is tedious and requires patience, and if students aren't able to analyse texts critically their work becomes very difficult. It is arduous work finding both primary and secondary texts to support an argument. It has got me thinking about how academic databases present information to those looking for it. I think the consequences of this are perhaps less apparent for those looking for information in subjects based within the social sciences, but could potentially be very harmful for those doing degrees in Medicine.

As an example, I refer back to the late 1990s and Dr Andrew Wakefield's claims that the measles, mumps and rubella vaccine was linked to autism. His theory was found to be untrue, and his assertions and the way in which he conducted his research saw him struck off the medical register. His article has been retracted but is still available to read, so others can learn from it. The consequences of his claims have proved far reaching: there has been an increase in measles, as many parents chose not to inoculate their children after Wakefield's claims. (I noticed on my last trip to the University of Edinburgh that there was an outbreak recently.) The article in question, Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children, has been cited nearly two and a half thousand times according to Google Scholar. Although I haven't looked at all of the texts citing Wakefield et al. (1998), I would wager that many of them are disproving his theory or demonstrating what not to do when carrying out medical research. My interest in this particular example is that not all academic articles will be as contentious, and certainly not all subjects will have hard data to assist in providing a critical lens.

Google Scholar results for Wakefield.

The reason I mention this example is that I think it demonstrates that databases only present information; we cannot trust algorithms to sort that information for us. We may not be aware of why certain sources have been cited as much as they have, and those which have been cited a lot haven't always been cited because they are good. The Wakefield example is extreme. There were almost seventy papers published this week in the British Medical Journal alone; it would be impossible to expect doctors to read all new research, just as it is impossible for academics to read everything in their fields. Databases are a key tool in higher education, but it is not often explicit how information is displayed. By relevance? By popularity? By date? By number of citations? Is it clear how this information is being presented? Could the way in which information in databases is being prioritised (Knox 2015) be affecting the way in which research is carried out?
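To make the worry concrete, here is a small sketch (with invented papers and scores) of how the same set of search results can be reordered depending on which sort key the database happens to privilege:

```python
# Hypothetical search results: (title, year, citation_count, relevance_score)
results = [
    ("Paper A", 1998, 2400, 0.55),  # heavily cited, but the citations may be critical or corrective
    ("Paper B", 2015, 40, 0.90),
    ("Paper C", 2010, 300, 0.70),
]

by_citations = sorted(results, key=lambda r: r[2], reverse=True)
by_date      = sorted(results, key=lambda r: r[1], reverse=True)
by_relevance = sorted(results, key=lambda r: r[3], reverse=True)

print("Top result by citations:", by_citations[0][0])
print("Top result by date:     ", by_date[0][0])
print("Top result by relevance:", by_relevance[0][0])
```

Three different 'first results' from the same data; unless the database makes its ordering explicit, the researcher never knows which of these choices has been made for them.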


Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1

Wakefield, A., Murch, S. H., Anthony, A., Linnell, J., Casson, D. M., Malik, M., Berelowitz, M., Dhillon, A. P., Thomson, M. A., Harvey, P., Valentine, A., Davies, S. E., Walker-Smith, J. A. (1998). Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet 351(9103): pp. 637-641.

Reporting bias and other biases affecting systematic reviews and meta-analyses: A methodological commentary

Abstract
Systematic reviews and meta-analyses often occupy the top of the hierarchy of evidence in support of evidence-based clinical practice. These studies commonly inform the formulation of clinical guidelines. Bias can intrude at several levels during the conduct of systematic reviews. The effect these various biases, in particular reporting bias, have on pooled estimates and review inferences are potentially significant. In this review, we describe several forms of selection and reporting biases that may occur during the conduct of a systematic review, how these biases might affect a review and what steps could help minimize their influence on review inferences. Specifically, we support calls for prospective international trial registration and open access to trial protocols as two potential solutions that may improve the methodological quality of systematic reviews and the validity of their results. © 2006 Future Drugs Ltd.
from Diigo http://ift.tt/2n7dfSe
via IFTTT

This paper provided a tangible example of how bias can affect research. This is more clearly evident in the biological sciences than in the social sciences.

Co-constructed ecosystems Week 8

ecosystem
Photo: Flickr @giveaphuk

I started my week trying to find out what exactly algorithms are. I had a vague understanding that they are part of the code that looks for patterns and then changes the functionality of certain online spaces, usually to do with shopping and social media. I've mostly come across them through social media feeds, where influencers are usually advocating for you to turn on notifications about their posts. What surprised me when I started looking for information about how algorithms work was that, almost as often, information on how to manipulate them popped up.
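My rough mental model of what such an algorithm does, purely an illustrative guess rather than how any particular platform actually works, is something like scoring posts by the engagement patterns they attract and reordering the feed accordingly:

```python
# Invented feed items: (post_id, likes, comments, notifications_turned_on)
posts = [
    ("post_1", 120, 4, False),
    ("post_2", 15, 30, True),
    ("post_3", 300, 1, False),
]

def score(post):
    """Toy ranking: weight comments and 'notified' followers above raw likes."""
    _, likes, comments, notified = post
    return likes + 5 * comments + (100 if notified else 0)

# The feed you see is the ranked list, not the chronological one
feed = sorted(posts, key=score, reverse=True)
print([post_id for post_id, *_ in feed])
```

If something like this is going on, it would also explain why influencers ask followers to turn notifications on: any extra signal of engagement or closeness helps push a post back up a ranked feed.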

I was trying to think about how algorithms may influence education and where they might fall short when I stumbled upon the amazing Joy Buolamwini. She highlighted the real consequences of how a lack of diversity in programming can impact technology in ways we do not expect. It was evident from her experience that facial recognition technology rendered her invisible by failing to read her features. I wonder how many other invisibilities are not yet evident.

We met for our weekly Skype group, and some of the bigger themes emerging from that conversation were about how algorithms are used for control and surveillance. We wondered if this might cause students from certain ethnic or socio-economic backgrounds to be marginalised.

The TED talk on How algorithms shape our world was really insightful about how algorithms interconnect. The 'ecosystem' metaphor Slavin used echoed Active algorithms: Sociomaterial spaces in the E-learning and digital cultures MOOC (Knox 2014).

It was in this vein that I found Hack Education's article about the Algorithmic Future of Education. Watters highlights the marketisation of education and how important 'care' is when dealing with students.

I rounded the week off working with Stuart by comparing how algorithms work in different online spaces.


Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1