Can the way in which databases present information affect research?

I have been wondering about how algorithms may affect education. At the moment much of my everyday work overlaps with ‘Information Literacy’ and how to prepare students to engage critically with different texts. I know from my own studies that finding sources is tedious and requires patience, and that if students aren’t able to analyse texts critically their work becomes very difficult. It is arduous work trying to find both primary and secondary texts to support an argument. This has got me thinking about how academic databases present information to those looking for it. I think the consequences of this are perhaps less apparent for those looking for information in subjects based within the Social Sciences, but could potentially be very harmful for those doing degrees in Medicine.

As an example, I refer back to the late 1990s and Dr Andrew Wakefield’s claims that the Measles, Mumps and Rubella vaccine was linked to Autism. His theory was found to be untrue, and his assertions and the way in which he conducted his research saw him struck off the medical register. His article has been retracted but is still available to read, so that others can learn from it. The consequences of his claims have proved far-reaching: there has been an increase in measles cases as many parents chose not to inoculate their children after Wakefield’s claims. (I noticed on my last trip to the University of Edinburgh that there was an outbreak recently.) The article in question, ‘Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children’, has been cited nearly two and a half thousand times according to Google Scholar. Although I haven’t looked at all of the texts citing Wakefield et al (1998), I would wager that many of them are disproving his theory and demonstrating what not to do when carrying out medical research. My interest in this particular example is that not all academic articles will be as contentious, and certainly not all subjects will have hard data to assist in providing a critical lens.

Google Scholar results for Wakefield.

The reason I mention this example is that I think it demonstrates that databases only present information; we cannot trust algorithms to sort that information for us. We may not be aware of why certain sources have been cited as much as they have: those which have been cited a lot haven’t always been cited because they are good. The Wakefield example is extreme. Almost seventy papers were published this week in the British Medical Journal alone. It would be impossible to expect doctors to read all new research, just as it is impossible for academics to read all the information in their fields. Databases are a key tool in higher education, but it is not often explicit how information is displayed. By relevance? By popularity? By date? By number of citations? Is it clear how this information is being presented? Could the way in which information in databases is being prioritised (Knox 2015) be affecting the way in which research is carried out?
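To illustrate the point about ordering, here is a toy sketch (the articles and numbers are entirely invented) showing how the very same result set surfaces a different ‘top’ paper depending on which ranking rule the database applies:

```python
# Illustrative only: the same result set, ranked two different ways.
# Titles and figures below are made up for demonstration.
articles = [
    {"title": "Retracted but famous", "year": 1998, "citations": 2450},
    {"title": "Recent replication study", "year": 2016, "citations": 40},
    {"title": "Solid but unfashionable", "year": 2005, "citations": 310},
]

# Rank by citation count (most cited first).
by_citations = sorted(articles, key=lambda a: a["citations"], reverse=True)

# Rank by date (newest first).
by_date = sorted(articles, key=lambda a: a["year"], reverse=True)

# The first result a student sees depends entirely on the ranking rule.
print(by_citations[0]["title"])  # the heavily cited (even retracted) paper
print(by_date[0]["title"])       # the newest paper
```

A heavily cited retracted paper tops the citation-sorted list, while a date-sorted list hides it entirely; neither ordering tells the reader anything about quality.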


Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1

Wakefield, A., Murch, S. H., Anthony, A., Linnell, J., Casson, D. M., Malik, M., Berelowitz, M., Dhillon, A. P., Thomson, M. A., Harvey, P., Valentine, A., Davies, S. E., Walker-Smith, J. A. (1998). Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet 351(9103): pp. 637-641

Reporting bias and other biases affecting systematic reviews and meta-analyses: A methodological commentary

Abstract
Systematic reviews and meta-analyses often occupy the top of the hierarchy of evidence in support of evidence-based clinical practice. These studies commonly inform the formulation of clinical guidelines. Bias can intrude at several levels during the conduct of systematic reviews. The effect these various biases, in particular reporting bias, have on pooled estimates and review inferences are potentially significant. In this review, we describe several forms of selection and reporting biases that may occur during the conduct of a systematic review, how these biases might affect a review and what steps could help minimize their influence on review inferences. Specifically, we support calls for prospective international trial registration and open access to trial protocols as two potential solutions that may improve the methodological quality of systematic reviews and the validity of their results. © 2006 Future Drugs Ltd.
from Diigo http://ift.tt/2n7dfSe
via IFTTT

This paper provided a tangible example of how bias can affect research. This is more clearly evident in the biological sciences than in the social sciences.

Co-constructed ecosystems Week 8

ecosystem
Photo: Flickr @giveaphuk

I started my week trying to find out what exactly algorithms are. I had a vague understanding that they are part of the code that looks for patterns and then changes the functionality of certain online spaces, usually to do with shopping and social media. I’ve mostly come across them through social media feeds, where influencers usually advocate turning notifications on for their posts. What surprised me when I started looking for information about how algorithms work was that information on how to manipulate them popped up almost as often.
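That pattern-seeking behaviour can be sketched very simply. The weighting below is invented, not how any real platform actually ranks a feed, but it shows why posts that already get engagement keep rising, and why influencers want those notifications switched on:

```python
# Toy sketch of an engagement-driven feed: popular, recent posts
# are scored higher, so attention compounds. Weights are invented.
posts = [
    {"author": "influencer", "likes": 900, "age_hours": 5},
    {"author": "friend", "likes": 12, "age_hours": 1},
    {"author": "stranger", "likes": 3, "age_hours": 2},
]

def score(post):
    # Hypothetical rule: engagement divided by age, so posts that are
    # both popular and recent dominate the top of the feed.
    return post["likes"] / (1 + post["age_hours"])

feed = sorted(posts, key=score, reverse=True)
print([p["author"] for p in feed])
```

Even this crude rule reproduces the effect: the already-popular account sits at the top, and the quiet voices sink, regardless of what any post actually says.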

I was trying to think about how algorithms may influence education, and where they might fall short, when I stumbled upon the amazing Joy Buolamwini. She highlighted the real consequences of how a lack of diversity in programming can impact technology in ways we do not expect. It was evident from her experience that technology rendered her invisible by not being able to read her features. I wonder how many other invisibilities are not yet evident.

We met for our weekly Skype group and some of the bigger themes emerging from that conversation were about how algorithms are used for control and surveillance. We wondered if this might cause students from certain ethnic or socio-economic backgrounds to be marginalised.

The TED talk ‘How algorithms shape our world’ was really insightful about how algorithms link together. The ‘ecosystem’ metaphor Slavin used echoed ‘Active algorithms: sociomaterial spaces in the E-learning and Digital Cultures MOOC’ (Knox 2014).

It was in this vein that I found Hack Education’s article about the Algorithmic Future of Education. Watters highlights the marketisation of education and how important ‘care’ is when dealing with students.

I rounded the week off working with Stuart by comparing how algorithms work in different online spaces.



The Algorithmic Future of Education

from Diigo http://ift.tt/1jX3BNt
via IFTTT


This article highlights the problems encountered with teaching methodology based on technological instrumentalism.

It also draws attention to the marketisation of education and how much money is now being spent in venture capital investment in education.

Another aspect it looks at is how a utopian view of technology, like students being able to have their own private tutor in a machine, overlooks the human/emotional side that so strongly accompanies real student-teacher interaction.

How algorithms shape our world

We live in a world run by algorithms, computer programs that make decisions or solve problems for us. In this riveting, funny talk, Kevin Slavin shows how modern algorithms determine stock prices, espionage tactics, even the movies you watch. But, he asks: If we depend on complex algorithms to manage our daily decisions — when do we start to lose control?

from Pocket http://ift.tt/1qh6LJS
via IFTTT

I thought this TED talk was interesting as Slavin suggests that the way humans and algorithms interact is an ‘ecosystem’, a complex interconnected system in which one facet cannot survive in the same way without the others. This supports the idea that ‘spaces…cannot be entirely controlled by teachers, students, or the authors of the software’ (Knox 2014).


References

Knox, J. (2014). Active algorithms: sociomaterial spaces in the E-learning and Digital Cultures MOOC. Campus Virtuales, 3(1): 42-55.