Bookmark! Code acts

From University of Stirling code acts blog https://codeactsineducation.wordpress.com/

New technologies of psychological surveillance, affective computing, and big data-driven psycho-informatics are being developed to conduct new forms of mood-monitoring and psychological experimentation within the classroom, supported by policy agendas that emphasize the emotional aspects of schooling.

This blog post reminded me of a recent feature in the TES about an app that records and measures pupils’ resilience by providing them with a chipped card teachers can scan to record “the desired skill or character trait”. This hegemonic practice of imposing judgement on how a student should be or feel, and of measuring their progress towards a prescribed optimum, is at the very least unsettling and, at worst, Orwellian. The emotive computing and psycho-informatics described by Williamson are based on ‘the vision of a transparent human’ and would allow ‘students’ emotions to be data-mined and assessed in real-time for the purposes of continuous, automated school performance measurement’. The very stuff of nightmares.


Pinterest! LARC

Just Pinned to Education and Digital Cultures: The Learning Analytics Report Card (LARC) project asks: ‘How can University teaching teams develop critical and participatory approaches to educational data analysis?’ It seeks to develop ways of involving students as research partners and active participants in their own data collection and analysis, as well as to foster critical understanding of the use of computational analysis in education. The project works with students on specific courses within the Masters in Digital Education. http://ift.tt/2mccV5K

Despite my default viewpoint being negative and dystopic (!), this looks like a really interesting approach to Learning Analytics, although I might have preferred it if students could choose what was tracked, not just what was displayed in their report. Perhaps allowing that would mean too little student data was captured for the research to be meaningful.

As well as having their data tracked, individuals could be asked to supply contextual detail to supplement the algorithm’s understanding of them.