Category Archives: Lifestream

Pinterest! Dispositif

Just Pinned to Education and Digital Cultures: Foucault’s dispositif

What is the dispositif?

I pinned this because I found it serendipitously whilst looking for the word diapositif, or negative (an old-fashioned film slide), for a short post I was writing on Jeremy’s Abstracting Learning Analytics blog. Foucault’s dispositif seems to sum up the complex interrelation of elements making up the mechanism or apparatus behind Learning Analytics:

From Wikipedia

This requires some reading before I can properly decide if it’s relevant.

Bookmark! A thousand hours

Thinking about quantifying the learning student made me reflect on time on task and whether, if an accurate measurement can be taken, more time on task would correlate with greater success (with the usual caveat about defining success). I don’t think this is always a foregone conclusion, although mastery of a subject or skill is often characterised by the amount of time spent engaged in it. Time, here, is the amassed number of hours, days or years needed to become a pianist, a professor or a potter. Is it possible to draw similar correlations for creativity? Pinheiro and Cruz (2014) itemise a series of tests to measure creativity but suggest

that the phenomenon of creativity cannot be described by any of these tests alone, but only through a battery of joint measures

Mapping Creativity: Creativity Measurements Network Analysis


from Diigo

Instagram! Google ads and extreme content

From BBC News at

From the Guardian at

via Instagram

These news items emphasise what happens when algorithms step outside our control, or when they simply aren’t sophisticated enough to do what we want at scale and human input is required.

They also say something about the sort of responsibility we are prepared to take for the tasks and code we create.

Evernote! Week 9 weekly thoughts

Audio Week 9 weekly thoughts

The school inspection has taken place. Some data amassing was required, but most of it was conducted by humans interacting with each other in the real world. How long this will remain the case is open to question if our study and discussions about learning analytics this week hail the beginning of an inevitable phenomenon. Inspections in the future might be done remotely, with officials tapping into the school’s metrics, viewing dashboards and delving into detailed individual student action plans, predictions and prescriptions carefully compiled by the code. Even the psychological temperature of the pupils will be available remotely in real time.

I have swithered all week between a reactionary distrust of learning analytics – a concept of learning by numbers, with an ambition to instantiate a quantified student measured against coherently mapped knowledge domains – and an acknowledgment of the importance of research, along with a creeping suspicion that some of it might actually be useful. I confess, too, that my happiness and motivation indicators do nudge up a little each time an automated comment on my lifestream applauds me for a great post.

I have bundled up all my LA thoughts into one post (not such a heavy call on the algorithmic burden), although I sprinkled a few little comments on infographics, LA reports and modelling the student elsewhere as well as starting to contribute to Dan’s Milanote. I started my tweetorial tweets a bit early with a question which, for me, still hangs in the air.

I feel MOOCs on behaviourism and neuroscience coming on 🙂

Bookmark! Code acts

From University of Stirling code acts blog

New technologies of psychological surveillance, affective computing, and big data-driven psycho-informatics are being developed to conduct new forms of mood-monitoring and psychological experimentation within the classroom, supported by policy agendas that emphasize the emotional aspects of schooling.

This blog post reminded me of a recent feature in the TES about an app which records and measures pupils’ resilience by providing them with a chipped card teachers can scan to record “the desired skill or character trait”. This hegemonic practice of imposing judgement on how a student should be or feel, and measuring their progress towards achieving a prescribed optimum, is at the very least unsettling and, at worst, Orwellian. The affective computing and psycho-informatics described by Williamson are based on ‘the vision of a transparent human’ and would allow ‘students’ emotions to be data-mined and assessed in real-time for the purposes of continuous, automated school performance measurement’. The very stuff of nightmares.

from Diigo

Pinterest! LARC

Just Pinned to Education and Digital Cultures: The Learning Analytics Report Card (LARC) project asks: ‘How can University teaching teams develop critical and participatory approaches to educational data analysis?’ It seeks to develop ways of involving students as research partners and active participants in their own data collection and analysis, as well as foster critical understanding of the use of computational analysis in education. The project works with students on specific courses within the Masters in Digital Education.

In spite of my default viewpoint being negative and dystopic (!) this looks like a really interesting approach to Learning Analytics, although I might have preferred it if students could choose what was tracked, not just what was displayed in their report. Perhaps doing so would mean that not enough student data would be captured for the research to be meaningful.

As well as tracking student data, individuals could be asked to supply contextual detail to supplement the algorithm’s understanding of them.