Audio Week 9 weekly thoughts
The school inspection has taken place. Some amassing of data was required, but most of the inspection was conducted by humans interacting with each other in the real world. How long this will remain the case is open to question if our study and discussions about learning analytics this week herald the beginning of an inevitable phenomenon. Inspections in the future might be done remotely, with officials tapping into the school’s metrics, viewing dashboards and delving into detailed individual student action plans, predictions and prescriptions carefully compiled by the code. Even the psychological temperature of the pupils will be available remotely in real time.
I have swithered all week between a reactionary distrust of learning analytics – a concept of learning by numbers, and an ambition to instantiate a quantified student measured against coherently mapped knowledge domains – and an acknowledgment of the importance of research, along with a creeping suspicion that some of it might actually be useful. I confess, too, that my happiness and motivation indicators do nudge up a little each time an automated comment on my lifestream applauds me for a great post.
I have bundled up all my LA thoughts into one post (not such a heavy call on the algorithmic burden), although I sprinkled a few little comments on infographics, LA reports and modelling the student elsewhere as well as starting to contribute to Dan’s Milanote. I started my tweetorial tweets a bit early with a question which, for me, still hangs in the air.
I feel MOOCs on behaviourism and neuroscience coming on 🙂
‘Even the psychological temperature of the pupils will be available remotely in real time.’
Indeed. At the LAK17 conference (http://lak17.solaresearch.org), which I attended, one of the keynotes outlined a large-scale research project that used various sensors and 3D cameras to measure and record many aspects of a classroom, including body temperature!
I think a suspicion that learning analytics might be useful is a good thing! I certainly think that it can be, if designed and deployed creatively and sensitively. How could one make analytics in a way that promoted excess rather than efficiency?
On the question of ‘how can we measure learning when we don’t know when or how it happens’ … yes, it’s a good question! I’m not sure it is a question that is really lost on the analytics community, though. Analytics, as a field, isn’t really under the impression that it *is* measuring learning; rather, it’s concerned with finding good *proxies* for it. There is a difference between causation and correlation, and analytics is certainly aware of it. However, do we need to know what ‘learning’ is before we can identify a good proxy for it?
‘How could one make analytics in a way that promoted excess rather than efficiency?’
I’m not sure what you mean by this, but I guess it means the generation of much rich, detailed data that is not so easily reduced to dashboard depictions?
I don’t think we need to know what learning is to identify a proxy, because I think some of our scientific discoveries have come about by exploring the shadowy traces of a thing that betray its existence. This makes me think of Stéphane Mallarmé, whose dream was to avoid naming an object, but just to suggest it – so more symbolism than proxy. If we were certain of what learning *is* and were able to name and locate it, perhaps that would be the reducing of it, more so than the reduced picture we criticise Learning Analytics for producing.