I’ve had the time of my life(stream) – Week 9 summary

There are twin themes to the lifestream this week: the first, a desire to engage critically with the content of the block, emerges through plenty of references to critical texts and articles. Learning analytics, and the omnipresent ethical debate which surrounds it, is a key component. For example, I included Anderson’s brilliant Wired article (mentioned in the lecture we heard this week), a short reading list on learning analytics and ethics, and a JISC report, which I found to be quite uncritical: I included some of the notes I made while reading it, and my responses.

There’s a keen sense, then, of ‘what I’m reading’. The phrase, chosen as a blog title over two months ago when I set up IFTTT to work with Evernote, encapsulates the second of the twin themes: an emerging issue of temporality. This, I think, is present in several ways. With regard to learning analytics and Big Data, temporality incorporates the question of how current our collected data can ever be: methods of data collection and analysis will improve, but this will not necessarily account for the patchiness of past data, upon which decisions will continue to be made. This, for me, problematises our historiographical understanding of how Big Data might help us to become more self-aware, especially with regard to the “timeless quest” to which Cukier refers in his TED talk. But temporality is present too in a personal sense: in my reasons for missing the tweetstorm, which was locked into a specific point in time, but also in a keenly felt fixing of attention on forthcoming assignments.

The themes converge with a slightly more substantive blog post on Big Data, learning analytics and posthumanism which, I think, may be a fruitful topic to consider further in the digital essay.

Big Data, learning analytics, and posthumanism

I’ve now read a few articles assessing the pros and cons of learning analytics and, regardless of the methodologies employed, there are patterns and themes in what is being found. The benefits include institutional efficiency and institutional performance around financial planning and recruitment; for students, the benefits correspond to insights into learning and informed decision-making. These are balanced against the cons: self-fulfilling prophecies concerning at-risk students, the dangers of student profiling, risks to student privacy and questions around data ownership (Roberts et al., 2016; Lawson et al., 2016). This is often contextualised by socio-critical understandings which converge on notions of power and surveillance; some of the methodologies explicitly attempt to counter presumptions made as a result of this, for example, by bringing in the student voice (Roberts et al., 2016).

In reading these articles and studies, I was particularly interested in ideas around student profiling and student labelling, and how this is perceived (or sometimes spun) as a benefit for students. Arguments against student profiling focus on the oversimplification of student learning, students being labelled on the basis of past decisions, and student identity being in a necessary state of flux (Mayer-Schoenberger, 2011). One thing that’s missing in all of this, though, the absence of which I am feeling keenly, is causation. It strikes me that Big Data and learning analytics can tell us what is, but not always why.

A similar observation leads Chandler to assert that Big Data is a kind of Bildungsroman of posthumanism (2015). He argues that Big Data is an epistemological revolution:

“displacing the modernist methodological hegemony of causal analysis and theory displacement” (2015, p. 833).

Chandler is not interested in the pros and cons of Big Data so much as the way in which it changes how knowledge is produced, and how we think about knowledge production. This is an extension of ideas espoused by Anderson, who argues that theoretical models are becoming redundant in a world of Big Data (2008). Similarly, Cukier and Mayer-Schoenberger argue that Big Data:

“represents a move away from trying to understand the deeper reasons behind how the world works to simply learning about an association among phenomena, and using that to get things done” (2013, p. 32).

Big Data aims not at instrumental knowledge, nor causal reasoning, but the revealing of feedback loops. It’s reflexive. And for Chandler, this represents an entirely new epistemological approach for making sense of the world, gaining insights which are ‘born from the data’, rather than planned in advance.

Chandler is interested in the ways in which Big Data can intersect with ideas in international relations and political governance, and many of his ideas are extremely translatable and relevant to higher education institutions. For example, Chandler argues that Big Data reflects political reality (i.e. what is) but it also transforms it through enabling community self-awareness. It allows reflexive problem-solving on the basis of this self-awareness. Similarly, it may be seen that learning analytics allows students to gain understanding of their learning and their progress, possibly in comparison with their peers.

This sounds great, but Chandler contends that it is necessarily accompanied by a warning: it isn’t particularly empowering for those who need social change:

Big Data can assist with the management of what exists […] but it cannot provide more than technical assistance based upon knowing more about what exists in the here and now. The problem is that without causal assumptions it is not possible to formulate effective strategies and responses to problems of social, economic and environmental threats. Big Data does not empower people to change their circumstances but merely to be more aware of them in order to adapt to them (2015, pp. 841–842).

The problem of a lack of understanding of causation is raised in relation to ‘at risk’ students – a student being judged on a series of data points without any (potentially necessary) contextualisation. The focus is on reflexivity and relationality rather than on how or why a situation has come about, and what its impact might be. Roberts et al. found that students were concerned about this: that learning analytics might drive inequality by advantaging only some students (2016). The demotivating nature of the EASI system for ‘at risk’ students is also raised by Lawson et al. (2016, p. 961). Too little consideration is given to the causality of being ‘at risk’, and perhaps too much to essentialism.

His consideration of Big Data and international relations leads Chandler to assert cogently that:

Big Data articulates a properly posthuman ontology of self-governing, autopoietic assemblages of the technological and the social (2015, p. 845).

No one here is necessarily excluded, and all those on the periphery are brought in. Rather paradoxically, this appears to be both the culmination of the socio-material project and an indicator of its necessity. Adopting a posthumanist approach to learning analytics may be a helpful critical standpoint, and is definitely something worth exploring further.


Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. Retrieved 19 March 2017, from https://www.wired.com/2008/06/pb-theory/
Chandler, D. (2015). A World without Causation: Big Data and the Coming of Age of Posthumanism. Millennium, 43(3), 833–851. https://doi.org/10.1177/0305829815576817
Cukier, K., & Mayer-Schoenberger, V. (2013). The Rise of Big Data: How It’s Changing the Way We Think About the World. Foreign Affairs, 92(3), 28–40.
Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0
Mayer-Schoenberger, V. (2011). Delete: The Virtue of Forgetting in the Digital Age. Princeton: Princeton University Press.
Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: ‘The Fitbit Version of the Learning World’. Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.01959

Transcript of “Big data is better data”

Self-driving cars were just the start. What’s the future of big data-driven technology and design? In a thrilling science talk, Kenneth Cukier looks at what’s next for machine learning — and human knowledge.

from Pocket http://ift.tt/2nA1wwV

This is a good (and pithy) talk, but there are two points he makes that I find particularly interesting:

  1. “We have to be the master of this technology, not its servant […] This is a tool, but this is a tool that, unless we’re careful, will burn us”. A patent warning against technological determinism, here, but one which (in my opinion) is not necessarily couched with enough care to help us understand how to avoid a fully instrumentalist approach.
  2. “Humanity can finally learn from the information that it can collect, as part of our timeless quest to understand the world and our place in it”. This accompanies a strong sense of why Big Data is important, but it’s also very essentialist: it’s about reflecting the here and now, rather than attempting to understand the past. I wonder if there are some historiographical problems here, given that Big Data collection is so recent, and still so patchy in places. The ‘timeless quest’, given this, seems to be one which will be answered from a position of privilege: by those who are fortunate enough, paradoxically, to have data collected about them.

What I’m reading


March 18, 2017 at 04:48PM

I included this because it so strongly chimed with what I was thinking about student profiling – in particular, the highlighted bit reflects my experience of working with young people in HE, and the (in my opinion) dangers of treating anyone, but particularly young people, as linear, as models of themselves, or as incapable of following unpredictable paths.

It’s from here, by the way:

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0

[Edit on Sunday 19th March: it’s also, I notice very much retrospectively, an attempt on my part to use the lifestream to model how I study, how I make notes, how I identify comments and other thoughts. There’s another example here. I didn’t really realise I was doing this.]

‘Don’t call us lazy!’ Students call for activity tracking apps to be let loose on the education sector

85% of FE students think an activity tracking app for learning and teaching would be helpful, finds a new survey by Jisc, the digital services and solutions organisation for UK education and research.

from Pocket http://ift.tt/2mTCrdg

I included this because it’s extremely recent, and contains a very different voice and attitude to learning analytics compared with everything else I’ve been reading. The article documents how FE students are almost exclusively positive about learning analytics, in contrast to the HE students documented elsewhere.

What I’m reading

Ethics and learning analytics: a short reading list

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. http://ift.tt/2mZMJdo
Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: ‘The Fitbit Version of the Learning World’. Frontiers in Psychology, 7. http://ift.tt/2mG8Z9H
Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159. http://ift.tt/2mZPL1n
Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. http://ift.tt/2mGk1f5
Sclater, N., Peasgood, A., & Mullan, J. (n.d.). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m
Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. http://ift.tt/2mG1M9x
West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903–922. http://ift.tt/2n00rgv

March 18, 2017 at 10:58AM

I missed the tweetstorm!

But…I always knew I would. It coincided with the last two days of our academic term, so my calendar was full, and long days meant I had no time even to read the tweets, let alone contribute to any of the discussions.

I’ve read it this morning, though, and it looks to have been hugely successful. Lots of tweets, lots of content, and I now find myself more than ever impressed and a little daunted by the talents and skills of my coursemates. But when Jeremy and James release the analytics of the tweetstorm that they’ve been collecting, I won’t be on it.

So here’s my question: how do you analyse absence?

a student who logs into an LMS leaves thousands of data points, including navigation patterns, pauses, reading habits and writing habits (Siemens, 2013, p. 1381)

Well, not only has Siemens never seen the dreadful analytics potential of the VLE we use, but the crucial point is this: “who logs into”. Similar ideas were critically raised in the lecture we heard this week – the idea of using captured learning data from cradle to university, and using this to provide customised experiences. Learning analytics requires ‘logging’, both in the sense of ‘logging in’ and in terms of the data trails left behind. You have to put your name to your activity.

This has significant implications. Thinking about assessment, it invites consideration of assessors’ bias (unconscious or otherwise). There are implications for openness and scale too: it’s probably pretty easy for the EDC website to track our locations, even our IP addresses, but you can never know for sure who is reading the website and who isn’t. You can probably come up with an average ‘time’ it might take to read a page. You can probably track clicks on the links for the readings, but you can’t be sure the readings have actually been read. So there are potentially knock-on effects for the sorts of platforms and media through which teaching can be performed. This relates back to something Jeremy and I discussed a while ago: a sort of tyranny of providing constant evidence for the things that we do, for our engagement with course ideas and course materials. It also smacks of behaviourism which, as Siemens and Long (2011) argue, is not appropriate in HE.
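The gap between what a click log captures and what it means can be sketched with a toy example (purely illustrative – the event structure, names and data here are my own invention, not any real VLE’s schema). A naive analytics routine counts a reading as ‘read’ whenever its link was clicked at all, so a two-second glance and an hour’s study are indistinguishable, and a student who read a printed copy leaves no trace whatsoever:

```python
from dataclasses import dataclass

@dataclass
class ClickEvent:
    """One hypothetical log line: who clicked what, and how long
    elapsed before their next recorded action."""
    user: str
    resource: str
    seconds_on_page: int

def naive_read_report(events, reading_links):
    """Flag each reading as 'read' if its link was clicked at all —
    the kind of proxy a simple analytics dashboard might rely on."""
    return {r: any(e.resource == r for e in events) for r in reading_links}

events = [
    ClickEvent("student_a", "week9_reading.pdf", seconds_on_page=2),     # a glance
    ClickEvent("student_b", "week9_reading.pdf", seconds_on_page=3600),  # an hour
]

report = naive_read_report(events, ["week9_reading.pdf", "week9_lecture_notes.pdf"])
# Both students count identically as having 'read' the PDF, while the
# lecture notes simply show as unengaged — though the absence of a log
# line says nothing about a student who studied a downloaded or printed copy.
```

The point of the sketch is only that the log records participation events, not engagement: everything the dashboard reports is inferred from ‘logging’ in both senses above.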

But it also has implications for the ‘lurkers’ among us: students who may not engage in the ‘prescribed’ way, whether through volition, a poor internet connection, lack of time, or changes in circumstances. How might these people have a personalised learning experience? What data might be collected about them, and how can it incorporate the richness and subjectivity of experience, of happenstance, of humanness?

My question, then, is this: can learning analytics track engagement without framing it entirely within the context of participation, or logging in? Because while these are indicators of engagement, they are not the same thing.


Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review. Retrieved 19 March 2017, from http://er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education