I’ve spent some time this weekend reading a couple of articles to help me formulate the specific questions I’d like to focus on in the assignment. I was mostly enjoying myself when I started on an article that elicited the reaction you can see in the tweet above. The phrase in the tweet – “a certain performative, post-human, ethico-epistem-ontology” – is pretty much inaccessible, and this is a real bugbear of mine. Thankfully I’ve encountered it only a few times in this course. It took me a while to figure out what the author was getting at with his ethico-epistem-ontology, and when I did I found it wasn’t half as fancy or clever as the language might suggest.

Ideas should challenge, and language should challenge too, but one of the things about good academic writing (obviously something on my mind with the assignment coming up) is the ability to represent and communicate complex, nuanced, difficult ideas in a way that doesn’t throw up a huge great wall. There are times when that huge barrier is instrumental to the argument, I suppose: I remember reading Derrida…*

Yet if the aforementioned ‘challenge’ is located as much in the discrete individual words used as in the premises of the argument (assuming, of course, that the two can be separated), then what does that mean for the locus of academic literacy? And what does it mean for openness? The trend toward open access and open data, though fraught with issues around policy and the ways technology is implicated, is generally positive. But is a representation of ideas like this even vaguely ‘open’ in anything but a literal sense?

Anyway, this is a total aside, and I’ll bring an end to the rant. Authentic content for the lifestream, I think 🙂

*OK, I mainly looked at the words and panicked internally

What I’m reading


March 18, 2017 at 04:48PM

I included this because it so strongly chimed with what I was thinking about student profiling – in particular, the highlighted bit reflects my experience of working with young people in HE, and the dangers (in my opinion) of treating any people, but particularly young people, as linear, as models of themselves, or as incapable of following unpredictable paths.

It’s from here, by the way:

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968.

[Edit on Sunday 19th March: it’s also, I notice very much in retrospect, an attempt to use the lifestream to model how I study, how I make notes, how I identify comments and other thoughts. There’s another example here. I didn’t really realise I was doing this.]

What I’m reading

Ethics and learning analytics: a short reading list

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968.
Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: ‘The Fitbit Version of the Learning World’. Frontiers in Psychology, 7.
Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159.
Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955.
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from
Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400.
West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903–922.

March 18, 2017 at 10:58AM

What I’m reading

Initial thoughts on the JISC report on learning analytics

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from


Summary statement:


The executive summary identifies four areas in which learning analytics might be used.


1. “As a tool for quality assurance and quality improvement” – LA as diagnostic tool at both individual and systemic level; demonstrating compliance with new quality assurance arrangements.

2. “As a tool for boosting retention rates” – with institutions using analytics to identify at-risk students, and intervening.

3. “As a tool for assessing and acting upon differential outcomes among the student population” – engagement and progress of e.g. BME students, students from low participation areas.

4. “As an enabler for the development and introduction of adaptive learning” – personalised learning delivered at scale.

Interested in the instrumentalist approach here: the framing of “as a tool” and “as an enabler” seems to make this inescapable. It needs – I would say – more recognition of the fact that the platforms, data sources and infrastructures are socio-technical: informed, at the very least, by the humans who created them. Who analyses the learning analytics? What would a posthuman analysis of learning analytics look like?

Some other interesting points made in the Introduction:

There is an imperative for universities to obtain value from the rich data sources they are building up about their learners: information known about a person in advance of their application, data accumulated about educational progress; learners likely to withdraw can be identified (p. 12).

This is fair enough, but by heck there’s the potential for a lot of inequality and leveraging of privilege here. Judging students by their past closes down room for development, especially among young people for whom university may be a ‘fresh start’. There are also issues around the linearity of the university experience, around who (or what) defines what ‘progress’ looks like, and around the fact that new university students are ‘starting’ from different places. Achievement at university level may be a line to reach (1st class, 2.i, 2.ii, etc.) but potential is not.
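The linearity worry can be made concrete with a small sketch: a deliberately naive ‘at risk’ rule of the sort such systems imply. Everything here – the feature names, the thresholds – is hypothetical, invented purely for illustration; the point is that a model like this can only re-describe a student’s past.

```python
# A deliberately naive 'at risk' flag of the kind the report implies.
# Both features and both thresholds are hypothetical, for illustration only.
def at_risk(prior_attainment: float, vle_logins_per_week: float) -> bool:
    """Flag a student as 'at risk' using historical/behavioural data alone."""
    return prior_attainment < 55.0 or vle_logins_per_week < 2.0

# A student with weaker pre-university grades is flagged regardless of how
# well their 'fresh start' is actually going:
print(at_risk(50.0, 10.0))   # flagged on pre-university data alone
```

However sophisticated the real models are, the structural point holds: ‘progress’ is whatever the chosen features can see, and potential isn’t a feature.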

Learning analytics can furnish teachers with information on the quality of educational content and activities they are providing, and on teaching and assessment processes.

Identifying a problem is great and useful, but solving that problem is even more important. Can learning analytics help here? It also suggests that the quality of educational content and activities is fundamentally based on the – what, ability? – of the teacher, rather than the institutional pressures that teacher is under: things like the TEF disrupting good practices, austerity, the socio-economic climate, funding, university overcrowding, lack of resources, etc. It seems perhaps a little too good to be true.

Benefits for learners include giving students better information on how they are progressing and what they need to do to meet their educational goals, which has the potential to transform learning and their understanding of how they learn by providing continual formative feedback.

That is, unless some of the things students are doing – and possibly not doing well – are not trackable. How well a student takes notes while reading, for example, is a pretty big deal for my students. How will formative feedback be provided there? Tyranny of assessment, tyranny of the evidence base. And, if automated, couldn’t this be demotivating?

 Adaptive learning systems are emerging to help students develop skills and knowledge in a more personalised way; “set to revolutionise the teaching of basic skills and the provision of educational content” (p. 13).

Basic skills? Such as…? I don’t know enough about STEM to know whether that would work there, but within HASS I can’t think of many basic skills this could help with. Critical thinking, synthesising diverse opinions, forming an argument, developing a critical voice, clarity of expression – would these count as ‘basic skills’? It feels like quite a narrow view of what HE learning might be; in my experience, both as a student and as a librarian, it isn’t just box-ticking.


March 17, 2017 at 08:47AM

What I’m reading

Facebook and Visibility

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.

Ideas around visibility and agency are taken up by Bucher in an article about EdgeRank. Using notions of disciplinary power espoused by Foucault in his Panopticon, Bucher draws three conclusions to link visibility to discipline and power:
1. Disciplinary power is understood by Foucault to be both constraining and enabling, allowing subjects to reach their “full potentiality as useful individuals” (Foucault, 1991, p. 212). On Facebook, a useful individual is one who participates, communicates and interacts, argues Bucher. The ‘punishment’ for not doing so, then, is invisibility.

2. Given the content it pushes to the top of news feeds, EdgeRank makes it appear as though everyone is commenting on and liking things, providing an incentive for the individual to join in too. Disciplinary power judges according to what is considered normative; Facebook makes it seem as though participation is the norm.

3. Popularity on Facebook increases visibility, which feeds further popularity – a cycle which, Bucher convincingly argues, “runs counter to […] discourse that focuses on democratization and empowerment” (p. 1176).

So in summary: the punishments for non-participation, the norms and behaviours which are privileged, and the cycles of interaction – might these be ways in which we can uncover hidden algorithms?
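Bucher’s three points map onto the shape of the algorithm itself. EdgeRank was publicly described as summing affinity × weight × time decay over a story’s interactions; the toy version below (the exponential decay function and every number in it are my own invention, not Facebook’s) shows how participation buys visibility:

```python
import math

def edgerank_score(edges):
    """Sum, over a story's edges (likes, comments, shares):
    user affinity x edge weight x time decay.
    Mirrors the formula publicly attributed to EdgeRank;
    the decay curve here is an assumption for illustration."""
    return sum(affinity * weight * math.exp(-age_hours / 24.0)
               for affinity, weight, age_hours in edges)

# A story with fresh comments from close contacts outranks an older,
# barely-interacted-with one: visibility rewards participation.
busy = edgerank_score([(0.9, 2.0, 1.0), (0.8, 2.0, 3.0)])
quiet = edgerank_score([(0.2, 1.0, 48.0)])
print(busy > quiet)
```

The feedback loop in Bucher’s third point is visible in the structure: a visible story attracts more edges, and every new edge raises the score again.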


Foucault, M. (1991). Discipline and punish: the birth of the prison (Reprint). London: Penguin Books.

March 11, 2017 at 03:26PM

What I’m reading

Algorithms in library catalogue results

Vaughan, J. (2011). Chapter 5: Ex Libris Primo Central. Library Technology Reports, 47(1), 39–47.


March 10, 2017 at 07:31AM

Included because – from my perspective as an academic librarian – the way in which library catalogues (or discovery layers) order results is absolutely crucial. I have a lot of anecdotal evidence to suggest that if something isn’t in the first few results, students will assume we don’t have it; a decent relevancy ranking has a genuine impact on students’ ability to research, with clear implications for their learning.
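To make the stakes concrete, here is a toy tf-idf ranker of the principle discovery layers elaborate on. Real systems (Primo included) use far richer signals, so this is only a sketch, and the tiny catalogue records are invented:

```python
import math

def relevance(query_terms, doc_terms, corpus):
    """Toy tf-idf score: term frequency in the record, weighted by how
    rare the term is across the whole catalogue."""
    n = len(corpus)
    score = 0.0
    for term in query_terms:
        tf = doc_terms.count(term)                    # frequency in this record
        df = sum(1 for doc in corpus if term in doc)  # records containing term
        if tf and df:
            score += tf * math.log(n / df)
    return score

# Three tiny catalogue records, reduced to bags of title words:
catalogue = [
    ["learning", "analytics", "ethics"],
    ["library", "discovery", "systems"],
    ["learning", "media", "technology"],
]
query = ["learning", "analytics"]
ranked = sorted(catalogue, key=lambda doc: relevance(query, doc, catalogue),
                reverse=True)
print(ranked[0])  # the learning-analytics record surfaces first
```

Whether the record a student needs appears at position 1 or position 12 comes down to weightings like these – which is exactly why the ordering logic deserves scrutiny.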

What I’m reading

Three challenges of big data according to Eynon:

  1. Ethics – privacy, informed consent, protection from harm. Example of student registration: the social implications of telling students whether they are likely to drop out (according to learning analytics). Might this make it a self-fulfilling prophecy?
  2. Kinds of research – the availability of data biases the types of research we carry out and the questions we can ask. Can advances in open data help with this?
  3. Inequality – how big data reinforces and exacerbates social and educational inequalities, e.g. tracking only those in a specific socio-economic bracket. Digital divide, yes, but doesn’t it also work the other way round – social inequalities mean that some people are better equipped to avoid surveillance via big data?


Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. doi: 10.1080/17439884.2013.771783

March 12, 2017 at 12:16PM

What I’m reading

Week 8 reading

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.
Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240.
Goodreads Rolls Out Book Recommendation Feature. (2011). Library Journal, 136(17), 16–18.
Knox, J. (2015a). Active algorithms: Sociomaterial spaces in the e-learning and digital cultures MOOC. Campus Virtuales, 3(1), 42–55.
Knox, J. (2015b). Critical Education and Digital Cultures. In M. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory (pp. 1–6). Singapore: Springer Singapore.
Vaughan, J. (2011). Ex Libris Primo Central. Library Technology Reports, 47(1), 39–47.

March 12, 2017 at 12:03PM