Comment from Cathy’s blog

Cathy, this is great! Nice work, and an innovative way to question what data is captured – it’s important to balance our interpretation of the meaning of data with what is captured (and what is missing). Thank you!
-Helen

from Comments for Cathy’s Lifestream http://ift.tt/2ohg295
via IFTTT

What I’m reading

At a conference today! #cctl2017

March 23, 2017 at 11:53AM

I attended a Teaching Forum hosted by the Cambridge Centre for Teaching and Learning on Thursday, and this is a photo of some of the notes I took during a presentation by Dr Sonia Ilie on the LEGACY project. Dr Ilie discussed the results of qualitative research into students’ understanding of learning gain. One of her arguments put me in mind of learning analytics.

In case my handwriting isn’t clear, Dr Ilie reported that the research had demonstrated that students are variably equipped to reflect upon their own learning. I wondered – in the bottom comment of the photo – about the impact that learning analytics might have upon this. I’m interested in whether learning analytics might help students to develop critically reflective skills, or whether it might let them off the hook by effectively providing them with a shorthand version of that reflection.

What I’m reading

Note

March 18, 2017 at 04:48PM

I included this because it so strongly chimed with what I was thinking about student profiling – in particular, the highlighted section reflects my experience of working with young people in HE, and the (in my opinion) dangers of treating anyone, but particularly young people, as linear, as models of themselves, or as incapable of taking unpredictable paths.

It’s from here, by the way:

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0

[Edit on Sunday 19th March: it’s also, I notice very much in retrospect, an attempt to use the lifestream to model how I study, how I make notes, how I identify comments and other thoughts. There’s another example here. I didn’t really realise I was doing this.]

What I’m reading

Ethics and learning analytics: a short reading list

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0
Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: ‘The Fitbit Version of the Learning World’. Frontiers in Psychology, 7. http://ift.tt/2mG8Z9H
Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159. http://ift.tt/2mZPL1n
Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. http://ift.tt/2mGk1f5
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m
Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. http://ift.tt/2mG1M9x
West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903–922. http://ift.tt/2n00rgv

March 18, 2017 at 10:58AM

While we’re on podcasts, this one is also brilliant. The episode ‘The Rainbows of Inevitability’ is about propaganda, big data and targeted advertising, and has some interesting things to say about how self-perception can be changed through targeted ads based on browsing habits.

Some quotes:

“it’s become more about the collecting of data and for an advertiser buying placements based on that data than it’s been about marketing”

“that’s what really leads to this change in self-perception, that’s what is resulting in behavioural change beyond just being interested in an ad, but it’s really changing how you see yourself”

from Tumblr http://ift.tt/2mPtC4o
via IFTTT

Brilliant episode of one of my favourite podcasts, Criminal. It’s all about how difficult it is to fake your own death, and there’s a fascinating section on how big data (though it isn’t named as such) can inhibit this. It’s not just a matter of sorting out the legal or practical side of things, but data collected on your hobbies, your patterns of behaviour, etc. can all be used to track you down.

from Tumblr http://ift.tt/2n5a660
via IFTTT

What I’m reading

Initial thoughts on the JISC report on learning analytics

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m.

Summary statement:

The executive summary identifies four areas in which learning analytics might be used.

1. “As a tool for quality assurance and quality improvement” – LA as a diagnostic tool at both the individual and systemic level; demonstrating compliance with new quality assurance arrangements.

2. “As a tool for boosting retention rates” – with institutions using analytics to identify ‘at risk’ students and intervening.

3. “As a tool for assessing and acting upon differential outcomes among the student population” – engagement and progress of e.g. BME students and students from low participation areas.

4. “As an enabler for the development and introduction of adaptive learning” – personalised learning delivered at scale.

Interested in the instrumentalist approach here: “as a tool”, “as an enabler” – this framing seems to make instrumentalism inescapable. Needs – I would say – more recognition of the fact that the platforms, data sources and infrastructures are socio-technical: informed, at the very least, by the humans who created them. Who analyses the learning analytics? What would a posthuman analysis of learning analytics look like?

Some other interesting points made in the Introduction:

There is an imperative for universities to obtain value from the rich data sources they are building up about their learners: information is known about a person in advance of their application, data accumulates about educational progress, and learners likely to withdraw can be identified (p. 12).

This is fair enough, but by heck there’s the potential for a lot of inequality and leveraging of privilege here. Being judged by their past absolutely inhibits students’ room for development, especially among young people for whom university may be a ‘fresh start’. There are also issues around the linearity of the university experience, around who (or what) defines what ‘progress’ looks like, and around the fact that new university students are ‘starting’ from different places. Achievement at university level may be a line to reach (1st class, 2.i, 2.ii, etc.), but potential is not.

Learning analytics can furnish teachers with information on the quality of educational content and activities they are providing, and on teaching and assessment processes.

Identifying a problem is great and useful, but solving that problem is even more important. Can learning analytics help here? It also suggests that the quality of educational content and activities is fundamentally based on the – what, ability? – of the teacher, rather than the institutional pressures that teacher is under: the TEF disrupting good practices, austerity, the socio-economic climate, funding, university overcrowding, lack of resources, etc. It seems perhaps a little too good to be true.

Benefits for learners include giving students better information on how they are progressing and what they need to do to meet their educational goals, which has the potential to transform learning and their understanding of how they learn by providing continual formative feedback.

That is, unless some of the things the students are doing – and possibly not doing well – are not trackable. For example, how well a student takes notes while reading is, for my students, a pretty big deal. How will formative feedback be provided there? Tyranny of assessment, tyranny of the evidence base. And, if automated, couldn’t this be demotivating?

Adaptive learning systems are emerging to help students develop skills and knowledge in a more personalised way, and are “set to revolutionise the teaching of basic skills and the provision of educational content” (p. 13).

Basic skills? Such as… I don’t know enough about STEM to know whether it would work there, but within HASS I can’t think of many basic skills this could help with. Critical thinking, synthesising diverse opinions, forming an argument, developing a critical voice, clarity of expression – would these count as ‘basic skills’? It feels like quite a narrow view of what HE learning might be; in my experience, both as a student and as a librarian, it isn’t just box-ticking.

March 17, 2017 at 08:47AM

Tweets

It’s true, I was really chuffed there was a lecture.

It’s not because of nostalgia – I was never that keen on lectures when I was an undergraduate. It’s not about how I learn best, either – I pick up ideas best through reading and synthesis, and I think I’m generally all right at self-directing my learning.

Instead, there’s something in the combination of passivity and activity in listening to a lecture for a change. Especially because the lecture slides were illustrative rather than instructive. It was just a change of pace – and therein lies one of the benefits of multimodality.