What I’m reading

At a conference today! #cctl2017

March 23, 2017 at 11:53AM

I attended a Teaching Forum hosted by the Cambridge Centre for Teaching and Learning on Thursday, and this is a photo of some of the notes I took during a presentation by Dr Sonia Ilie on the LEGACY project. Dr Ilie discussed the results of a piece of qualitative research on students’ understanding of learning gain. One of her arguments put me in mind of learning analytics.

In case my handwriting isn’t clear, Dr Ilie reported that the research had demonstrated that students are variably equipped to reflect upon their own learning. I wondered – in the bottom comment of the photo – about the impact that learning analytics might have upon this. I’m interested in whether learning analytics might help students to develop critically reflective skills, or whether it might let them off the hook by effectively providing them with a shorthand version of that reflection.

What I’m reading

Note

March 18, 2017 at 04:48PM

I included this because it chimed so strongly with what I was thinking about student profiling – in particular, the highlighted section reflects my experience of working with young people in HE, and the (in my opinion) dangers of treating any people, but particularly young people, as linear, as models of themselves, or as incapable of taking unpredictable paths.

It’s from here, by the way:

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0

[Edit on Sunday 19th March: it’s also, I notice very much retrospectively, an attempt on my part to use the lifestream to model how I study, how I make notes, how I identify comments and other thoughts. There’s another example here. I didn’t really realise I was doing this.]

What I’m reading

Ethics and learning analytics: a short reading list

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. http://ift.tt/2mZMJdo
Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student Attitudes toward Learning Analytics in Higher Education: ‘The Fitbit Version of the Learning World’. Frontiers in Psychology, 7. http://ift.tt/2mG8Z9H
Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159. http://ift.tt/2mZPL1n
Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. http://ift.tt/2mGk1f5
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m
Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. http://ift.tt/2mG1M9x
West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903–922. http://ift.tt/2n00rgv

March 18, 2017 at 10:58AM

What I’m reading

Initial thoughts on the JISC report on learning analytics

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m

  

Summary statement:

The executive summary identifies four areas in which learning analytics might be used.

 

1. “As a tool for quality assurance and quality improvement” – LA as a diagnostic tool at both the individual and the systemic level; demonstrating compliance with new quality assurance arrangements.

2. “As a tool for boosting retention rates” – with institutions using analytics to identify at-risk students and intervening (see the toy sketch after this list).

3. “As a tool for assessing and acting upon differential outcomes among the student population” – the engagement and progress of e.g. BME students and students from low-participation areas.

4. “As an enabler for the development and introduction of adaptive learning” – personalised learning delivered at scale.
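
The report doesn’t say how this identification would actually work. Purely as a thought experiment, here is a minimal sketch of the kind of crude flagging logic I imagine sitting underneath such systems – every feature and threshold below is my own invention, not anything taken from the report:

```python
# A toy sketch of 'at risk' flagging. The engagement features and thresholds
# are entirely invented for illustration; the JISC report describes no model.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    vle_logins_per_week: float   # average logins to the VLE
    library_loans: int           # items borrowed this term
    assessments_submitted: int
    assessments_due: int

def at_risk(a: StudentActivity) -> bool:
    """Flag a student when two or more crude engagement signals are weak."""
    submission_rate = (a.assessments_submitted / a.assessments_due
                       if a.assessments_due else 1.0)
    weak_signals = [
        a.vle_logins_per_week < 1.0,  # someone decided this means 'disengaged'
        a.library_loans == 0,
        submission_rate < 0.5,
    ]
    return sum(weak_signals) >= 2

print(at_risk(StudentActivity(0.5, 0, 1, 4)))  # True: flagged for intervention
```

Even this toy version makes the point I come back to below: somebody chooses the signals and the thresholds, and those choices encode judgments about what ‘engagement’ looks like.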

I’m interested in the instrumentalist approach here: the framing of “as a tool” and “as an enabler” seems to make it inescapable. It needs – I would say – more recognition of the fact that the platforms, data sources and infrastructures are socio-technical: informed, at the very least, by the humans who created them. Who analyses the learning analytics? What would a posthuman analysis of learning analytics look like?

Some other interesting points made in the Introduction:

There is an imperative for universities to obtain value from the rich data sources they are building up about their learners: information known about a person in advance of their application, data accumulated about their educational progress, and so on. Learners likely to withdraw can be identified (p. 12).

This is fair enough, but by heck there’s the potential for a lot of inequality and leveraging of privilege here. Students being judged on their past absolutely limits their room for development, especially among young people for whom university may be a ‘fresh start’. There are also issues around the linearity of the university experience, around who (or what) defines what ‘progress’ looks like, and around the fact that new university students are ‘starting’ from different places. Achievement at university level may be a line to reach (1st class, 2.i, 2.ii, etc.) but potential is not.

Learning analytics can furnish teachers with information on the quality of educational content and activities they are providing, and on teaching and assessment processes.

Identifying a problem is great and useful, but solving that problem is even more important. Can learning analytics help here? This also suggests that the quality of educational content and activities is fundamentally down to the – what, ability? – of the teacher, rather than the institutional pressures that teacher is under: things like the TEF disrupting good practices, austerity, the socio-economic climate, funding, university overcrowding, lack of resources, etc. It seems perhaps a little too good to be true.

Benefits for learners include giving students better information on how they are progressing and what they need to do to meet their educational goals, which has the potential to transform learning and their understanding of how they learn by providing continual formative feedback.

That is, unless some of the things the students are doing – and possibly not doing well – simply aren’t trackable. How well a student takes notes while reading, for example, is a pretty big deal for my students; how will formative feedback be provided there? The tyranny of assessment, the tyranny of the evidence base. And, if automated, couldn’t this be demotivating?

Adaptive learning systems are emerging to help students develop skills and knowledge in a more personalised way; they are “set to revolutionise the teaching of basic skills and the provision of educational content” (p. 13).

Basic skills? Such as…? I don’t know enough about STEM to know whether that would work there, but within HASS I can’t think of many basic skills that this could help with. Critical thinking, synthesising diverse opinions, forming an argument, developing a critical voice, clarity of expression – would these be ‘basic skills’? It feels like quite a narrow view of what HE learning might be; in my experience, both as a student and as a librarian, it isn’t just box-ticking.

March 17, 2017 at 08:47AM

What I’m reading

Facebook and Visibility

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. http://ift.tt/2ngfW1X

Ideas around visibility and agency are taken up by Bucher in an article about EdgeRank. Using the notions of disciplinary power that Foucault developed in his discussion of the Panopticon, Bucher draws three conclusions linking visibility to discipline and power:
 
1. Disciplinary power is understood by Foucault to be both constraining and enabling, allowing subjects to reach their “full potentiality as useful individuals” (Foucault, 1991, p. 212). On Facebook, Bucher argues, a useful individual is one who participates, communicates and interacts. The ‘punishment’, then, for not doing so is invisibility.

2. Given the content it pushes to the top of news feeds, EdgeRank makes it appear as though everyone is commenting on and liking things, providing an incentive for the individual to join in too. Disciplinary power judges according to what is considered normative; Facebook makes it seem as though participation is the norm.

3. Popularity on Facebook increases visibility, which feeds further popularity – a reinforcing cycle which, Bucher convincingly argues, “runs counter to […] discourse that focuses on democratization and empowerment” (p. 1176).
 
So, in summary: the punishments for non-participation, the norms and behaviours which are privileged, and the self-reinforcing cycles of interaction – might these be the ways in which we can uncover hidden algorithms? (I’ve tried to make that third point concrete in the toy sketch below.)
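
The real EdgeRank formula was never made public: affinity, edge weight and time decay are the components Bucher discusses, but the function and numbers below are entirely my own guesswork, just to see the visibility loop in motion:

```python
# A toy EdgeRank-style score. Affinity, weight and time decay are the
# components Bucher describes; the formula and constants are invented.
import math

def edge_score(affinity: float, weight: float, age_hours: float) -> float:
    """Score a single interaction ('edge') between a viewer and an item."""
    return affinity * weight * math.exp(-0.05 * age_hours)  # old edges fade

def visibility(edges: list[tuple[float, float, float]]) -> float:
    """An item's news-feed rank is the sum of its edge scores."""
    return sum(edge_score(a, w, t) for a, w, t in edges)

# The loop in point 3: visibility attracts likes, and each like is a new
# high-weight edge, so the item becomes more visible still.
post = [(0.8, 1.0, 2.0), (0.5, 2.0, 1.0)]  # (affinity, weight, age_hours)
print(visibility(post))
post.append((0.6, 2.0, 0.5))               # a fresh like, prompted by visibility
print(visibility(post))                    # higher score: the rich get richer
```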

References

Foucault, M. (1991). Discipline and punish: the birth of the prison (Reprint). London: Penguin Books.

March 11, 2017 at 03:26PM

What I’m reading

Algorithms in library catalogue results

Vaughan, J. (2011). Chapter 5: Ex Libris Primo Central. Library Technology Reports, 47(1), 39–47.

 

March 10, 2017 at 07:31AM

Included because – from my perspective as an academic librarian – the way in which library catalogues (or discovery layers) order results is absolutely crucial. I have a lot of anecdotal evidence to suggest that if an item isn’t in the first few results, students will assume we don’t have it; a decent relevancy ranking has a genuine impact on students’ ability to research, with clear implications for their learning.
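
Primo’s actual relevancy ranking is proprietary, but to show myself what is at stake in those first few results, here is the crudest possible sketch of a relevancy ranking – plain term counting, nothing like the field weighting and boosting a real discovery layer applies:

```python
# The crudest possible relevancy ranking: count query-term occurrences in
# each record and sort. Real discovery layers such as Primo weight fields
# (title vs. abstract), apply boosts and more; this is only an illustration.

def score(query: str, record: str) -> int:
    """Count occurrences of each query term in the record text."""
    text = record.lower()
    return sum(text.count(term) for term in query.lower().split())

def rank(query: str, records: list[str]) -> list[str]:
    """Order records by descending score; ties keep catalogue order."""
    return sorted(records, key=lambda r: score(query, r), reverse=True)

catalogue = [
    "Discipline and punish: the birth of the prison / Foucault",
    "Want to be on the top? Algorithmic power ... / Bucher",
    "Learning analytics in higher education / Sclater, Peasgood & Mullan",
]
print(rank("learning analytics", catalogue))  # the Sclater record comes first
```

Everything a student sees – or never sees – in those first few results hangs on ordering choices like these.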

What I’m reading

Three challenges of big data according to Eynon:

  1. Ethics – privacy, informed consent, protection from harm. Example of student registration: the social implications of telling students if they are likely to drop out (according to learning analytics). Does this make it a self-fulfilling prophecy?
  2. Kinds of research – the availability of data biases the types of research we carry out and the questions we can ask. Can advances in open data help with this?
  3. Inequality – how big data reinforces and exacerbates social and educational inequalities, e.g. tracking only those in a specific socio-economic bracket. Digital divide, yes, but doesn’t it also work the other way round – social inequalities mean that some people are better equipped to avoid surveillance via big data?

 

Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. https://doi.org/10.1080/17439884.2013.771783

March 12, 2017 at 12:16PM

What I’m reading

Week 8 reading

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. http://ift.tt/2ngfW1X
Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. http://ift.tt/2mzKLQW
Goodreads Rolls Out Book Recommendation Feature. (2011). Library Journal, 136(17), 16–18.
Knox, J. (2015a). Active algorithms: Sociomaterial spaces in the e-learning and digital cultures MOOC. Campus Virtuales, 3(1), 42–55.
Knox, J. (2015b). Critical Education and Digital Cultures. In M. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory (pp. 1–6). Singapore: Springer Singapore. http://ift.tt/2mfrEZI
Vaughan, J. (2011). Ex Libris Primo Central. Library Technology Reports, 47(1), 39–47.

March 12, 2017 at 12:03PM