Read read read
April 02, 2017 at 03:22PM
Reading an article that includes this phrase: "a certain performative, post-human, ethico-epistem-ontology" #mscedc pic.twitter.com/JyuWwOoxDC
— Helen Murphy (@lemurph) April 1, 2017
I’ve spent some time this weekend reading a couple of articles to help me to formulate the specific questions I’d like to focus on in the assignment. I was mostly enjoying myself, until I started on an article that elicited the reaction you can see in the tweet above. The phrase in the tweet – “a certain performative, post-human, ethico-epistem-ontology” – is pretty much inaccessible, and this is a real bugbear of mine. Thankfully I’ve encountered it only a few times in this course. It took me a while to figure out what the author was getting at with his ethico-epistem-ontology, and when I did I found that it wasn’t half as fancy or clever as the language used might suggest.
Ideas should challenge, and language should challenge too, but one of the things about good academic writing (obviously something on my mind with the assignment coming up) is the ability to represent and communicate complex, nuanced, difficult ideas in a way that doesn’t throw up a huge great wall. There are times when that huge barrier is instrumental to the argument, I suppose: I remember reading Derrida…*
Yet if the aforementioned ‘challenge’ is located as much in the discrete individual words used as in the premises of the argument (assuming, of course, that the two can be separated), then what does that mean for the locus of academic literacy? And what does it mean for openness? The trend toward open access and open data, despite being fraught with issues around policy, the way technology is implicated, and other things, is generally a positive one. But is a representation of ideas like this even vaguely ‘open’ in anything but a literal sense?
Anyway, this is a total aside, and I’ll bring an end to the rant. Authentic content for the lifestream, I think 🙂
*OK, I mainly looked at the words and panicked internally
Note
March 18, 2017 at 04:48PM
I included this because it so strongly chimed with what I was thinking about student profiling – in particular, the highlighted bit reflects my experience of working with young people in HE, and the (in my opinion) dangers of treating any people, but particularly young people, as linear, as models of themselves, or as unable to follow unpredictable paths.
It’s from here, by the way:
Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957–968. https://doi.org/10.1007/s11423-016-9459-0
[Edit on Sunday 19th March: it’s also, I notice very much retrospectively, an attempt for me to use the lifestream to model how I study, how I make notes, how I identify comments and other thoughts. There’s another example here. I didn’t really realise I was doing this.]
Ethics and learning analytics: a short reading list
March 18, 2017 at 10:58AM
Initial thoughts on the JISC report on learning analytics
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. Retrieved 17 March 2017, from http://ift.tt/1SDGa6m.
Summary statement:
The executive summary identifies four areas in which learning analytics might be used.
1. “As a tool for quality assurance and quality improvement” – LA as a diagnostic tool at both the individual and systemic level; demonstrating compliance with new quality assurance arrangements.
2. “As a tool for boosting retention rates” – with institutions using analytics to identify at-risk students and intervening.
3. “As a tool for assessing and acting upon differential outcomes among the student population” – engagement and progress of e.g. BME students and students from low participation areas.
4. “As an enabler for the development and introduction of adaptive learning” – personalised learning delivered at scale.
Interested in the instrumentalist approach here: the repeated “as a tool”, “as an enabler” framing seems to make this inescapable. Needs – I would say – more recognition of the fact that the platforms, data sources and infrastructures are socio-technical: informed, at the very least, by the humans who created them. Who analyses the learning analytics? What would a posthuman analysis of learning analytics look like?
Some other interesting points made in the Introduction:
There is an imperative for universities to obtain value from the rich data sources they are building up about their learners: information known about a person in advance of their application, together with data accumulated about their educational progress, means that learners likely to withdraw can be identified (p. 12).
This is fair enough, but by heck there’s the potential for a lot of inequality and leveraging of privilege here. Students being judged by their past absolutely inhibits room for development, especially among young people for whom university may be a ‘fresh start’. There are also issues around the linearity of the university experience, who (or what) defines what ‘progress’ looks like, and the fact that new university students are ‘starting’ from different places. Achievement at university level may be a line to reach (1st class, 2.i, 2.ii, etc.), but potential is not.
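To make that worry concrete, here’s a minimal, entirely hypothetical sketch of the kind of rule-based ‘at risk’ flag the report gestures at. The fields, weights and threshold are all my own invention, not anything taken from JISC – but notice how heavily prior attainment ends up baked in:

```python
# Hypothetical 'at risk' flag, invented purely to illustrate the point above.
# None of these fields, weights or thresholds come from the JISC report.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    prior_attainment: float      # normalised entry grades, 0-1 (the 'past')
    vle_logins_per_week: float   # current engagement signal
    assignments_submitted: int
    assignments_expected: int

def at_risk(s: StudentRecord) -> bool:
    """Crude weighted score with a static threshold."""
    submission_rate = (s.assignments_submitted / s.assignments_expected
                       if s.assignments_expected else 1.0)
    score = (0.6 * s.prior_attainment                        # the past dominates
             + 0.25 * min(s.vle_logins_per_week / 5, 1.0)
             + 0.15 * submission_rate)
    return score < 0.6

# A student with weak entry grades but full current engagement is still flagged:
print(at_risk(StudentRecord(prior_attainment=0.3,
                            vle_logins_per_week=6,
                            assignments_submitted=4,
                            assignments_expected=4)))   # True
```

The ‘fresh start’ simply isn’t representable here: however the student behaves now, the weighting of what is already known about them drags the score down.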
Learning analytics can furnish teachers with information on the quality of educational content and activities they are providing, and on teaching and assessment processes.
Identifying a problem is great and useful, but solving that problem is even more important. Can learning analytics help here? Also, this suggests that the quality of educational content and activities is fundamentally based on the – what, ability? – of the teacher, rather than the institutional pressures that teacher is under: things like the TEF disrupting good practices, austerity, the socio-economic climate, funding, university overcrowding, lack of resources, etc. It seems perhaps a little too good to be true.
Benefits for learners include giving students better information on how they are progressing and what they need to do to meet their educational goals, which has the potential to transform learning and their understanding of how they learn by providing continual formative feedback.
That is, unless some of the things students are doing – and possibly not doing well – aren’t trackable: how well a student takes notes while they’re reading, for example, is a pretty big deal for my students. How will formative feedback be provided there? Tyranny of assessment, tyranny of the evidence base. And, if automated, couldn’t this be demotivating?
Adaptive learning systems are emerging to help students develop skills and knowledge in a more personalised way; “set to revolutionise the teaching of basic skills and the provision of educational content” (p. 13).
Basic skills? Such as… I don’t know enough about STEM to know if that would work there, but within HASS I can’t think of many basic skills that this could help with. Critical thinking, synthesising diverse opinions, forming an argument, developing a critical voice, clarity of expression – would these be ‘basic skills’? It feels like quite a narrow view of what HE learning might be; in my experience, both as a student and as a librarian, it isn’t just box-ticking.
March 17, 2017 at 08:47AM
Facebook and Visibility
March 11, 2017 at 03:26PM
Algorithms in library catalogue results
March 10, 2017 at 07:31AM
Included because – from my perspective as an academic librarian – the way in which library catalogues (or discovery layers) order results is absolutely crucial. I have a lot of anecdotal evidence to suggest that if something isn’t in the first few results, students will assume we don’t have it; decent relevancy ranking has a genuine impact on students’ ability to research, with clear implications for their learning.
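For anyone wondering what ‘the way results are ordered’ actually means in practice, here’s a toy sketch of a term-weighted relevance score. It’s deliberately crude and entirely my own illustration – real discovery layers use far richer (and usually proprietary) signals – but it shows how a scoring function fixes the order students see:

```python
# Toy relevance ranking over a handful of made-up catalogue records.
# Real discovery layers weight many more signals (fields, facets, usage data);
# this only illustrates that a scoring function determines result order.

import math
from collections import Counter

records = {
    "Learning analytics in higher education": "learning analytics higher education jisc report",
    "Big data and education": "big data education research methods",
    "The rise of Big Data": "big data learning media technology",
}

def tokenize(text: str) -> list[str]:
    return text.lower().split()

def score(query: str, doc: str, all_docs: list[str]) -> float:
    """Query term frequency, weighted by how rare each term is in the collection."""
    doc_terms = Counter(tokenize(doc))
    total = 0.0
    for term in tokenize(query):
        doc_freq = sum(1 for d in all_docs if term in tokenize(d))
        if doc_freq:
            total += doc_terms[term] * (math.log(len(all_docs) / doc_freq) + 1.0)
    return total

query = "learning analytics"
docs = list(records.values())
for title in sorted(records, key=lambda t: score(query, records[t], docs), reverse=True):
    print(title)
```

Change the weighting (or which fields get indexed) and the order changes – and for a student who only ever looks at the first screen of results, that scoring function effectively decides what the library ‘has’.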
Three challenges of big data according to Eynon:
Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. https://doi.org/10.1080/17439884.2013.771783
March 12, 2017 at 12:16PM
Week 8 reading
March 12, 2017 at 12:03PM