Lifestream, Pocket, Informing Pedagogical Action

Excerpt:

Informing Pedagogical Action

Aligning Learning Analytics With Learning Design

First published March 12, 2013 (research article)

This article considers the developing field of learning analytics and argues that to move from small-scale practice to broad scale applicability, there is a need to establish a contextual framework that helps teachers interpret the information that analytics provides. The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action.

via Pocket http://ift.tt/2nh0g1k


The basic premise of the article is:

Why do we need this framework?

To date, learning analytics studies have tended to focus on broad learning measures such as student attrition (Arnold, 2010), sense of community and achievement (Fritz, 2011), and overall return on investment of implemented technologies (Norris, Baer, Leonard, Pugliese, & Lefrere, 2008). However, learning analytics also provides additional and more sophisticated measures of the student learning process that can assist teachers in designing, implementing, and revising courses. (p. 1441)

Within learning design, research approaches such as focus group interviews are often used to inform the redesign of courses and learning activities. The authors suggest that using analytics overcomes the data inaccuracy that can be associated with focus-group-style research, as such approaches rely on self-reporting and accurate recollection of details by participants. However, they note that interpreting LA data against pedagogical intention is challenging, and propose a framework – “checkpoint and process analytics” – for evaluating learning design.

Checkpoint and Process Analytics

In the proposed framework, two types of analytics are utilised (illustrated in the article’s learning design diagram by circles and crosses in the final column):

  1. Checkpoint analytics: “the snapshot data that indicate a student has met the prerequisites for learning by accessing the relevant resources of the learning design” (p. 1448). This type of data can be used during course delivery to ascertain whether learners have accessed the required materials and are progressing through the intended learning sequence, and to prompt ‘just in time’ support (reminders, encouragement) when learners have not engaged with required steps.
  2. Process analytics: “These data and analyses provide direct insight into learner information processing and knowledge application (Elias, 2011) within the tasks that the student completes as part of a learning design.” (p. 1448)

Again, these data could support interventions when students are involved in group work: if patterns of interaction diverge from those intended (unequal participation revealed through social network visualisation, for example), the teacher can step in.
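To make the distinction concrete, here is a minimal sketch in Python of what each type of analytic might look like in practice. The data structures and names are invented for illustration – they don’t come from the article or from any real LMS API: a checkpoint query flags students who have not accessed required resources, and a crude process measure flags unequal participation in a group discussion.

```python
from collections import Counter

# Hypothetical access log and discussion data, invented for illustration --
# not taken from the article or from any real LMS.
access_log = {
    "alice": {"week1_reading", "week1_video"},
    "bob": {"week1_reading"},
    "carol": set(),
}
required_resources = {"week1_reading", "week1_video"}


def checkpoint_flags(log, required):
    """Checkpoint analytics: per student, the required resources not yet accessed."""
    return {student: required - seen
            for student, seen in log.items()
            if required - seen}


def participation_shares(posts, group_members):
    """Process analytics (crude proxy): each member's share of discussion posts."""
    counts = Counter(posts)
    total = max(len(posts), 1)
    return {member: counts.get(member, 0) / total for member in group_members}


discussion_posts = ["alice", "alice", "alice", "alice", "bob", "alice"]

print(checkpoint_flags(access_log, required_resources))
# {'bob': {'week1_video'}, 'carol': {'week1_reading', 'week1_video'}}
# -> a cue for a 'just in time' reminder to bob and carol

print(participation_shares(discussion_posts, ["alice", "bob", "carol"]))
# {'alice': 0.83..., 'bob': 0.16..., 'carol': 0.0}
# -> participation diverges sharply from an equal split: a cue to intervene
```

A real implementation of process analytics would draw on much richer data – the authors point to social network visualisation of interaction patterns – but even a post-count proxy shows how divergence from intended patterns could trigger an intervention.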

My Reactions

On the one hand, this application of LA interests me because it puts LA into the work that I do as a teacher rather than at an institutional level. It feels more ‘real’ in that its focus is on pedagogy rather than the broad strokes of ‘student experience’. The institutional use of LA can sometimes seem to frame teachers as service providers and reflect the commodification of education. In contrast, this application seems like a teaching tool (with the caveat that checkpoint analytics may be seen to adopt a transactional view of learning).

However, I’m cautious, because any plan to monitor and direct patterns of interaction is underpinned by assumptions about what effective learning looks like, and the ability to automate such monitoring and intervention through LA could enable blind adherence to a particular view of learning. Of course, even without LA we use such assumptions in our teaching: in the face-to-face classroom a teacher monitors group work and intervenes when students seem off task or are not communicating with each other as intended. Such interaction and engagement with tasks can (as the authors note) be more difficult to monitor in online learning, so LA could be a helpful tool for teachers online, informing task setup and the choice of technological tools.

In this sense, I would be very interested in utilising the analytics approach outlined – but I would be much less interested in it being used as an evaluative tool of my teaching, if, for example, it were based on a departmental ‘ruling’ about the types of interactions deemed to be supportive of learning, and very much less interested in using it as part of student assessment, wherein students were expected to conform to particular models of interaction in order to be ‘successful’ (see, for example, MacFarlane, 2015). As with all analytics and algorithms, the danger seems to be in the application.

4 Replies to “Lifestream, Pocket, Informing Pedagogical Action”

  1. ‘but I would be much less interested in it being used as an evaluative tool of my teaching, if, for example, it were based on a departmental ‘ruling’ about the types of interactions deemed to be supportive of learning’

    Indeed, I can’t help thinking that this kind of system, as I understand it, would largely benefit an outside ‘viewer’ looking in on the pedagogical situation much more than it would benefit a teacher. It seems that this ‘accounting’ for the unknown, unseen, unmeasured ‘successful teaching’ is really what this kind of system wants to expose – but who stands to benefit? Probably the teachers who teach most closely to predetermined measures of success, like student test scores, or even subsequent employment.

    And worryingly close to ‘academic analytics’, isn’t it? Teachers may be the next group subjected to promises of intensive data-infused efficiency measures…

    You might also be interested in this project, Learning Designer: http://tel.ioe.ac.uk/wp-content/uploads/2012/07/LearningDesigner.pdf

    Lots of these projects seem to talk about ‘empowering’ teachers, yet there also tends to be a condition that they have to respond to (often hidden) data analysis.

  2. Thanks for the link, Jeremy – it’s good to be reminded of it. I came across Laurillard’s Conversational Framework when I was doing IDEL, and then Learning Designer (the demo software) and the Pedagogical Pattern Collector. They’re interesting tools – and helpful for reflection on the choices we make when planning learning.

    “And worryingly close to ‘academic analytics’, isn’t it? Teachers may be the next group subjected to promises of intensive data-infused efficiency measures…”
    Indeed. All hail accountability… (but when the TEF is rolled out and universities only offer courses with high rankings – which are influenced by graduate earnings – there won’t be any (new) teachers to deliver the promise to… 😉)
