@james858499 if y'all believe that,you academics better also be data scientists,or be happy w/corporate interests running 'research'#mscedc
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@notwithabrush @nigelchpainting I just got my 2nd: for baby carrier vests&rucksacks, in German.With a carrier, are you transhuman? #mscedc
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@RossGarnerGP #mscedc alternative-give ss own'ship of data,let them filter it&use LA as a metacognitive tool for SDL https://t.co/ZncesdldUf
— Renée Hann (@rennhann) March 17, 2017
In ‘Learning Analytics as a Metacognitive Tool’, Durall & Gros (2014) argue for making data more transparent to students, and for providing visualisation tools which students can (selectively) use to view their data. In this way, students can make data more meaningful by focusing only on those metrics which they value. The authors align such an approach with Human-Data Interaction:
According to Haddadi et al. (2013) “The term Human-Data-Interaction (HDI) arises from the need, both ethical and practical, to engage users to a much greater degree with the collection, analysis, and trade of their personal data, in addition to providing them with an intuitive feedback mechanism” (p.3).
Durall & Gros, 2014, p. 382
Further, they suggest that giving students access to and control of their data in this way can support self-directed learning (SDL) and self-regulated learning (SRL). I agree that it can – but we need to support the development of this kind of learner so that such tools can be effective, and to establish support networks for students who prefer more communal approaches to learning or seek greater direction and instruction. Pedagogically, it’s a much more exciting way of looking at LA, to me.
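As a thought experiment (mine, not Durall & Gros’s), letting students filter their own data could be as simple as computing only the metrics a learner has opted in to. A minimal sketch, assuming a simple event log; the metric names and log format are invented:

```python
# Hypothetical student-controlled dashboard: only the metrics the student
# has chosen are computed and shown - Durall & Gros's idea of selective,
# learner-owned visualisation. Metric names and log format are invented.

METRICS = {
    "forum_posts": lambda log: sum(1 for e in log if e["action"] == "post"),
    "resources_viewed": lambda log: sum(1 for e in log if e["action"] == "view"),
    "logins": lambda log: sum(1 for e in log if e["action"] == "login"),
}

def my_dashboard(log, chosen):
    """Compute only the metrics this student has opted in to."""
    return {name: METRICS[name](log) for name in chosen if name in METRICS}

log = [{"action": "login"}, {"action": "view"}, {"action": "post"}]
print(my_dashboard(log, chosen=["forum_posts"]))  # {'forum_posts': 1}
```

The point is less the code than the locus of control: the student, not the institution, decides which measures are meaningful enough to surface.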
Here are the slides to go with the paper:
Lifestream, Aside
Something is ‘up’ with my husband’s FB recommendation algorithm. I’m all for its move to ‘female’ products (Amazon is also recommending him female shoes, as noted here), but it could at least be things I would want him to buy me…
Or… maybe I don’t know him as well as I thought, and it’s what he’s always wanted… 😉
Lifestream, Pocket, Informing Pedagogical Action
Excerpt:
Informing Pedagogical Action: Aligning Learning Analytics With Learning Design
First published March 12, 2013 (research article)
Abstract
This article considers the developing field of learning analytics and argues that to move from small-scale practice to broad scale applicability, there is a need to establish a contextual framework that helps teachers interpret the information that analytics provides. The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action.
via Pocket http://ift.tt/2nh0g1k
Why do we need this framework? The basic premise of the article:
To date, learning analytics studies have tended to focus on broad learning measures such as student attrition (Arnold, 2010), sense of community and achievement (Fritz, 2011), and overall return on investment of implemented technologies (Norris, Baer, Leonard, Pugliese, & Lefrere, 2008). However, learning analytics also provides additional and more sophisticated measures of the student learning process that can assist teachers in designing, implementing, and revising courses (p. 1441)
Within learning design, research approaches such as focus group interviews are often used to inform the redesign of courses and learning activities. The authors suggest that using analytics overcomes the data inaccuracy that can be associated with focus-group-style research, as such approaches rely on self-reporting and participants’ accurate recollection of details. However, they note that interpreting LA data against pedagogical intention is challenging, and propose a framework – “checkpoint and process analytics” – for evaluating learning design.
Checkpoint and Process Analytics
In the proposed framework, two types of analytics (illustrated in the diagram above by circles and crosses in the final column) are utilised:
- checkpoint analytics: “the snapshot data that indicate a student has met the prerequisites for learning by accessing the relevant resources of the learning design” (p. 1448). This type of data can be used during course delivery to ascertain whether learners have accessed the required materials and are progressing through the intended learning sequence, and to prompt ‘just in time’ support (reminders, encouragement) when learners have not engaged in required steps.
- process analytics: “These data and analyses provide direct insight into learner information processing and knowledge application (Elias, 2011) within the tasks that the student completes as part of a learning design.” (p. 1448)
Again, such data could support interventions in group work: if patterns of interaction diverge from those intended (unequal participation revealed through social network visualisation, for example), the teacher can step in.
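To make the distinction concrete for myself (this sketch is mine, not the authors’), here is roughly what the two types might look like against a toy VLE clickstream. The event log format and function names are all invented:

```python
from collections import Counter

# Hypothetical clickstream: (student, action, target) events from a VLE.
events = [
    ("amy", "viewed", "week3_reading"),
    ("amy", "posted", "group_forum"),
    ("ben", "posted", "group_forum"),
    ("ben", "posted", "group_forum"),
    ("cal", "viewed", "week3_reading"),
]

def checkpoint_report(events, required=("week3_reading",)):
    """Checkpoint analytics: which required resources has each student
    not yet accessed? Snapshot data that could trigger a 'just in time'
    reminder."""
    students = {s for s, _, _ in events}
    viewed = {(s, t) for s, a, t in events if a == "viewed"}
    return {s: [r for r in required if (s, r) not in viewed] for s in students}

def participation_balance(events, forum="group_forum"):
    """A crude stand-in for process analytics: how evenly are forum
    contributions spread across the group? (Real process analytics would
    use, e.g., social network analysis of who replies to whom.)"""
    posts = Counter(s for s, a, t in events if a == "posted" and t == forum)
    counts = list(posts.values())
    return max(counts) / min(counts) if counts else None

print(checkpoint_report(events))      # ben has not viewed week3_reading
print(participation_balance(events))  # 2.0: ben posts twice as often as amy
```

Even this toy version shows the difference in kind: checkpoints are binary gates (accessed or not), while process analytics look for patterns in how students work.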
My Reactions
On the one hand, this application of LA interests me because it puts LA into the work that I do as a teacher, rather than at an institutional level. It feels more ‘real’ in that its focus is on pedagogy rather than the broad strokes of ‘student experience’. The institutional use of LA can sometimes seem to frame teachers as service providers and reflect the commodification of education. In contrast, this application seems like a teaching tool (with the caveat that checkpoint analytics may be seen to adopt a transactional view of learning).

However, I’m cautious, because any plan to monitor and direct patterns of interaction is underpinned by assumptions about what effective learning looks like, and the ability to automate such monitoring and intervention through LA could enable blind adherence to a particular view of learning. Of course, even without LA we work from such assumptions in our teaching: in the face-to-face classroom, a teacher monitors group work and intervenes when students seem off task or are not communicating with each other as intended. Such interaction and engagement with tasks can (as the authors note) be more difficult to monitor in online learning, so LA could be a helpful tool for teachers online, informing both task setup and the choice of technological tools.

In this sense, I would be very interested in utilising the analytics approach outlined – but much less interested in it being used as an evaluative tool of my teaching if, for example, it were based on a departmental ‘ruling’ about the types of interactions deemed supportive of learning, and very much less interested in its use in student assessment, wherein students were expected to conform to particular models of interaction in order to be ‘successful’ (see, for example, Macfarlane, 2015). As with all analytics and algorithms, the danger seems to lie in the application.
Lifestream, Tweets
@james858499 Wld like to see more process LAs informing L-ing design&less check-point LAs matched to grades/satisfaction scores #mscedc
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@james858499 before I was a magician,now a scientist 😉 More seriously,depends on type of analytics,& what's valued in institution?#mscedc
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@BenPatrickWill @helenwalker7 #mscedc luddite ref is interesting:their concern was own'ship of machines&for many LALs,it's own'ship of data
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@dabjacksonyang @Eli_App_D @nigelchpainting given that choice, think I'd go for yoga. Just sayin'..#mscedc
— Renée Hann (@rennhann) March 17, 2017
Lifestream, Tweets
@c4miller @james858499 @HerrSchwindenh_ or each others' handles when replying😲 #mscedc
— Renée Hann (@rennhann) March 17, 2017