I discovered this article by Audrey Watters: How data and analytics can improve education (July 25, 2011), in which Watters interviews George Siemens about the “possibilities and challenges for data, teaching, and learning.” Siemens points to the use of LMSs in higher education and notes that social networks (like Twitter) can be used in the same way, though they could raise privacy issues since they are public platforms (Watters 2011).
Siemens also points to the significant amount of data LMSs like Desire2Learn can capture (Watters 2011). When I taught at Durham College this past fall, I discovered just how much data is available to teachers using this LMS. Prior to becoming a teacher (in higher ed.), I had only experienced LMSs from a student’s perspective; it was interesting to see it from a teacher’s point of view. For instance, I was intrigued to see that Desire2Learn (rebranded as ‘DC Connect’ at my College) told me whether or not a student had opened content I posted on the site. I leveraged this information to track my students’ progress during study units. This information was not in-depth, though: it revealed only whether my students had opened the content (pointing to possible engagement with it), not whether they had actually read it, which matters much more to learning.
So, is this information useful? I posit that it did help…a little…because I used this data as a sort of warning sign; I could identify students who were falling behind by seeing that they had not opened any content for four weeks, for example. This was not always accurate, however, as I had a few students who rarely opened the content I posted but, in fact, did very well in the course. Knox (2014) reminds us that learning analytics in education “makes visible the invisible” and that we can interpret the results as we wish. Did these students work with their friends and get the necessary content from their peers? Did they share the content with each other via social networks? It is hard to know what happened in every case. I must admit, when I first investigated the data provided on the LMS, my reaction on seeing that students had not opened content was to assume they were not engaging. After talking to a few students, I gathered more qualitative data, which led me to change my mind about some of them (some had extenuating circumstances, job commitments, etc.). I suppose the take-away here is that communication is key!
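For the curious, the “warning sign” rule I describe above could be sketched roughly like this. This is purely a hypothetical illustration: the data structure, student names, and the four-week threshold are my own assumptions, not Desire2Learn’s actual reporting format.

```python
from datetime import date, timedelta

def flag_inactive(last_access, today, threshold=timedelta(weeks=4)):
    """Return students whose last content access is older than the
    threshold, or who never opened any content at all (None)."""
    flagged = []
    for student, last_seen in last_access.items():
        if last_seen is None or today - last_seen > threshold:
            flagged.append(student)
    return flagged

# Illustrative data: one recently active student, one silent since
# January, one who never opened anything.
accesses = {
    "student_a": date(2015, 3, 1),
    "student_b": date(2015, 1, 10),
    "student_c": None,
}
print(flag_inactive(accesses, today=date(2015, 3, 8)))
# prints ['student_b', 'student_c']
```

As my experience showed, a flag like this is only a prompt to start a conversation, not evidence of disengagement.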
Siemens, quoted in Watters (2011), suggests that “authentic” interactions occur more often on social networks than on LMSs, where student participation is “purposeful”. Siemens (in Watters 2011) also mentions the Hawthorne effect; I found this idea fascinating because I can relate this ‘effect’ to my own lifestream blog. I do feel as though I modify my behaviour here on the blog because I am “aware of being observed”. I have struggled with this lifestream blog and have (sometimes) felt paralyzed to post things because I am afraid of making mistakes or of not sounding ‘scholarly’ enough. Perhaps my students at Durham College felt the same way?
Siemens (in Watters, 2011) asks this important question:
“How much should learners know about the data being collected and analyzed?”
And here is his answer:
“I believe that learners should have access to the same dashboard for analytics that educators and institutions see. Analytics can be a powerful tool in learner motivation — how do I compare to others in this class? How am I doing against the progress goals that I set? If data and analytics are going to be used for decision making in teaching and learning, then we need to have important conversations about who sees what and what are the power structures created by the rules we impose on data and analytics access” (Siemens quoted in Watters 2011).
In my own case, I would find it motivating to be able to see U of Edinburgh Moodle analytics about my work, but for my students at Durham College, I found this was not a source of motivation (for most). At many points throughout the semester, I informed my class that I could see who was viewing content and who wasn’t; this may sound like a threat, but I intended it not as one but as a gentle reminder that I was watching and paying attention to what they were doing (or not doing). As a new teacher (in higher ed), I know I have a lot to learn about learning analytics and about how they relate to student motivation.
Knox, J. (2014). Abstracting Learning Analytics. Code Acts in Education ESRC seminar series blog. http://codeactsineducation.wordpress.com/2014/09/26/abstracting-learning-analytics/
Watters, A. (2011, July 24). How data and analytics can improve education. https://www.oreilly.com/ideas/education-data-analytics-learning