My experience of learning analytics is fairly rudimentary. The tools I have built into the Academy I manage let me look at data from the macro level down to individual learner completions. I don’t have sophisticated tracking of learner navigation, such as click-through rates or time spent on task, immediately to hand, although some of this information is being recorded, and looking at the subject in more detail this week has prompted me to look again at data that could provide valuable insights.
Siemens makes the point that “To be effective, holistic, and transferable, future analytics projects must afford the capacity to include additional data through observation and human manipulation of the existing data sets”. The additional data and human observation I find most valuable in my own professional practice are the insights gained from the social tools I have built into the Learning Academy. Discussion and blog comments augment and add colour to the otherwise ‘dry’ learning analytics data. Together, these two resources enable me to “incorporate […] feedback into future design of learning content”, and in some cases into existing content.
I think the output of learning analytics alone, without the additional layer of human-created metadata, would not provide me with sufficient information to judge what learning had taken place or how effective the materials were. As Siemens suggests, “The learning process is creative, requiring the generation of new ideas, approaches, and concepts. Analytics, in contrast, is about identifying and revealing what already exists”, and “The learning process is essentially social and cannot be completely reduced to algorithms.”
“Prior to launching a project, organizations will benefit from taking stock of their capacity for analytics and willingness to have analytics have an impact on existing processes.” I wonder how often this happens in reality. The business I work for is both purpose and numbers driven, the strategy (hope) being that the former drives the latter. There is certainly a willingness to react to analytics in all aspects of the business, whether that be customer satisfaction scores, unit sales or learning and development provision. In my view there is also a danger in reacting immediately to analytics: strategy is a long-game activity, and cultural and other changes can take months or even years to take effect.
Privacy and scope
Siemens raises some important issues around privacy and scope: “Distributed and fragmented data present a significant challenge for analytics researchers. The data trails that learners generate are captured in different systems and databases. The experiences of learners interacting with content, each other, and software systems are not available as a coherent whole for analysis.” I’ve attempted to combat this by integrating everything into one platform, with a single view of the learner. Where this hasn’t been possible we’ve gone to great lengths to implement a single sign-on solution, which is not only easier and more convenient for the learner but also helps avoid some of the issues Siemens raises.
From a privacy perspective I’ve implemented as open a model as the available data allows. I’d love to do more to personalise learning for individual learners but, as with all commercial operations, this comes back to the three levers of cost, time available and quality achievable. However, our learners can interrogate their own learner record, and they have an achievements wall where all learning completed online is tracked, along with any external achievements the learner wishes to add manually. They can also see how their achievements compare with those of their peers. In this respect learners can “see what the institution sees”.
All references are from