Month: March 2017

#mscedc I made a quick video artefact in honour of our tweetorial 😀 https://t.co/E0Wkja7PxA

from http://twitter.com/notwithabrush

via IFTTT https://t.co/E0Wkja7PxA http://twitter.com/notwithabrush/status/847220499532984322

Article: How data and analytics can improve education

I discovered this article by Audrey Watters, How data and analytics can improve education (July 25, 2011), in which Watters interviews George Siemens about the “possibilities and challenges for data, teaching, and learning.” Siemens points to the use of LMSs in higher education and notes that social networks (like Twitter) can be used in the same way, though they could raise privacy issues since they are public platforms (Watters 2011).

Siemens also points to the significant amount of data that LMSs like Desire2Learn can capture (Watters 2011). When I taught at Durham College this past fall, I discovered just how much data is available to teachers using this LMS. Prior to becoming a teacher (in higher ed.), I had only experienced LMSs from a student’s perspective, so it was interesting to see one from a teacher’s point of view. For instance, I was intrigued to see how Desire2Learn (rebranded as ‘DC Connect’ at my College) told me whether or not a student had opened content I posted on the site. I leveraged this information to track my students’ progress during study units. This information was not in-depth, however: it revealed only that my students had opened the content, which points to possible engagement, but it did not tell me whether they had actually read it (much more important to learning).

So, is this information useful? I posit that it did help…a little…because I used this data as a sort of warning sign; I could identify students who were falling behind by seeing, for example, that they had not opened any content for four weeks. This was not always accurate, however, as I had a few students who rarely opened the content I posted but, in fact, did very well in the course. Knox (2014) reminds us that learning analytics in education “makes visible the invisible” and that we can interpret the results as we wish. Did these students work with their friends and get the necessary content from their peers? Did they share the content with each other via social networks? It is hard to know what happened in every case. I must admit, when I first investigated the data provided on the LMS, my reaction on seeing that students had not opened content was that they were not engaging. After talking to a few students, I was able to gather more qualitative data, which led me to change my mind about some of them (some had extenuating circumstances, job commitments, etc.). I suppose the take-away here is that communication is key!

Siemens, quoted in Watters (2011), suggests that “authentic” interactions occur more often over social networks than on LMSs, because student participation on social networks is “purposeful”. Siemens, in Watters (2011), also mentions the Hawthorne effect; I found this idea fascinating because I can relate this ‘effect’ to my own lifestream blog. I do feel as though I modify my behaviour here on the blog because I am “aware of being observed”. I have struggled with this lifestream blog and have (sometimes) felt paralyzed to post things because I am afraid of making mistakes or of not sounding ‘scholarly’ enough. Perhaps my students at Durham College felt the same way?

Image source: http://www.serialoptimist.com/creators/so-much-love-dance-like-nobodys-watching-at-the-airport-12319.html

Siemens (in Watters, 2011) asks this important question:

“How much should learners know about the data being collected and analyzed?”

And here is his answer:

“I believe that learners should have access to the same dashboard for analytics that educators and institutions see. Analytics can be a powerful tool in learner motivation — how do I compare to others in this class? How am I doing against the progress goals that I set? If data and analytics are going to be used for decision making in teaching and learning, then we need to have important conversations about who sees what and what are the power structures created by the rules we impose on data and analytics access” (Siemens quoted in Watters 2011).

In my own case, I would find it motivating to see U of Edinburgh Moodle analytics about my work, but for my students at Durham College, I found this was not a source of motivation (for most). At many points throughout the semester, I informed my class that I could see who was viewing content and who wasn’t; this may sound like a threat, but I intended it as a gentle reminder that I was watching and paying attention to what they were doing (or not doing). As a new teacher (in higher ed), I know I have a lot to learn about learning analytics and about how they relate to student motivation.


References

Knox, J. (2014). Abstracting Learning Analytics. Code Acts in Education ESRC seminar series blog. http://codeactsineducation.wordpress.com/2014/09/26/abstracting-learning-analytics/

Watters, A. (2011, July 24). How data and analytics can improve education. Retrieved from https://www.oreilly.com/ideas/education-data-analytics-learning

 

@nigelchpainting @Tauraco OMG this spam is getting really strange! Too funny 😂 “reinforcing awareness never does any damage” ??? #mscedc

from http://twitter.com/notwithabrush
via IFTTT http://ift.tt/2nbgV2V http://twitter.com/notwithabrush/status/846472440905654273

@fleurhills @helenwalker7 I think it’s 2000 +/- 10% unless it’s an artefact then it must be equivalent to word count somehow -? #mscedc

from http://twitter.com/notwithabrush
via IFTTT http://ift.tt/2mGGcG1 http://twitter.com/notwithabrush/status/846230125335203841

More on Tweetorial Analysis

Questions from EDC:

  1. How has the Twitter archive represented our Tweetorial?
  2. What do these visualisations, summaries and snapshots say about what happened during our Tweetorial, and do they accurately represent the ways you perceived the Tweetorial to unfold, as well as your own contributions?
  3. What might be the educational value or limitations of these kinds of visualisations and summaries, and how do they relate to the ‘learning’ that might have taken place during the ‘Tweetorial’?

The Twitter archive for #mscedc has represented our Tweetorial event largely in terms of quantitative data, as I noted in another post:

“Besides providing us with quantitative data (top users, top words, top URLs, source of tweets, language, volume over time, user mentions, hashtags, images and influencer index), what meaning do these numbers give us and what is the significance?” (quote taken from my other blog post found HERE)

But what are we missing? What about the qualitative side? For instance, we know that Philip had the greatest number of tweets, and that Angela did not participate (she did not contribute any tweets during this time). What does this say about these two learners? Given the numbers, did Philip learn more than Angela? This is a tough question, and one we cannot answer with certainty, because Angela may have learned a lot – perhaps she followed every tweet in our Tweetorial and read all the links, etc. If so, then Angela probably gained plenty of knowledge and experience from the Tweetorial. It is difficult to judge, simply from analysing the metrics, whether knowledge was gained or learning took place.

In looking at my own contributions during our Tweetorial, I found the analytics section on my own Twitter profile (something I hadn’t really investigated before).

Here are some screenshots I took of the data:

From this data, I see that from day one of the Tweetorial to day two (Mar. 16-17) I gained more ‘likes’, engagements and clicks. Does this mean that I was beginning to learn more, or that I was gaining more influence in our #mscedc Tweetorial community? I’m not sure, but it is certainly interesting (and exciting) to see the progression via analytical information.

And here are some screenshots from the Audience section of my Twitter analytics:

From this data, it is interesting to note the gender split: the percentage of males is higher than that of females (what does this say/mean?)… And the largest share of my audience is outside my age range (I’m engaging a younger audience!). There are no surprises, however, in that the highest percentage of my audience comes from Canada (where I live). I credit the 16% UK audience engagement to being a student at U of Edinburgh (obviously!).


I wonder how this use of analytical data could be applied to my primary professional practice of teaching figure skating; how can a physical (kinesthetic learning) sport be analysed via Twitter?

I sometimes have my figure skating students track their jump attempts in a log over a period of time. They perform either five or ten of the same jump, then record how many of those they landed successfully (‘successfully’ meaning they landed on one foot, with minimal shakiness and with adequate flow, etc.). At the end of a month or so, my skaters can see if they have improved their consistency with a particular jump – the data reveals trends and results that are beneficial for learning and for their own progression in the sport. Perhaps, in future, I could have my students incorporate Twitter in this exercise to keep a digital analytical log of their jump progress!
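The jump log described above could even be kept digitally with something as simple as the following sketch. The dates, jump name and entries here are entirely hypothetical, just to illustrate how attempts and landings would be recorded and turned into a success rate over time:

```python
from datetime import date

# Hypothetical jump log: each entry records a practice date, the jump
# being tracked, the attempts made, and the clean landings.
log = [
    {"day": date(2017, 3, 1),  "jump": "double axel", "attempts": 10, "landed": 3},
    {"day": date(2017, 3, 8),  "jump": "double axel", "attempts": 10, "landed": 5},
    {"day": date(2017, 3, 15), "jump": "double axel", "attempts": 5,  "landed": 3},
    {"day": date(2017, 3, 22), "jump": "double axel", "attempts": 10, "landed": 7},
]

def success_rates(entries, jump):
    """Return (date, landed/attempts) pairs for one jump, in date order."""
    rows = sorted((e for e in entries if e["jump"] == jump), key=lambda e: e["day"])
    return [(e["day"], e["landed"] / e["attempts"]) for e in rows]

# Print the month's trend so the skater can see their consistency develop.
for day, rate in success_rates(log, "double axel"):
    print(day, f"{rate:.0%}")
```

A skater (or coach) glancing at the printed rates would see the trend at once, which is exactly what the paper log is meant to reveal.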

Knox (2014) points to the significance of how we interpret learning analytics and of how we ‘frame’ the results. Coming back to my example of jump tracking in figure skating, it is important to note that basing achievement simply on the number of successful landings is not always accurate. For instance, I have a skater who is trying to progress from a double jump (two rotations in the air) to a triple jump (three rotations in the air). Going from a double to a triple is a big step up and, in some cases, can take years to master. If this skater tracked their triple jump attempts and found they were ‘unsuccessful’ over a period of time (i.e. they landed zero out of five), I could take this at face value and conclude that this skater was not progressing. This conclusion is not always accurate, though, because what if those jump attempts were (fairly) well done and technically correct? What if that skater obtained the full rotation in the air (a positive thing), but just couldn’t quite get the landing right? This long-winded example tells me that the numbers cannot always convey an accurate representation of learning or of so-called ‘successful’ results.

Source: https://giphy.com/gifs/figure-skating-IHA5KHBVAOQww
Source: https://giphy.com/gifs/figure-skating-vevk7mPcs39ZK

I will end with a fantastic quotation from Pegrum (2010) in reference to digital networks:

“Digital resources are distributed across countless sites, services and channels, and can encompass material which students have located and evaluated; collections they have tagged and curated; and even artefacts they have individually or collaboratively created.”

I feel that our Tweetorial encompassed Pegrum’s observation and demonstrated learning and engagement through the theories of connectivism and social constructivism, in which the teacher’s role becomes one of a facilitator who shapes the discussion (Pegrum 2010). As Jeremy and James posted questions during the Tweetorial, they were shaping our discussion and leading it in certain directions to keep with our course topics. We did get derailed at some points, though, with the endless yet hilarious posts about cheese!


References

Knox, J. (2014). Abstracting Learning Analytics. Code Acts in Education ESRC seminar series blog. http://codeactsineducation.wordpress.com/2014/09/26/abstracting-learning-analytics/

Pegrum, M. (2010). I link, therefore I am: Network literacy as a core digital literacy. E-Learning and Digital Media, 7(4): 346-354.

#mscedc check this out! Good to know my poor and ‘bothersome’ spelling isn’t preventing spammers from visiting my b
 https://t.co/sxGhmU9BzD

Where does this spam come from? I’ve deleted four more spam comments from my blog just this morning. I found a helpful article HERE about spam comments on blogs.

from http://twitter.com/notwithabrush
via IFTTT https://t.co/sxGhmU9BzD http://twitter.com/notwithabrush/status/846031176808775680

Week 10 Summary: Mar. 20-26

This week I was thankful to be able to finally attend the group Hangouts tutorial on Mar. 21.

After missing the other tutorials due to work commitments, I was grateful to have the chance to listen to my colleagues in EDC and to make small contributions to the discussion. I also wrote a couple of Tweetorial analysis posts HERE and HERE. I found it difficult to come up with a scholarly analysis of our Tweetorial, but was thankful for my classmates’ excellent observations in their blog posts.

In my first Tweetorial analysis post, I discussed the micro-contribution of ‘liking’ tweets and how I feel this act has value because it adds to a sense of community and acknowledges my peers (I see you!).

In my second Tweetorial analysis post, I used my own Twitter analytics from our Tweetorial, which led to some interesting questions surrounding influence, engagement, audience and gender on social networks. I also related the data to my own professional practice of coaching figure skating, using an example of tracking jump attempts and how this data could be used for the betterment of my skaters. I found a great paper by Pegrum (2010) about network literacy as a core digital literacy.

In other random posts, I added a cool video on combining human and tech HERE, posted Keating’s Optimist music for our final EDC project via Spotify HERE, and reminisced about my time teaching skating in Singapore in a post HERE.

I made a brief (and funny) video HERE about Twitter and our community there, and a quick post about our Hangouts tutorial HERE.

Cheers to a great week 10!

Tweetorial Analysis

I see you, I hear you, I acknowledge you.

Source: https://giphy.com/gifs/twitter-10shHccb7Xfn2g/links

I’ve struggled to come up with a unique analysis of our Tweetorial in #mscedc. I am unsure of how to take the data presented in the archive and transform it into some sort of academic critique. Besides providing us with quantitative data (top users, top words, top URLs, source of tweets, language, volume over time, user mentions, hashtags, images and influencer index), what meaning do these numbers give us and what is the significance? Did we, as a class, create any broader connections to each other or to relevant academic work through our participation in the Tweetorial? If we did create connections and relationships through our intensive two days of tweeting, then what can we glean from these connections and relationships, and what is their meaning and value (Eynon 2013)?

In our Hangouts tutorial on March 21, I mentioned my love of ‘liking’ tweets, and how this minuscule, seemingly passive effort is important to me because it is my way of letting my colleagues know that I see them, I hear them, and I acknowledge their efforts and contributions to the Tweetorial event. It is so easy to simply click the heart and ‘like’ a tweet, but I really feel that by doing so, others will (perhaps) feel validated and – dare I say – more confident to keep contributing. I also believe that ‘liking’ provides a sense of belonging, both for me and for those whose tweets I like.

 

Through our ‘data trails’, we did seem to connect through strings of tweets – 140-character digital conversations that created relationships between classmates, professors and outsiders, and that encouraged and produced learning (Siemens 2013). Our conversations directed us to articles, images and to our own EDC lifestream blogs. I took some time to review my classmates’ Tweetorial analysis posts, and have collected them here:

Matthew, Renée, Eli, Colin, Chenée, Clare, Stuart, Daniel 1, Daniel 2, Philip, Helen M 1, Helen M 2, Helen W, Myles, Linzi, Dirk 1, Dirk 2, Cathy, Angela, Nigel

Via Daniel’s blog post, I found the Tags Explorer site, which I plan to use in a video artefact I will create for the Tweetorial event.


References
Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 237-240.

Siemens, G. (2013). Learning Analytics: the emergence of a discipline. American Behavioral Scientist, 57(10): 1380-1400.