Linked from Pocket: The music video that changes each time you click play

The algorithm automatically pulls in short clips from video-sharing sites like YouTube when you hit play on Shaking Chains’ Midnight Oil. The short bits of footage are shown back to back with the band’s track playing over the top.

from Pocket http://ift.tt/2onD3Ym
via IFTTT

After all our talk of algorithms and education, I found this a really nice "smiler", so thought I'd share: a music group using an algorithm to change the experience for each viewer watching their music video. A nice change from algorithms pulling information out; instead, algorithms creating art?
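Just for fun, here's roughly what that kind of clip-shuffling might look like under the hood. This is purely a guessed-at sketch, not Shaking Chains' actual code: the clip pool, the durations and the build_playlist helper are all invented.

```python
import random

# Hypothetical pool of short clips scraped from video-sharing sites:
# (url, duration in seconds). In reality these would be pulled in live.
CLIP_POOL = [
    ("https://example.com/clip-a", 4.0),
    ("https://example.com/clip-b", 2.5),
    ("https://example.com/clip-c", 6.0),
    ("https://example.com/clip-d", 3.5),
]

TRACK_LENGTH = 20.0  # length of the audio track in seconds


def build_playlist(pool, track_length):
    """Pick random clips until their combined length covers the track.

    A fresh random sequence is drawn on every call, which is why the
    video looks different each time you press play.
    """
    playlist, total = [], 0.0
    while total < track_length:
        url, duration = random.choice(pool)
        playlist.append((url, duration))
        total += duration
    return playlist


for url, duration in build_playlist(CLIP_POOL, TRACK_LENGTH):
    print(f"show {url} for {duration}s")
```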

Playing with algorithms

My play with algorithms this week was emphatically dull. I am aware of ad changes connected with my surfing habits, particularly with Amazon, so I was expecting to see a lot more come from a controlled experiment, but alas it was a bit lacklustre, which I suspect is down to my online security habits. My previous job involved supporting people with their digital footprints and making them aware of their computer's security and the potential risks, so out of habit I tend to keep things like cookies locked down. I suspect this is why I didn't experience as much influence from the algorithms as I was expecting. However, I chose to leave my settings as they are and look at this from my real-life perspective.

I chose to look at how my actions on Amazon affect the ads I see elsewhere in my internet world. I am vaguely aware that Amazon shopping trips have resulted in corresponding ads on Facebook in the past, and likewise with Google via search words, so I aimed to deliberately spike things and see what would happen. To do this, I had to ensure that my Amazon searches were for things I would not normally search for, so that I could be sure any results were down to this experiment.

On my lunch break, over a cup of tea and a sarnie, I browsed for ballet slippers on Amazon (the idea came to me after chatting with Linzi, who is a dancer; I am most definitely not). At first the results were unremarkable. I didn't even see ballet slippers come up the next time I logged into Amazon. Epic fail.

MacBook: related searches on Amazon

Again I searched for ballet slippers, and this time I added "pink satin" to the description. I also changed my behaviour: this time I clicked on specific items that came up. This seemed to trigger the Amazon algorithm, which then shows related items based on your previous history (previous history, is that a real thing?). So, result number one.
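As an aside, "related items" features are commonly built on item-to-item co-occurrence, i.e. counting which products tend to appear together in shoppers' histories. Amazon's real system is far more sophisticated and not public; the sketch below is just a toy version of that idea, with made-up browsing sessions.

```python
from collections import Counter, defaultdict

# Hypothetical browsing sessions: each set is the items one shopper viewed.
sessions = [
    {"ballet slippers", "pink satin ribbon", "dance tights"},
    {"ballet slippers", "dance tights", "leotard"},
    {"ballet slippers", "pink satin ribbon"},
    {"leotard", "dance tights"},
]

# Count how often each pair of items shows up in the same session.
co_views = defaultdict(Counter)
for session in sessions:
    for item in session:
        for other in session - {item}:
            co_views[item][other] += 1

# "Customers who viewed this item also viewed..." for one item.
for other, count in co_views["ballet slippers"].most_common(3):
    print(f"{other}: seen together {count} times")
```

Notice that nothing is recommended until you actually interact with items, which would fit with the slippers only appearing once I started clicking on specific products.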


Surface Pro 4: Facebook ad for Amazon

My expectation was that I would now see this filter through, and at the very least see related advertising on things like Facebook. Did I? Well, a little bit of facebooking that evening and nope. There were no changes to my standard side-bar advertising on Facebook, and even the featured ad for Amazon wasn't related to my searches.


OK, so disappointing so far, but what about search engines? Surely the cookies stored on the computer would result in search engines picking up on my search; I know this happens, I've seen it on multiple occasions.

Nope

Surface Pro 4: Google search

About now I was ready to quit. I'm certain I've seen searches spread across platforms before, so why wasn't this working? I gave up for the night and decided to try again before work in the morning.

The next morning, sitting at my desk eating my Shreddies, it all clicked into place: the Google bar instantly gave me pink ballet shoes in my search.

This is when the penny dropped. I was using one computer at home and a different computer at work; the algorithm seemed to be taking effect at work on my MacBook, but not at home on my Surface Pro 4. Cookies! As I mentioned previously, I lock down the cookies on my personal computer, but I am not in charge of the set-up of my work computer, so there it may be slightly more open to cookies, hence why I was seeing ballet slippers appear in Google as well as Amazon. Still nothing on either machine for Facebook, though, so it would appear that only items purchased or added to my wish list cross over into Facebook. It would take more investigation to see whether this works across computers or only on the computer the purchase was made on, but I wasn't buying ballet slippers just to test the theory. I'm now wondering about adding mobile devices to the test…
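A crude way to picture why the two machines behaved differently: a tracking cookie is just an identifier stored in one particular browser, so an ad server only "recognises" requests that arrive carrying it. The toy simulation below (the server, the cookie name and the profile store are all invented) shows the work machine building up a profile while the home machine stays a stranger.

```python
import uuid

# Toy "ad server": maps a cookie ID to the search interests it has seen.
profiles = {}


def serve_page(cookie_jar, search_term=None):
    """Simulate one page request from a particular browser."""
    visitor_id = cookie_jar.get("tracker")
    if visitor_id is None or visitor_id not in profiles:
        # No cookie (blocked, cleared, or a different machine): the
        # server cannot link this request to any earlier visit.
        visitor_id = str(uuid.uuid4())
        cookie_jar["tracker"] = visitor_id
        profiles[visitor_id] = []
    if search_term is not None:
        profiles[visitor_id].append(search_term)
    return profiles[visitor_id]  # the interests an ad would be picked from


work_macbook = {}  # cookie jar kept by the browser on the work machine
home_surface = {}  # a completely separate jar on the home machine

serve_page(work_macbook, "ballet slippers")
print(serve_page(work_macbook))  # ['ballet slippers'] -> targeted ad
print(serve_page(home_surface))  # [] -> no history, generic ad
```

On this picture, locking down cookies simply means the jar is always empty, so every visit looks like a first visit.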

Algorithms produce worlds rather than objectively account for them
(Knox, 2015).


Yup, and in this instance the world it was creating couldn't quite see the full picture. The algorithm knew I'd searched for ballet slippers when I was on the MacBook, because it could read the cookies stored there, but once I was home and on a different computer, with no cookies to read, the algorithm didn't recognise me as part of the world it was building around my shopping habits.

References

Knox, J. (2015) Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1

What to share and what to withhold

If humans are programming algorithms, can we really expect human biases not to affect the algorithm? What about an unconscious bias that we are not aware of ourselves: can we ensure we won't influence the algorithm if we are not aware of our own bias? From another perspective, if we are using big data and algorithms to identify things deemed important in education, for instance the potential failure of a student, what steps do we need to take, if any, to ensure that the data doesn't negatively influence the student or bias those receiving it? The example in this week's reading (Sclater, Peasgood & Mullan, 2016) is the "Course Signals" system at Purdue University, one of the earliest and most-cited learning analytics systems.

Using an algorithm to mine data collected from a VLE, the purpose of Signals is to identify students at risk of academic failure in a specific course. It does this by sorting students into three main outcome types: at high risk, at moderate risk, and not at risk of failing the course. These three outcomes are then represented as traffic lights (red, orange, and green respectively), which serve to provide an early-warning "signal" to both instructor and student (Gašević, Dawson & Siemens, 2014). Few could argue that this is not a noble gesture; the opportunity to intervene before a student fails, and potentially change their path, is every educator's dream. However, with only data and algorithms making these decisions, we run a very real risk of this knowledge adversely influencing the student. What of the student who, for one reason or another, is not in a frame of mind to be told they must work harder? Might being told they are at risk of failing the course nudge them in an unwanted direction, potentially one where the student gives up instead of trying harder? In this instance, surely human intervention in the decision of whether or not students see data about themselves is essential? Are the choices to use technology in this instance for the benefit of the student? Or is it a case of the "warm human and institutional choices that lie behind these cold mechanisms" (Gillespie, 2014), where good intentions are at the heart of the introduction of the technology, but the cold heart of administration and the economies of business are the driving force behind its use?
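Purdue never published the exact Signals formula, though descriptions of the system mention performance in the course so far, effort measured from VLE activity and prior academic history feeding into the risk calculation. Purely as an illustration, a weighted-score version with invented weights and thresholds might look something like this:

```python
def signal(grade_pct, vle_logins, peer_avg_logins, prior_gpa):
    """Toy risk classifier in the spirit of Course Signals.

    The weights and thresholds here are invented for illustration;
    the real system combined its inputs in ways Purdue did not
    fully publish.
    """
    effort = min(vle_logins / peer_avg_logins, 1.0)  # effort vs. peers
    score = 0.5 * (grade_pct / 100) + 0.3 * effort + 0.2 * (prior_gpa / 4.0)
    if score >= 0.7:
        return "green"   # not at risk
    if score >= 0.5:
        return "orange"  # moderate risk
    return "red"         # high risk: early-warning "signal" sent


print(signal(grade_pct=82, vle_logins=40, peer_avg_logins=35, prior_gpa=3.4))  # green
print(signal(grade_pct=55, vle_logins=10, peer_avg_logins=35, prior_gpa=2.1))  # red
```

The point of the toy version is just how coarse the decision is: everything the system knows about a student is flattened into one number and three colour bands.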
However, the decision to withhold information also comes with its pitfalls in relation to bias. Could, for example, the knowledge that a student behaves in a particular way, or is weak in particular areas and stronger in others, influence how their teacher interacts with them? Would a teacher have different expectations of one new student over another if they had prior knowledge of that student's strengths and weaknesses? This may not be solely about big data and algorithms, as this type of information can be known on a much smaller scale. But take it up a notch and say a student's record shows that the student is prone to anger, outbursts and potentially violence. If we choose not to share that information, so as not to unduly bias interactions with that student, would the person who decided to withhold it then be responsible if the student attacked a teacher or another student, when we had knowledge which could have prevented it?


References

Sclater, N., Peasgood, A. & Mullan, J., 2016. Learning Analytics in Higher Education. JISC. Available at: https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v3.pdf.

Gašević, D., Dawson, S. & Siemens, G., 2014. Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), pp.64–71. Available at: http://link.springer.com/article/10.1007/s11528-014-0822-x [Accessed December 7, 2016].

Gillespie, T., 2014. The relevance of algorithms. In T. Gillespie, P. J. Boczkowski & K. A. Foot (eds.), Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, MA: MIT Press. Available at: https://books.google.com/books?hl=en&lr=&id=zeK2AgAAQBAJ&oi=fnd&pg=PA167&dq=Relevance+Algorithms+Gillespie&ots=GmoJNXY0we&sig=BwtHhKix2ITFbvDg5jrbdZ8zLWA.


Influencing with algorithms: Amazon, YouTube, Google and Facebook, oh my…

Although I confess to knowing of the existence of algorithms, and even to seeing their impact on my net use, I've never really paid attention to it. My bad. So I am going to specifically play with four tools I use often (Amazon, Facebook, Google and YouTube) to see what impact each has on the others, and how joined up my web use is.

I will look at the impact of searching on Google and see if this permeates through to the other tools, and then systematically do the same for each.

Things to consider: I have an enormous digital footprint, so for the purpose of this experiment I will be specifically trying to influence it using items I would not normally search for.
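To keep myself honest, I've roughed out the observation grid I have in mind as a few lines of Python. The tool list is real; the example entry is a hypothetical placeholder for whatever I actually observe.

```python
from itertools import product

# The four tools under test.
tools = ["Amazon", "Facebook", "Google", "YouTube"]

# One slot per (tool I searched on, tool I checked for knock-on ads),
# to be filled in by hand as the experiment runs.
observations = {
    pair: "nothing yet"
    for pair in product(tools, repeat=2)
    if pair[0] != pair[1]
}

# Example of recording a result (placeholder, not a real observation):
observations[("Amazon", "Google")] = "search suggestions changed"

for (searched_on, checked_on), result in sorted(observations.items()):
    print(f"searched on {searched_on:8} -> checked {checked_on:8}: {result}")
```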