I started to explore #algorithms while searching for YouTube videos. In keeping with Christian Sandvig’s ‘Show and Tell’, I began typing ‘residential’ and, before I could finish, Google Instant came through with this prediction:
Apparently Google thinks I want to see Resident Evil 7 – a horror video game (yikes)… Interestingly, as soon as I typed the ‘i’ in ‘residential’, Google knew that I was searching for information on the residential school system in Canada. Given that the last course I took in the MSc programme was digital game-based learning, I’m assuming the predictions leaned towards games because I had researched that subject in the past.
In this post, I included screenshots of all the predictive results of my search for residential schools in Canada (you can also find it here on Tumblr). As you can see, some of the results branch out to include documentaries about Native Americans.
Although I was aware that algorithms were working behind the scenes to tailor personalised recommendations on Facebook, Google, Netflix, Amazon, etc., until this section of EDC I didn’t realise the extent of their reach and influence. As Knox (2015) points out, Amazon’s algorithm has significant influence over the spending habits of consumers, as do Facebook ads. In this brief article, Jerry Kaplan discusses the impact of Amazon’s algorithm and the concept of ‘information asymmetry’, where one party has more or better information than the other, creating an invisible imbalance in power. Although they are human-created, these algorithms seem to have the ability to outsmart us and (perhaps) cause us to shell out more cash than we should!
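To make that mechanism a little more concrete, here is a deliberately toy sketch of an item-to-item recommender built on co-purchase counts. The products and purchase histories are invented purely for illustration; real systems like Amazon’s are far more complex and far less visible to the shopper, which is exactly the asymmetry Kaplan describes.

```python
from collections import defaultdict
from itertools import combinations

# Invented purchase histories -- purely illustrative, not real data.
baskets = [
    {"textbook", "highlighters", "coffee"},
    {"textbook", "coffee", "noise-cancelling headphones"},
    {"coffee", "noise-cancelling headphones"},
    {"textbook", "highlighters"},
]

# Count how often each pair of items is bought together.
co_purchases = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_purchases[(a, b)] += 1

def recommend(item, top_n=2):
    """Suggest the items most often co-purchased with `item`."""
    scores = defaultdict(int)
    for (a, b), count in co_purchases.items():
        if item == a:
            scores[b] += count
        elif item == b:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# A shopper who buys a textbook gets nudged towards whatever other
# textbook-buyers also bought. The seller sees the pattern; the shopper
# does not -- a small-scale version of information asymmetry.
print(recommend("textbook"))  # e.g. ['coffee', 'highlighters']
```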
Why did the following ad appear on my Facebook page?
This ad for Stella Artois appeared on my Facebook page and struck me as somewhat unusual. From my observation, most of the ads that pop up on my page seem to come from my recent Google searches and my personal preferences. Beer, however, and specifically Stella, is rarely (if ever) something I search for. Looking more closely, I noticed that at the bottom it says “Purchase a chalice. Help end the global water crisis.” Is this ad really a call for social action, or is it just trying to get me to buy beer?
Upon further investigation, I visited Stella’s website (Buy a Lady a Drink) and discovered their Chalice promotion and partnership with water.org. A ‘limited edition’ Chalice can be purchased for $13.00. Stella’s disclaimer states that $6.25 provides clean drinking water to one person for five years, and that Stella Artois will donate $6.25 to water.org for every chalice sold in the U.S. in 2017, up to 200,000 chalices.
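Taking Stella’s published figures at face value (the $6.25 claim and the 200,000-chalice cap are theirs, not mine), a quick back-of-the-envelope calculation shows what the campaign could amount to at most:

```python
# Back-of-the-envelope figures for the Chalice promotion,
# using only the numbers quoted on Stella's website.

chalice_price = 13.00            # retail price of one 'limited edition' Chalice (USD)
donation_per_chalice = 6.25      # amount donated to water.org per U.S. sale (USD)
chalice_cap = 200_000            # maximum number of chalices covered in 2017
years_of_water = 5               # Stella's claim: $6.25 = water for 1 person for 5 years

max_donation = donation_per_chalice * chalice_cap
people_served = chalice_cap      # one donation per chalice, one person per donation

print(f"Maximum total donation: ${max_donation:,.0f}")                      # $1,250,000
print(f"People provided water for {years_of_water} years: {people_served:,}")  # 200,000
print(f"Share of retail price donated: {donation_per_chalice / chalice_price:.0%}")  # ~48%
```

So, at most, the 2017 U.S. campaign would channel $1.25 million to water.org – roughly 48% of the retail price of each chalice sold within the cap, and nothing beyond it.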
Upon reflection, I’ve come to realise that perhaps, in some cases, Facebook ads and the algorithms that deliver them can be viewed in a positive light in terms of social impact. I taught a marketing class at Durham College this past fall, and one of our topics of discussion centred around social responsibility in corporate marketing. Stella’s Chalice programme seems to be an example of this kind of marketing, aimed at helping to address the global water crisis. Can algorithms lead individuals and/or organisations to partake in social action for positive change? Again, as Knox (2015) points out, we must remember that algorithms are political and biased, leading us to think about “what kind of individuals and societies are advantaged or excluded through algorithms.”
Who benefits?
There is no doubt that Stella Artois is adding to its bottom line by implementing the Chalice programme, but it seems the company is also trying to create an image of social responsibility. Is Stella’s partnership with water.org creating a positive impact on those who are in desperate need of clean water?
A few years ago, I was involved with an environmental group in my local area: the Enniskillen Environmental Association (EEA). The EEA, a handful of concerned citizens, fought Hydro One to stop a mega transformer station from being built on the Oak Ridges Moraine – a large, water-rich protected area. As this was a David and Goliath type of fight, the EEA didn’t have the power to combat the enormous wealth and spite of Hydro One and eventually lost the battle. That being said, perhaps my past involvement with water conservation had some influence on why the Stella ad appeared on my Facebook page? Do the algorithmic ‘gods’ somehow know I was involved in social justice practices? I could be reading too much into this, but how far can the reach of algorithms extend?
In preparation for my next cohort of marketing students, I’m thinking about how to incorporate the analysis of algorithms into the marketing curriculum, and what the implications of these algorithms are for education. What kind of class activities can I create to discover and/or track organisations that participate in social marketing practices, and how can we unpack the resulting research in terms of its impact on society at large, on the organisation and on education?
Watch my EEA video here:
References
Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1
Kaplan, J. Algorithms: the villains and heroes of the ‘post-truth’ era.