Acceptance Creep

Algorithms constitute much of our online reality: they gather and interpret our data, then present back to us what they deem relevant, newsworthy or trending.

Algorithms produce worlds rather than objectively account for them
(Knox, 2015).

We often don’t know the “warm human and institutional choices that lie behind these cold mechanisms” (Gillespie, 2012), and our efforts to find out are frustrated by information providers’ frequent tweaking and by the algorithms’ own shifting nature as they are fed by our interaction with them.

Has the algorithm been conscripted for daemonic hegemonic practice or more innocently put to work for market forces? Is Google, as a major information provider, attempting to take over the world or (merely) seduced by its self-imposed heady mission to catalogue and present the world’s information (a misguided vocation, like Edward Casaubon’s in Middlemarch?), refusing to shoulder responsibility for the political and social consequences of doing so?

Mager (2014) comments,

… the capitalist ideology is inscribed in code and manifests in computational logics

Why do we comply?

An answer might be what I term acceptance creep. A commentator in the Privacy Paradox podcast warns,

In our shopping behaviour we are rehearsing the idea that it is ok to give up our data

We want to do the searching, the shopping, the socialising and the sharing without continually thinking of world issues. We want certainty and trust where there is none, so we accede to the demands of global capitalism, which has come to fill the post-human vacuum, because it suits us, too.

Mager (2014) describes a symbiotic relationship between Google (and other global IT corporations), content providers (website creators) and users:

This dynamic perfectly exemplifies Gramsci’s central moment in winning hegemony over hegemonized groups, the moment “in which one becomes aware that one’s own corporate interests […] become the interests of other subordinate groups” (Gramsci 2012, 181). It is the moment where “prosumers” start playing by the rules of transnational informational capitalism because Google (and other IT companies) serve their own purposes; a supposedly win-win situation is established. Prosumers are “steeped into” the ruling ideology to speak with Althusser: “All the agents of production, exploitation and repression, not to speak of the ‘professionals of ideology’ (Marx), must in one way or another be ‘steeped’ in this ideology in order to perform their tasks ‘conscientiously’ – the tasks of the exploited (the proletarians), of the exploiters (the capitalists), of the exploiters’ auxiliaries (the managers), or of the high priests of the ruling ideology (its ‘functionaries’), etc” (Althusser 1971).

If we are rehearsed (performing conscientiously) in our leisure and social lives, we will accept it, too, in our educational lives.


Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1

Mager, A. (2014). Defining Algorithmic Ideology: Using Ideology Critique to Scrutinize Corporate Search Engines. tripleC: Journal for a Global Sustainable Information Society, 12(1).

On and Off

What a week: I couldn’t seem to fire a single algorithm. Early on I enthusiastically toggled Show me the best tweets first on different devices on my Twitter account, but with no real discernible difference [1] [2] [3] [4]. I hardly ever use my sparse and locked-down Facebook account, but I wandered around in the Settings basement and hauled some levers to ON. Still nothing personalised except a lonely effort by Alison Courses to get me to learn something. I could endorse it, inflicting it on my friends and spawning a million more of the same and similar for me.

It seemed that not only had I somehow gained the right to be forgotten, I had been. What was going on? Normally I only have to think the word hotel for my IP address to be swiped and the price hiked.

Clearly, algorithmically speaking, I should get out more. I started frantically browsing holiday cottages and choosing stuff in online swim shops to provoke a stream of targeted ads. Nothing. How long should it take? Where were the mono fin recommendations? These are algorithms, they shouldn’t show signs of pique. I considered asking a friend to experiment with his Fb timeline settings, but using another person’s data for my own gain seemed, well, dirty. I distracted myself by typing rude words into Google and was blanked, instantly. Naughty me. I did discover that many of us must be contemplating marrying our cousin (is it legal to …).

I headed to YouTube and logged in and out of my Google account like a mad thing, reflexively turning the pop-up Allow Notifications to Block. I was impressed by the extent to which I could analyse my videos – I could get watch time reports, audience retention, playback locations, devices, comments (none) … the list went on. Nothing for Demographics, but the heading was there.

I had wanted to demonstrate how the

arrangements of comments, and thus the spatial qualities of the YouTube page … come together through multiple and contingent relations between the human users … as well as the non-human algorithms which operate beneath the surface
(Knox 2014, p.49)

I wanted to investigate how

the spaces utilised for educational activity cannot be entirely controlled by teachers, students, or the authors of the software
(Knox, 2014, p.50)

but it seemed unlikely now.

So back in subterranean boiler rooms I wrestled rusted faucets to OPEN and tapped the barometers to DELUGE. Sprinting back upstairs, I Googled myself to check I was still alive. Phew, a few of my selves had faint pulses. From a Kafkaesque corridor I dragged down my Google archive to the desktop but found only slim pickings. Seemingly I hadn’t been anywhere on the map for years. I travelled as far as Amazon where, at last, I was greeted with a jaunty Hello C, and I burst into tears of relief at their intimate knowledge of my hoover bag preferences and proffered book recommendations. They were accurate, useful and interesting except for the History book suggestions which must, I dimly remember, be a result of ordering revision guides for my children some hundred years ago.

I never thought I would be so glad to chum up with people who bought this and also bought that. I was back in the human race.

What had I been missing? What friendly, self-affirming world had I separated myself from by turning off tracking and not using Facebook? I’d denied myself even the decision to let Fb decide what I see. Am I doomed to be alone and un-liked with my own dull agency, forced to wander about to achieve serendipity myself instead of having it tastefully sprinkled on top of my carefully-aimed long tail niche cappuccino of recommendations?

“Recommendation algorithms map our preferences against others, suggesting new or forgotten bits of culture for us to encounter” (Gillespie, 2012).
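The “people who bought this also bought that” logic can be sketched as simple item-to-item co-occurrence counting: mapping one person’s basket against everyone else’s. A minimal illustration in Python, assuming nothing about any retailer’s actual system; the baskets and item names are invented:

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase histories (entirely made-up data for illustration).
baskets = [
    {"hoover bags", "mono fin"},
    {"hoover bags", "history book"},
    {"mono fin", "swim costume"},
    {"hoover bags", "mono fin", "swim costume"},
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def also_bought(item, n=2):
    """Return the n items most often bought alongside `item`."""
    scores = {b: c for (a, b), c in co_counts.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:n]

print(also_bought("hoover bags"))  # "mono fin" tops the list
```

Even this toy shows the mechanism Gillespie describes: my preferences matter only insofar as they overlap with other people’s, and the “new or forgotten bits of culture” I’m offered are whatever co-occurs most often.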

Author of my own destiny? Perhaps not, thanks. I wouldn’t know which of the 52,000 Facebook categories were mine (Beyond Boring? Underactive Thyroid? Paranoid Meanie?). But then I wouldn’t know that anyway,

Categorization is a powerful semantic and political intervention
(Gillespie, 2012).

Best kept hidden.

Is it really consume like crazy, like and retweet in overdrive, complete complicated cameos, share lolcats and link this to that – or –  walk the wilderness? I suspect it’s a bit more nuanced.

I created and later updated a Storify to make sense of other people’s experiences. Perhaps I should keep my settings turned on and just frustrate the algos. I could have fun. I should have believed the boiler room posters (proclamations of “the legitimacy of these functioning mechanisms” (Gillespie, 2012), part of the “providers’ careful discursive efforts” (p.16)), which assured me that my experience would be improved.

This articulation of the algorithm is just as crucial to its social life as its material design and its economic obligations
(Gillespie, 2012)

I should have heeded the signs in the (lack of) Control Room which shouted Cookies are Vital and threatened politely to forget which pages I like in Cyrillic. Manovich states that computer games are the “projection of the ontology of the computer onto culture itself” (Manovich, 1999, p.28); shouldn’t I just start to play?

But what was I doing with Storify? Temporarily fixing a contingent assemblage of student and teacher tweets sourced from filtered searches within the affordances of a particular technology? Was this,

the pedagogy of networked learning in which knowledge construction is suggested to be ‘located in the connections and interactions between learners, teachers and resources, and seen as emerging from critical dialogues and enquiries’
(Knox, 2014, p.51, quoting Ryberg et al, 2012) ?

Was it like EDMOOC News, with

a set of dependencies and relations that entwine participants and algorithms in the production of educational space
(Knox, 2014, p.51) ?

Not really, but getting closer.

As someone who regularly gets lost rather than turn on their GPS, changing my preferences isn’t going to be easy. Yet if I really want to map how “Complex algorithms and codes of the web shape and influence educational space” (Knox, 2014, p.52), untangle, as far as I can, the sociomaterial “procedures irreducible to human intention or agency” (p.53) and discern the power structures encoded in the code, I might have to take the plunge. Lucky I’ve got ten new costumes.

I should augment the number of actors in the “recursive loop between the calculations of the algorithm and the ‘calculations’ of people” (Gillespie, 2012), lifesaving idealistic hopes and avoiding my cousins.


Recommended for me


Gillespie, T. (2012). The Relevance of Algorithms. In Media Technologies: Essays on Communication, Materiality, and Society, ed. Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot. Cambridge, MA: MIT Press.

Knox, J. K. (2014). Active algorithms: sociomaterial spaces in the E-learning and Digital Cultures MOOC. Campus Virtuales, 3(1): 42-55.

Manovich, L. (1999). Database as a Symbolic Form. Millennium Film Journal, 34, pp. 24-43.

Favourite tweets!

The video shared by Chenée exemplifies Gillespie’s Patterns of Inclusion,

Patterns of inclusion: the choices behind what makes it into an index in the first place, what is excluded, and how data is made algorithm ready

Gillespie, T. (2012). The Relevance of Algorithms. In Media Technologies: Essays on Communication, Materiality, and Society, ed. Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot. Cambridge, MA: MIT Press.
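Those patterns of inclusion can be imagined in miniature as the pre-indexing step that decides what enters an index at all and normalises whatever survives into “algorithm ready” form. A hedged sketch in Python; the documents, fields and exclusion rules here are hypothetical, not any real search engine’s pipeline:

```python
# Hypothetical documents; the fields and rules are invented for illustration.
documents = [
    {"url": "https://example.com/a", "text": "Algorithms produce worlds.", "noindex": False},
    {"url": "https://example.com/b", "text": "", "noindex": False},               # empty: excluded
    {"url": "https://example.com/c", "text": "Private notes.", "noindex": True},  # opted out: excluded
]

def make_algorithm_ready(docs):
    """Patterns of inclusion in miniature: choose what enters the index,
    silently discard the rest, and normalise the survivors."""
    index = []
    for doc in docs:
        if doc["noindex"] or not doc["text"].strip():
            continue  # excluded before any ranking ever happens
        index.append({
            "url": doc["url"],
            # normalisation: lowercase, strip punctuation, tokenise
            "tokens": doc["text"].lower().rstrip(".").split(),
        })
    return index

print(make_algorithm_ready(documents))  # only example.com/a makes it in
```

The point the sketch makes is Gillespie’s: two of the three documents vanish before any “relevance” is ever computed, and the survivor has already been reshaped to suit the machine.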

Liked on YouTube! This Panda Is Dancing


I liked this video because it shows how technologies work to keep us on their sites for as long as possible, gathering data from our likes and views, chats and shares. This is the data algorithms like to feed on.
A poetic short film by Max Stossel & Sander van Dijk:

In the Attention Economy, technology and media are designed to maximize our screen-time. But what if they were designed to help us live by our values?

What if news & media companies were creating content that enriched our lives instead of catering to our most base instincts for clicks?

As technology gets more and more engaging, and as AI and VR become more and more prevalent in our day-to-day lives, we need to take a look at how we’re structuring our future.

Time Well Spent is a movement to align technology with our humanity.

Favourite tweets! Neologism

Creating new words for this community of practice suggests a requirement for new language to plot and perform our changing digital and educational landscape.