The Relevance of Algorithms – NOTES

The Relevance of Algorithms

Tarleton Gillespie

http://culturedigitally.org/2012/11/the-relevance-of-algorithms/

 

 

Algorithms are key actors in deciding what information is relevant to us. They have the power to assign meaningfulness. They are not always software.

If human culture is now expressed mainly through computational tools, then we are applying a particular epistemology to all culture. Algorithms are contesting the position of previous cultural arbiters: experts, the scientific method, common sense, the word of God.

6 potential political effects of algorithms:

  1. Patterns of inclusion – What counts?
  2. Cycles of anticipation – Can the algo predict what you need? Should it?
  3. Evaluation of relevance – How is this measured? Are the metrics kept secret?
  4. The promise of objectivity – The algo is portrayed as impartial. How?
  5. Entanglement with practice – Users change their behaviour to fit the algo.
  6. Production of calculated publics – The algo presents a public, thus shaping their sense of self. Who benefits?

 

Remember: Algos are not fixed entities. They are being tinkered with and negotiated at all times. They are designed by humans and represent certain institutional values.

PATTERNS OF INCLUSION

To study an algo you need to ask what database it's drawing on. Don't confuse the algo and the database as one thing.

3 stages – 1.) Collection – who decides what data is acceptable to collect? Is this universal?

2.) Preparation for the algorithm – categorising asserts certain ontologies, e.g. which books on Amazon count as “adult” and are thus excluded from the front-page sales rank (see the sketch after stage 3).

3.) Exclusion and demotion – who decides what should be presented? This practice is hidden in algos.
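
To make stages 2 and 3 concrete for myself, a minimal Python sketch of how a category flag attached during preparation can silently drop an item from a ranking. The items, the “adult” flag and the rule are my own illustration, not Amazon's actual system and not from the essay.

```python
# Toy illustration: a label assigned at the "preparation" stage
# silently removes an item at the "exclusion/demotion" stage.
books = [
    {"title": "Novel A", "sales": 9500, "category": "fiction"},
    {"title": "Memoir B", "sales": 8200, "category": "adult"},     # flagged during categorisation
    {"title": "Guide C", "sales": 7700, "category": "reference"},
]

def front_page_rank(items, excluded_categories=frozenset({"adult"})):
    """Rank by sales, but drop anything carrying an excluded category flag."""
    visible = [b for b in items if b["category"] not in excluded_categories]
    return sorted(visible, key=lambda b: b["sales"], reverse=True)

for rank, book in enumerate(front_page_rank(books), start=1):
    print(rank, book["title"])   # Memoir B never appears, and nothing signals its absence
```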

 

CYCLES OF ANTICIPATION

Search algorithms change in response to user input.

To give “better” results that are more relevant to you, search algorithms need info about you. How does the algo determine what it is you want? What data does it draw upon? How can algo providers get you to offer up more info? How accurate can the user's “data shadow” be? How much info is enough to make predictions?

 

EVALUATION OF RELEVANCE

Relevance is a fluid concept. There is no set metric. What metrics are used to approximate it? Since no unbiased measure is available, there will always be elements of bias.

  1. Criteria used – Often kept private by the companies that make them; they don't want users gaming the system.
  2. Commercial aims – Are there financial incentives for the algo producers to promote certain info? Can social data and commercial data really be separated so easily? E.g. what if you follow a product-endorsing social media influencer? Is their info an ad or social data?
  3. Epistemological premises – Assumptions are made: is what is popular more relevant? Is certain language more relevant? Are commercial or public sites more relevant? The algo is a cognitive artefact in which the expertise and judgement of those who made it are hidden and automated (see the toy scoring sketch after this list).
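
To see for myself how such premises get baked in, a toy Python sketch of a relevance score. The signals and weights (term matches, link counts, a partner boost) are entirely invented for illustration; this is not how any real search engine actually works.

```python
# Toy relevance score: the choice and weighting of signals encodes hidden
# premises, e.g. "widely linked = relevant" and "commercial partners get a boost".
def relevance_score(doc, query_terms):
    term_matches = sum(term in doc["text"].lower() for term in query_terms)
    popularity = doc["link_count"] * 0.01                    # premise: popular is more relevant
    commercial_boost = 1.2 if doc["is_partner"] else 1.0     # premise: commercial ties can tip the ranking
    return (term_matches + popularity) * commercial_boost

docs = [
    {"text": "Community health advice", "link_count": 40, "is_partner": False},
    {"text": "Health advice from a sponsored clinic", "link_count": 400, "is_partner": True},
]
ranked = sorted(docs, key=lambda d: relevance_score(d, ["health", "advice"]), reverse=True)
print([d["text"] for d in ranked])   # the sponsored page wins; the premises stay invisible
```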

 

THE PROMISE OF ALGORITHMIC OBJECTIVITY

This gives the algo credibility. How the creators of the algo portray this is important. Methods include repeated claims, obscuring how the algo works, and the language used (“best results”).

Parallels with journalism – public interest and objective, “fair” reporting. These are more ideals with attached rituals than something that is actually achieved.

 

ENTANGLEMENT WITH PRACTICE

This is particularly true for businesses that deliver information as a commodity (Google). If people don't use it in practice, the algo has failed. Users can shape their practice to better fit algorithms, e.g. Twitter hashtags.

Algos give power to those who know their workings.

As new algos become part of public life, users may interact with them in unexpected ways. But because how they work is kept secret, it is harder for them to be truly public goods.

Algos shape how people think about things – knowledge logics. Facebook encourages us to “participate and share” more, inculcating a certain idea of privacy. Google – popular is more relevant.

PRODUCTION OF CALCULATED PUBLICS

Algos shape how digital publics/communities form, change and dissipate. Creation of echo chambers and filter bubbles.

Algos can create groups, e.g. the Facebook privacy setting “friends of friends” – only the algo can calculate the membership (a toy sketch of that calculation is below).
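
To make “only the algo can calculate the membership” concrete for myself, a minimal Python sketch of computing a friends-of-friends audience from a small friendship graph. The graph and names are made up, and Facebook's real system is of course far more complex and not visible to me.

```python
# Toy friendship graph: no single user "sees" the friends-of-friends set;
# it exists only as the output of a calculation over the whole graph.
friends = {
    "ana": {"ben", "cho"},
    "ben": {"ana", "dee"},
    "cho": {"ana", "eli"},
    "dee": {"ben"},
    "eli": {"cho"},
}

def friends_of_friends(user):
    """Everyone reachable within two hops, excluding the user themself."""
    audience = set(friends[user])
    for friend in friends[user]:
        audience |= friends.get(friend, set())
    audience.discard(user)
    return audience

print(sorted(friends_of_friends("ana")))   # ['ben', 'cho', 'dee', 'eli'] – a calculated public
```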

Social science based on big data often draws on public data that has already been filtered through social media algos. Theories and knowledge based on this data tell us something about a public that has already been calculated.

In creating these notes I have, in some ways, had to grapple with the same questions about informational relevance that are automated in algorithms. I have condensed a 32-page essay into a few pages of notes: that is inclusion and exclusion. To do this I am trying to anticipate what I will need to know in the future, particularly relating to what texts I might write on algorithms for the DigCul course. What are the key words that can tell me what is in the essay? If I can get this right, I can look at my notes and then decide whether I need to reread part of the essay. To a certain extent my note-taking is entangled with practice: the content may not end up being relevant, but the discipline of writing a précis and making judgements about what's relevant is vital for studying. I'm still trying to improve these skills.