Wars, and rumours of wars: bots battling it out for ‘truth’ on Wikipedia. Unpredictable algorithms at work? https://t.co/XSorIvN44v #mscedc

This is a cleverly crafted piece from the Guardian:

Clever, in that it personifies (and illustrates) the bots in a shockingly anthropomorphic fashion. The language of fights, battles, wars, and conflicts lends the piece an epic quality. A techno equivalent of the Greek pantheon, slogging it out within the sphere of us mere mortals. “Humans usually cool down after a few days, but the bots might continue for years.” The article even reports two bots arguing about God.

Clever, too, in that “some conflicts mirrored those found in society”, whereas “others were more intriguing.” A great mix of the familiar and the unfamiliar, another reworking of Bayne’s notion of the ‘uncanny’.

Clever, yet again, in that the piece shows how Wikipedia, often touted as communally constructed, a knowledge base of a ‘community culture’, is also interwoven with ‘algorithmic culture’. In the words of the underlying research article, “Wikipedia is an ecosystem of bots” and automated services: the research reported here focuses on editor bots, just one part of that ecosystem. There are also differences between territories, across the different language editions of Wikipedia. This stuff has its own geography, with its own strata and underlying plate tectonics.

If geography, then also a need for ecological nous, management, or at least respect. In the words of the Guardian report, “The findings show that even simple algorithms that are let loose on the internet can interact in unpredictable ways.” Having wiped out the dodo, and introduced the rabbit to Australia, what on earth are we doing in Wikipedia? We will see. If we can, indeed, see it. “We know very little about the life and evolution of our digital minions.”

The article ends with a great bridge between our ‘Community Cultures’ and ‘Algorithmic Cultures’ blocks:

“Often people are concerned about what AI systems will ultimately do to people,” he [Hod Lipson] said. “But what this and similar work suggests is that what AI systems might do to each other might be far more interesting. And this isn’t just limited to software systems – it’s true also for physically embodied AI. Imagine ACLU drones watching over police drones, and vice versa. It’s going to be interesting.”


from http://twitter.com/Digeded
via IFTTT