— Renée Hann (@rennhann) February 19, 2017
In following links ‘down the rabbit hole’, two of my paths of inquiry this week converged: investigating the tension between public and private led me back to community moderation.
My original thinking on privately owned, profit-generating digital social infrastructure was connected to ownership of user-generated data and the manipulation of relationships through the use of algorithms (i.e. a corruption of ‘community’ because it is, I feel, a thinly veiled quest for data and profit). However, since many ‘community’ platform providers also rely on user input for moderation, concerns can also be raised about unpaid digital labour.
[U]sers are, in the words of labor researcher Kylie Jarrett, their “digital housewives” — volunteering their time and adding value to the system while remaining unpaid and invisible, compensated only through affective benefits.
Part of the problem seems to be that we use civic-minded metaphors to describe the spaces our communities inhabit online, despite their not being ‘ours’ (they are owned by companies, and our activities ‘there’ generate profits for those companies):
Astra Taylor, author of The People’s Platform, says, “I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a ‘library’ or Twitter a ‘town square’ — or even calling social media ‘social’ — but real public options are off the table, at least in the United States.” Though users are responsible for providing and policing vast quantities of digital content, she points out, we then “hand the digital commons over to private corporations at our own peril.”
The article also made me aware of a dark side to Web 2.0 that I had never considered: the outsourcing of the moderation of the worst, most violent images to the Global South:
Many US-based companies, however, continue to consign their moderators to the margins, shipping their platforms’ digital waste to “special economic zones” in the Global South. As Roberts recounts in her paper “Digital Refuse,” these toxic images trace the same routes used to export the industrial world’s physical waste — hospital hazardous refuse, dirty adult diapers, and old model computers. Without visible consequences here and largely unseen, companies dump child abuse and pornography, crush porn, animal cruelty, acts of terror, and executions — images so extreme those paid to view them won’t even describe them in words to their loved ones — onto people desperate for work.