Labouring in the digital economy: The people making content (in)visible online

Column
Piia Varis
04/02/2017


In December 2016, two former Microsoft employees, Henry Soto and Greg Blauert, filed a lawsuit against the company. Their job on the company's online safety team involved screening different Microsoft services for 'objectionable' material and reporting and removing it. In this work they encountered distressing content posted by users - for instance (sexual) abuse, exploitation and killings. In the lawsuit, Soto and Blauert claim that the company neither properly prepared them for the content moderation job nor gave them adequate support once it became clear that exposure to thousands of disturbing images was taking its toll. Soto, for instance, began suffering from symptoms of post-traumatic stress disorder such as insomnia, nightmares, anxiety, and auditory hallucinations.

Content moderation is a type of digital labour that we still don't know very much about, and it is difficult to estimate the exact number of people employed as moderators. Adrian Chen, who wrote a popular Wired piece on the issue, suggested in an interview that "[Social media] companies just want to make it seem like their entire product is being created by a couple thousand really rich 20-year-olds in Silicon Valley." What we do know is that those working in commercial content moderation (CCM) - content moderation for companies that rely on the content displayed to generate visits and engagement by users - "are frequently relatively low-status and low-wage" (Roberts 2014, 16). Facebook, for instance, reportedly "lists the work at a remarkably low rate: spotting and weeding out suspected spam, pornography, hate speech and cyber bullying pay about one-quarter of a penny per item."

The Philippines and India are often mentioned as among the main hubs for low-status, low-wage digital labour such as CCM. It has also been known for a while that click farms or like farms in these countries generate the appearance of online popularity. People working in these 'new sweatshops' manually generate web traffic and trends by liking, sharing and following. To ordinary social media users, all of this so-called astroturfing has the appearance of 'organic' activity, while in fact it has been generated with cheap click labour.

The practice of outsourcing digital work to poor countries and communities has already prompted some interesting studies. Researchers at the Oxford Internet Institute (OII) have visualized the geographies of digital work by looking at data from one of the world's largest online work platforms, oDesk.com, which Facebook, for instance, has used. They noted that "Even though most demand [for digitally deliverable IT and service work] comes from the Global North, much of the work actually performed is carried out in low-income countries. India and the Philippines, in particular, perform much of the work on the platform."

While the OII researchers point out that "A significant amount of work remains carried out in wealthy countries such as the United States, Canada, and the United Kingdom", they also ask, based on their data visualization, why there are "such distinct geographic agglomerations of digital work that, in theory, can be done from anywhere; and why wages (even for the same job types) remain much lower than the average in Asia and Africa".

Meanwhile, much of the debate around content removal has focused - quite rightly - on free speech and what exactly is being censored in the name of moral hygiene. The question of content moderation also has to do with the status of social media companies; hence the ongoing discussion as to whether Facebook, for example, is a media company or a technology company, and what each status might mean for its content moderation practices.

While this question, as well as the potential power of social media and other online companies to shape public discourse, is much debated, controversial content removal on Facebook, for instance, is largely reported to be a matter of technical 'glitches' (as in the fatal Philando Castile shooting) or automatic spam filters (as with the Dakota pipeline protests). This is in line with the view that Facebook is a tech company that doesn't make editorial decisions about content. The very human aspect of digital labour in general, and of content moderation in particular, is often invisible - perhaps because, in the words of Adrian Chen, "Tech companies like to envision themselves as neutral platforms, efficiently governed by code and logic."

Whether content moderation will in the future be carried out by artificial intelligence or by 'artificial artificial intelligence' (seemingly automated activity actually carried out by human beings), for the moment at least it has to rely on human labour too. At the same time, the 'tech company' narrative about the content users see or don't see is one of glitches, filters and seemingly mysterious algorithms. This perhaps serves to obscure the fact that actual humans are involved.

Outsourced digital work now also features in the making of presidents. You may have heard the story of the 15-year-old Singaporean who created a slideshow presentation for Donald Trump's 'Students for Trump' campaign, a task she landed through an online marketplace for digital work (she appears to do this kind of online work to pay for her dental braces). There is also something interesting about Trump's Facebook likes: at one point, one in every 27 of his Facebook followers appeared to come from the Philippines.

For ordinary social media users it remains difficult to judge whether these are genuine Filipino fans or sweatshop likes. Ordinary users generally know little about the digital labour that goes into making sure that certain things remain invisible, or about the work done clicking away at like farms to make other things more visible.

At the same time, what you see or don't see online is often the result of such invisible digital labour, which is largely about keeping up appearances: either the appearance that something is not there - whether disturbing and distressing content (in which case we can also be thankful that moderators look at violence and abuse so that we don't have to) or whatever is deemed too 'controversial' for the company in question (as with Facebook and female nipples) - or the appearance that something is more popular than it actually is. Much of the time this work is challenging and consuming, as it was for Henry Soto and Greg Blauert, and often done for measly pay in low-income countries. Prominent cases such as theirs should help bring to light the kind of labour that goes into manufacturing people's online experiences. That is, unless we're content with how things appear to be.


References 

Roberts, Sarah T. 2014. Behind the screen: The hidden digital labor of commercial content moderation. PhD dissertation, University of Illinois at Urbana-Champaign.