'Behind the Screen'

Sarah Roberts. Yale University Press; 280 pages; $30.

They are paid to spend their days watching filth: beheadings and chemical-weapons attacks, racist insults and neo-Nazi cartoons. As batches of images leap onto their screens, they must instantly sort them into categories, such as violence or hate speech, and decide whether to mark the content for review or take it down. Some earn $15 an hour; others get a piecework rate of a few cents per item, sorting anywhere from 400 to 2,000 a day. An estimated 150,000 people work in content moderation worldwide, mainly on short-term contracts with little kudos or chance of promotion. Roberts' book is one of just a few about them, based on research done earlier in the decade (although working conditions have largely remained the same).

For years, tech activists have called for more transparency about where platforms draw the boundaries of acceptable content. As Roberts shows, the opacity is ingrained. Social-media companies have never been comfortable with their role as gatekeepers. Like much of Silicon Valley, their culture reflects the libertarian optimism of the internet's pioneers, which Roberts terms "an origin myth of unfettered possibility for democratic free expression." The firms prefer not to talk about moderation at all, and have often been reluctant to tell malefactors precisely what they did wrong. Besides the political risks, they fear such openness would let provocateurs flirt with the edges of prohibitions and furnish endless fodder for challenges to their decisions.

ECONOMIST