This is the ugly conundrum of the digital age: When you traffic in outrage, you get death.
So when the Sri Lankan government temporarily shut down access to American social-media services like Facebook and Google’s YouTube after the bombings there on Easter morning, my first thought was “good.”
Good, because it could save lives. Good, because the companies that run these platforms seem incapable of controlling the powerful global tools they have built. Good, because the toxic digital waste of misinformation that floods these platforms has overwhelmed what was once so very good about them. And indeed, by Sunday morning so many false reports about the carnage were already circulating online that the Sri Lankan government worried more violence would follow.
It pains me as a journalist, and someone who once believed that a worldwide communications medium would herald more tolerance, to admit this — to say that my first instinct was to turn it all off. But it has become clear to me with every incident that the greatest experiment in human interaction in the history of the world continues to fail in ever more dangerous ways.
In short: Stop the Facebook/YouTube/Twitter world — we want to get off.
Obviously, that is an impossible request and one that does not address the root cause of the problem, which is that humanity can be deeply inhumane. But that tendency has been made worse by tech in ways that were not anticipated by those who built it.
It is no surprise that we are where we are now, with the Sri Lankan government closing off its citizens’ access to social media, fearing misinformation would lead to more violence. A pre-crime move, if you will, and a drastic one, since much critical information in that country flows over these platforms. Facebook and YouTube, and to a lesser extent services like Viber, are how news is distributed and consumed and also how it is abused. Mash up newspapers, cable, radio and the internet into one outlet in the United States and you have the right idea.
A Facebook spokesman stressed to me that “people rely on our services to communicate with their loved ones.” He told me the company is working with Sri Lankan law enforcement and trying to remove content that violates its standards.
Just a month ago in New Zealand, a murderous shooter apparently radicalized by social media broadcast his heinous acts on those same platforms. Let’s be clear, the hateful killer is to blame, but it is hard to deny that his crime was facilitated by tech.
In that case, the New Zealand government did not turn off the tech faucets, but it did point to those companies as a big part of the problem. After the attacks, neither Facebook nor YouTube could easily stop the ever-looping videos of the killings, which proliferated too quickly for their clever algorithms to keep up. One insider at YouTube described the experience to me as a “nightmare version of Whac-A-Mole.”
New Zealand, under the suffer-no-foolish-techies leadership of Jacinda Ardern, will be looking hard at imposing penalties on these companies for not controlling the spread of extremist content. Australia already passed such a law in early April. Here in the U.S., our regulators are much further behind, still debating whether it is a problem or not.
It is a problem, even if the manifestations of how these platforms get warped vary across the world. They are different in ways that make no difference and the same in one crucial way that does. Namely, social media has blown the lid off the controls that once kept society in check. These platforms give voice to everyone, but some of those voices are false or, worse, malevolent, and the companies continue to struggle with how to deal with them.
In the early days of the internet, there was a lot of talk of how this was a good thing, getting rid of those gatekeepers. Well, they are gone now, and that means we need to have a global discussion involving all parties on how to handle the resulting disaster, well beyond adding more moderators or better algorithms.
Shutting social media down in times of crisis isn’t going to work. I raised that idea with a top executive at a big tech company I visited last week, during a discussion of what had happened in New Zealand.
“You can’t shut it off,” the executive said flatly. “It’s too late.”
Kara Swisher, editor at large for the technology news website Recode and producer of the Recode Decode podcast and Code Conference, is a contributing opinion writer for the New York Times.