This is the ugly conundrum of the digital age: When you traffic in outrage, you get death.
So when the Sri Lankan government temporarily shut down access to social media services like Facebook and YouTube after the bombings on Easter morning, my first thought was “good”.
Good, because it could save lives. Good, because the companies that run these platforms seem incapable of controlling the powerful global tools they have built. Good, because the toxic digital waste of misinformation that floods these platforms has overwhelmed what was once so very good about them. And indeed, by Sunday morning so many false reports about the carnage were already circulating online that the Sri Lankan government worried more violence would follow.
It pains me as a journalist, and someone who once believed that a worldwide communications medium would herald more tolerance, to admit this — to say that my first instinct was to turn it all off. But it has become clear to me with every incident that the greatest experiment in human interaction in the history of the world continues to fail in ever more dangerous ways.
In short: Stop the Facebook/YouTube/Twitter world — we want to get off.
Obviously, that is an impossible request and one that does not address the root cause of the problem, which is that humanity can be deeply inhumane. But tech has made that tendency worse in ways its builders never anticipated.
I noted this in my very first column almost a year ago, when I called social media giants “digital arms dealers of the modern age” who had, by sloppy design, weaponised pretty much everything that could be weaponised. “They have weaponised civic discourse,” I wrote. “And they have weaponised, most of all, politics. Which is why malevolent actors continue to game the platforms and why there’s still no real solution in sight anytime soon, because they were built to work exactly this way.”
So it is no surprise that we are where we are now, with the Sri Lankan government closing off its citizens’ access to social media, fearing misinformation would lead to more violence. A pre-crime move, if you will, and a drastic one, since much critical information in that country flows over these platforms. Facebook and YouTube, and to a lesser extent services like Viber, are how news is distributed and consumed and also how it is abused. A Facebook spokesman stressed to me that “people rely on our services to communicate with their loved ones”. He told me the company is working with Sri Lankan law enforcement and trying to remove content that violates its standards.
But while social media was once credited with helping foster democracy in places like Sri Lanka, it is now blamed for an increase in religious hatred. That justification lay behind another brief block a year ago, when the Sri Lankan government restricted Facebook after posts appeared to have incited anti-Muslim violence.
Just a month ago in New Zealand, a murderous shooter apparently radicalised by social media broadcast his heinous acts on those same platforms. Let’s be clear: the hateful killer is to blame, but it is hard to deny that his crime was facilitated by technology.
In that case, the New Zealand government did not turn off the tech faucets, but it did point to those companies as a big part of the problem. After the attacks, neither Facebook nor YouTube could easily stop the ever-looping videos of the killings, which proliferated too quickly for their clever algorithms to keep up. It is a problem everywhere, even if the ways these platforms get warped vary across the world. The manifestations are different in ways that make no difference and the same in one crucial way that does: social media has blown the lid off the gatekeeping controls that long kept society in check. These platforms give voice to everyone, but some of those voices are false or, worse, malevolent, and the companies continue to struggle with how to deal with them.
In the early days of the internet, there was a lot of talk about how getting rid of those gatekeepers was a good thing. Well, they are gone now, and that means we need a global discussion, involving all parties, on how to handle the resulting disaster, one that goes well beyond adding more moderators or better algorithms.
I raised that idea with a top executive at a big tech company I visited last week, during a discussion of what had happened in New Zealand.
“You can’t shut it off,” the executive said flatly. “It’s too late.”
— New York Times News Service
Kara Swisher, editor-at-large for the technology news website Recode and producer of the Recode Decode podcast and Code Conference, is a contributing opinion writer.