This post is the second in a JustPeace Labs blog series on peacebuilding, extremism, and social media, looking at different approaches for mitigating social media’s impact on radicalization and violent extremism, particularly in the wake of the violent insurgency at the US Capitol on January 6. Check out the first post in the series here.
Effective violence prevention, or just more “cancel culture”?
The suspension of Donald Trump’s social media accounts for contributing to the violent insurgency at the US Capitol on January 6 was one of the week’s top global headlines, even as news emerged that the violence had resulted in the loss of five lives. And with the Senate impeachment trial failing to convict and investigations by federal prosecutors stalled, those account suspensions stand as one of the only mechanisms of accountability Trump faces at present for inciting that violence.
The account suspensions sparked heated debate. Many have argued that the ability to unilaterally suspend or cancel individual users’ accounts imbues social media companies with too much authority and sets a dangerous precedent by anointing the heads of these companies as “arbiters of truth.” Others criticize the suspensions as too little, too late, and ask what distinguishes Trump’s social media posts on January 6 from those made repeatedly via the platforms over the four years prior. Experts following Trump supporters on social media knew that violence on January 6 was likely–so why wasn’t Trump’s account suspended before then?
This controversy is not new. It began more than a decade ago with the original rise of social media platforms that saw themselves as “neutral” amplifiers of free speech, augmenting democratic movements worldwide–a sort of post-censorship utopia. But those platforms are now widely used to amplify extremism and incite violence. Private corporations now find themselves having to balance fundamental human rights, such as freedom of expression, against the need to prevent violence and conflict. Are account suspensions a helpful tool in doing so?
Kate Klonick outlines the complexities of this situation in a recent New Yorker analysis:
“Facebook now has some three billion users—more than a third of humanity—many of whom get their news from the site. In 2016, Russian agents used the platform in an attempt to sway the U.S. Presidential election. Three years later, a white supremacist in New Zealand live-streamed a mass shooting. Millions of people joined groups and followed pages related to QAnon, a conspiracy theory holding that the world is controlled by a cabal of Satan-worshipping, pedophilic Democrats. The First Amendment has made it difficult for the U.S. government to stop toxic ideas from spreading online. … As a result, Facebook has been left to make difficult decisions about speech largely on its own.”
Although undoubtedly difficult for tech companies to navigate, this is largely a problem of their own making. Mark Zuckerberg has decried the absurdity of Facebook and the other tech giants being made the internet’s “arbiters of truth.” But for over a decade, Twitter, Facebook, and other Silicon Valley giants have vigorously lobbied the US government to avoid regulation. Most recently, Amazon, Apple, Facebook, Google, and Microsoft together spent more than 60 million dollars lobbying US lawmakers in 2020, according to the companies’ disclosures.
The fact that decisions balancing free speech and incitement of violence are left up to private companies has also prompted sharp reactions from political leaders worldwide. A spokesperson for German Chancellor Angela Merkel said, “The fundamental right [of freedom of expression] can be interfered with, but along the lines of the law and within the framework defined by the lawmakers. Not according to the decision of the management of social media platforms.” In the US, by comparison, constitutional free speech protections bind the government rather than private companies; there is no legal right to say whatever you want on social media.
According to the ACLU, a more significant concern is the absence of due process and the imbalance of power inherent in these account suspension decisions. The lack of due process is a serious issue–and one that we’ll address in our next post. With regard to account suspensions, though, the imbalance of power they present is particularly troubling, especially in high-risk settings. Grievances over power imbalances–whether perceived or real–are common conflict drivers and can push a fragile situation over the edge into full-blown conflict.
During the four years that Trump was permitted to espouse dangerous speech via his Twitter and Facebook accounts under the “newsworthiness” exception, racial justice activists in the US frequently had posts removed because they shared aggressive racist threats made against them. A similar dynamic emerged during a campaign of violence against the Rohingya in Myanmar, when Facebook repeatedly blocked activists’ accounts after they posted evidence and reports of the violence. Facebook didn’t take concerted action to block Burmese government and military accounts until the UN found that the platform had played a significant role in the genocide there. The rules about what is “newsworthy” are criticized for being opaque–and indeed, it is unclear why politicians get a pass while human rights activists reporting on issues in the public interest do not.
Beyond due process and power imbalances, there is also the question of whether account suspensions are effective at preventing violence. In the US and elsewhere, social media companies waited until after serious violence had occurred before suspending accounts. However controversial the decisions may be, a suspension imposed after the fact does little to prevent the violence that has already been incited.
As Susan Benesch of the Dangerous Speech Project puts it, “Dangerous speech is potent precisely because it is gradual, so it’s easy to get inured to the dribbling, like a slow gas leak.”
Over five years, Trump laid the groundwork for violence, his dangerous speech spreading like noxious gas across the US. On January 6, he finally lit the match.