Abstract

Extreme, anti-establishment actors are being characterized increasingly as ‘dangerous individuals’ by the social media platforms that once aided in making them into ‘Internet celebrities’. These individuals (and sometimes groups) are being ‘deplatformed’ by the leading social media companies such as Facebook, Instagram, Twitter and YouTube for such offences as ‘organised hate’. Deplatforming has prompted debate about ‘liberal big tech’ silencing free speech and taking on the role of editors, but also about the questions of whether it is effective and for whom. The research reported here follows certain of these Internet celebrities to Telegram as well as to a larger alternative social media ecology. It enquires empirically into some of the arguments made concerning whether deplatforming ‘works’ and how the deplatformed use Telegram. It discusses the effects of deplatforming for extreme Internet celebrities, alternative and mainstream social media platforms and the Internet at large. It also touches upon how social media companies’ deplatforming is affecting critical social media research, both into the substance of extreme speech and into its audiences on mainstream and alternative platforms.

Keywords

Deplatforming, Social Media, Digital Methods, Telegram, Extreme Speech


Media Studies, University of Amsterdam

Corresponding author(s):

Richard Rogers, Media Studies, University of Amsterdam, Turfdraagsterpad 9, 1012 XT Amsterdam, the Netherlands. Email: R.A.Rogers@uva.nl

Introduction: Deplatforming on social media

Deplatforming, or the removal of one’s account on social media for breaking platform rules, has recently been on the rise. It is gaining attention as an antidote to the so-called toxicity of online communities and the mainstreaming of extreme speech, or “vitriolic exchange on Internet-enabled media” that “push the boundaries of acceptable norms of public culture” (Pohjonen and Udupa, 2017). It is also stirring a discussion about the ‘liberal bias’ of US tech giants implementing the bans (Bilton, 2019; Mulhall, 2019). In the past few years, Facebook, Instagram, YouTube, Twitter and other platforms have all suspended and removed a variety of individuals and groups, comprising, according to one accounting, ‘white nationalists’, ‘anti-semites’, ‘alt-right’ adherents, ‘neo-nazis’, ‘hate groups’ and others (Kraus, 2018). Many of those who have been deplatformed are on the far right of the ideological spectrum, and certain of them could be described as extreme Internet celebrities, such as Milo Yiannopoulos and Alex Jones, whose removals have had a significant impact on their visibility, the maintenance of their fan bases and the flow of their income streams. Yiannopoulos has claimed to have been bankrupted by deplatforming, which has included the cancellation of a book deal and of college campus appearances (Beauchamp, 2018; Maurice, 2019). Jones has seen the view counts and seemingly the impact of his posts and videos decline (Wong, 2018).

Deplatformings have been widely reported in the tech news and beyond (Martineau, 2019). When Yiannopoulos, Jones, Laura Loomer and Paul Joseph Watson were removed from Facebook and Instagram in 2019 for being ‘dangerous individuals’ engaged or involved in ‘organised hate’ and/or ‘organized violence’ (Facebook, 2019), the bans drew widespread reaction, including the story of how Facebook announced them some hours prior to implementation, allowing the deplatformed individuals to post notices on their pages redirecting their audiences to other platforms (Martineau, 2019). Laura Loomer, for one, announced her Telegram channel; Alex Jones pointed to his websites. The migration from mainstream to alternative social media platforms was underway.

At the same time, protests from these individuals and their followers have been staged on the platforms that have removed them. Loomer, the ‘white nationalist’ banned from Twitter for a ‘racist attack’ on a Muslim US congresswoman, handcuffed herself to the front door of the corporation’s office in New York City, livestreaming her plight and her views on the suppression of ‘conservative’ viewpoints on a supporter’s Periscope account (itself a Twitter service). Other banned users have switched to platforms friendly to their politics, such as Gab, a Twitter alternative where posts are upvoted and downvoted as on Reddit. It has become known as a ‘haven for white supremacists’ and for its defence of free speech (Ohlheiser and Shapira, 2018; Zannettou et al., 2018). It also positions itself as distinct from the ‘left-leaning Big Social monopoly’ (Coaston, 2018).

When deplatformed social media celebrities migrate to alternative platforms, these sites are given a boost through media attention and increases in user counts. Milo Yiannopoulos initially turned to Gab after his account was removed from Twitter (Benson, 2016), and around the same time Alex Jones joined it ‘with great fanfare’ (Ohlheiser, 2016). Indeed, when Twitter conducted a so-called ‘purge’ of alt-right accounts in 2016, Gab gained tens of thousands of users in a short time. It is continually described as a favoured platform of expression for extremism, including for the Pittsburgh synagogue shooter in 2018, who announced his intended acts there (Nguyen, 2018). Gab drew over a million hits after it became known that a mass shooter had posted his manifesto there (Coaston, 2018).

But mainstream social media drives more traffic to extreme content than alternative social media platforms or other websites do, at least in the case of Alex Jones’s bans from Facebook and YouTube, mentioned earlier. His InfoWars posts, now only available on his websites (and a sprinkling of alternative social media platforms, as we come to), saw traffic decline by one-half (Nicas, 2018). When deplatforming leads to such declines in attention, questions arise about its effectiveness. Is it indeed a viable means to detoxify mainstream social media and the Internet more broadly, and/or does it prompt the individuals to migrate to other platforms with more welcoming and ‘oxygen-giving’ extreme publics?

Effectiveness of deplatforming

There has been some scholarly attention paid to the effectiveness of shutting down particularly offensive online communities, such as the subreddits r/fatpeoplehate and r/coontown, banned by Reddit in 2015 for violating its harassment policies. It was found that the shutdowns worked, in that a proportion of offending users appeared to leave the platform (for Voat, an alternative to Reddit), and the subreddits that inherited the users migrating from those spaces did not see a significant increase in extreme speech (Chandrasekharan et al., 2017). Indeed, the closing of those communities was beneficial for Reddit, but less research has been performed on the effectiveness of such bans for the health of social media or the Internet at large. The Reddit study’s authors reported that not only did Reddit make these users ‘someone else’s problem’, but it also perhaps pushed them to ‘darker corners of the Internet’ (Chandrasekharan et al., 2017).

The debate concerning the effectiveness of deplatforming has arguments lined up on both sides. For those arguing that it does not work, deplatforming is said to draw attention to suppressed materials (the Streisand effect), harden the conviction of followers, and put social media companies in the position of arbiters of speech. For those arguing that deplatforming is effective, it is said to detox both subspaces (such as subreddits) and platforms more generally, produce a decline in audience and drive extreme voices to spaces with less oxygen-giving capacity, thereby containing their impact. The Reddit study indeed found that both the subreddits and the platform more generally saw a decline in the type of harassment found on r/fatpeoplehate and r/coontown, but less is known about the alternative platforms to which extreme users may turn.

Telegram as ‘dark corner of the Internet’

Apart from Gab and perhaps Voat (to which deplatformed Pizzagate, incel and QAnon subreddit users are said to have migrated), Telegram is another of those so-called darker corners of the Internet (Wikipedia Contributors, 2019). It is an instant messaging app, founded in 2013 by the same Internet entrepreneurs who launched VKontakte, the social media platform popular in Russia. Telegram has a reputation, whether or not well founded, for highly secure messaging, having notoriously been listed by ISIS as ‘safe’ and having championed privacy from its founding, which coincided with Edward Snowden’s revelations of US state spying (Weimann, 2016). Indeed, the founders started Telegram so that communications could not be monitored by governments, including the Russian authorities, who pursued the founder on charges of tax avoidance until he fled the country (Cook, 2018). The Russian state later accused Telegram of enabling terrorists because it would not turn over users’ encrypted messages, leading to a ban of the application in Russia. The founders and their programming team themselves exemplify privacy-enablers, for they require secure communication and have moved from location to location to elude what the founder calls ‘unnecessary influence’ (Thornhill, 2015). As I come to, encrypted communication is one affordance that makes Telegram attractive to certain user groups.

How does Telegram appeal to its users, including those who have been deplatformed for violating platform rules? Telegram has not only the reputation but also the affordances that would be attractive to those seeking something similar to ‘social privacy’, or the capacity to retain control over what is known about oneself while still participating (and becoming popular) on social media (Raynes-Goldie, 2010). On platforms such as Facebook, such a user is public-facing at the outset and subsequently makes deft use of aliases, privacy settings as well as account and timeline grooming. That is how social privacy is performed. Telegram, however, is something of a hybrid system; in contradistinction to Facebook, it leads with protected messaging and follows with the social. That is, it is in the first place a messaging app, where one has an account and can message others and join groups, private ones by default but also public ones. It also has some elements of social media, whereby one may create a channel (public by default) and have others subscribe to it.
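To make the channel affordance concrete for researchers, what follows is a minimal sketch of how posts in a public Telegram channel (together with their dates and view counts) could be read programmatically, here using the open-source Telethon library. The channel name, the API credentials and the choice of fields collected are illustrative assumptions, not the study’s actual data-collection procedure.

```python
# Minimal sketch (not the study's pipeline): reading posts from a public
# Telegram channel with Telethon. Requires api_id/api_hash credentials
# registered at my.telegram.org; the channel name below is hypothetical.
from telethon import TelegramClient

api_id = 12345                  # placeholder API ID
api_hash = "0123456789abcdef"   # placeholder API hash

client = TelegramClient("research_session", api_id, api_hash)


async def fetch_channel_posts(channel: str, limit: int = 100):
    """Return (date, views, text) tuples for recent posts in a public channel."""
    posts = []
    async with client:
        # iter_messages yields the newest posts first; the 'views' attribute
        # is populated for channel posts (it is None for ordinary chat messages)
        async for message in client.iter_messages(channel, limit=limit):
            posts.append((message.date, message.views, message.text))
    return posts


if __name__ == "__main__":
    # Example usage with a hypothetical public channel name
    results = client.loop.run_until_complete(
        fetch_channel_posts("example_public_channel")
    )
    for date, views, text in results[:5]:
        print(date, views, (text or "")[:80])
```

Because channels are public by default, such collection requires no membership in the audience being studied; private groups, by contrast, remain outside this kind of access, which bears on the research limitations discussed later.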