Parler is not the problem — Gateway platforms are

Tal-Or Cohen
9 min read · Jan 26, 2021

In the aftermath of the deplatforming of the fringe social media app Parler, alongside the prior banning of Donald Trump from multiple social media and internet platforms, an important debate has been sparked regarding the private sector's ability to limit free speech and consign fringe networks to internet purgatory. Most recently, a judge rejected the antitrust lawsuit Parler brought against Amazon, which alleged collusion to render Parler incapable of functioning online.

However, these swift and extreme steps by big tech look like a bad case of scapegoating, and for good reason. Deplatforming Parler and other fringe networks full of radicalized activists, people with unpopular opinions, and conspiracy followers fails to address the real problem: gateway platforms.

What is a gateway platform?

Since I am trying to coin an original term here, allow me to explain. A gateway platform is a platform where misinformation, disinformation, extremist thought, and conspiracy theories breed like bacteria in a petri dish and eventually 'jump' to the mainstream social media platforms. A gateway platform serves a dual purpose in the perpetuation of online hate:

a) it spreads misinformation, disinformation, and conspiracy theories to would-be believers within fringe social networks such as Gab and Parler, and even messenger apps like Telegram; and

b) it lends a veneer of legitimacy to those who might spread the misinformation further, whether purposeful spreaders or ordinary internet and mainstream social media users not equipped with investigative internet research and open-source analysis skills. (Since such skills are not part of any public school curriculum today, I think we can agree this second group includes most users.)

There is no such thing as radicalization without misinformation. Today's gateway platforms provide individuals susceptible to radicalization with the pools of phony 'research' that stoke their outrage, fear, and hate, and move some to organize violently. The people who coordinated and took part in the insurrection at the US Capitol were exposed to a steady stream of misinformation, consolidated into major thematic conspiracy theories, before they were stirred to action. An anonymous data scientist writing on Medium recently published a three-part analysis of the data scraped from Parler, detailing not only the most popular beliefs and conspiracy theories among its users but also the gateway platforms hosting them: namely, Google-owned YouTube and BitChute. A closer examination of QAnon, one of the most widely cited conspiracy theories on Parler, suggests that its evolution from a fringe belief to a movement with an estimated millions of followers happened because gateway platforms amplified exposure to misinformation and lies.

Wikimedia Commons, Demonstrator outside of the Capitol Building in Washington D.C. on January 6, 2021

The Origin of the QAnon Conspiracy Theory

The QAnon theory originated with a 4chan message board user, "Q Clearance Patriot," also referred to as "Q," a presumed government official holding a US Department of Energy Q clearance, a security clearance that allows officials to access classified information on nuclear weapons.

On October 28, 2017, in a thread called "Calm Before the Storm" on 4chan's Politically Incorrect board, the anonymous user "Q" stated, "Hillary Clinton will be arrested between 7:45 AM - 8:30 AM EST on Monday - the morning on Oct 30, 2017." Subsequent posts by the user claimed that other countries had been warned to extradite Clinton if she attempted to leave the US, and that the National Guard would be prepared for large riots.

Screenshot of the original Q post, tweeted by Ben Collins, 10/29/2018 (original post since deleted from 4chan)

This initial 4chan post was followed by other posts from the same anonymous user, called "breadcrumbs" or "Q drops," hints directed at Q's followers. Q expanded on the initial post, predicting indictments of Clinton campaign staff by Trump and alluding to their involvement in a satanic cult. Followers still track posts by Q, interpreting them as clues about a clandestine operation led by then-President Trump against a large pedophile ring. Followers argue that Q's clues portend "The Storm," an imminent mass arrest and execution of the alleged ringleaders.

QAnon lore continued to grow from the first post in October 2017 into 2020, with followers combing for clues beyond the initial 4chan posts.

The jump: How the QAnon conspiracy theory spread from fringe to mainstream social media

From 2016 to 2017, several anonymous users posting on Politically Incorrect claimed a connection to US government intelligence. In November 2017, two moderators of Politically Incorrect reached out to YouTube conspiracy theorist Tracy Diaz, showing her the anonymous posts by Q and the burgeoning QAnon conspiracy emerging on 4chan. On November 3, 2017, only six days after the first post on 4chan, Diaz posted her first QAnon-focused video to her YouTube channel, Tracy Beanz. This since-removed video netted 250,000 views between November 2017 and August 2018. Diaz also recommended that QAnon chats be moved from 4chan to the more user-friendly Reddit in order to reach a more mainstream audience. From there, several QAnon-centric subreddits (online Reddit communities) emerged. The first, CBTS_Stream, moderated by Diaz under the username tracybeanz, had garnered 14,600 subscribers by February 2018.

Let's pause for a moment to take this in: a conspiracy theorist on YouTube, who monetizes her online presence like so many other YouTubers, essentially provided consulting services, recommending that the misinformation be nurtured and developed on Reddit in order to build momentum and believers for the movement.

Social Media Platform Responses to QAnon

In March 2018, Reddit announced it was shutting down CBTS_Stream, becoming the first internet platform to ban QAnon content. Other major social media platforms followed suit; however, these steps were often separated by gaps of months or years, in contrast to the near-simultaneous response to Parler by AWS, Apple, and Google.

YouTube’s first steps toward curbing some QAnon activity came in January 2019, when it stopped recommending conspiracy videos via the “Up Next” sidebar. In October 2020, YouTube announced further steps to limit QAnon content. It was joined by Facebook, which also announced QAnon bans that month. Similarly, Twitter announced a QAnon ban in July 2020. The moves came as President Trump and Joe Biden launched their campaigns ahead of the 2020 election. Several commentators and journalists attributed the timing of the ban to political pressures and concerns among platform management that QAnon content could influence election results. In September 2020, Twitter extended its ban to include candidates and elected officials who use the platform to promote the conspiracy theory.

Did it work?

As part of my job consulting and managing projects in open-source research and business intelligence, I have developed a particular interest in online hate and radicalization. Using an advanced web tool that aggregates hundreds of thousands of global sources, including articles and YouTube videos, I ran an ongoing search for terms and hashtags associated with QAnon over the past 12 months and found that, compared with other media sources carrying QAnon content, YouTube is one of the most dominant platforms.
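For readers curious what this kind of analysis involves, the core aggregation step can be sketched in miniature. Everything below is hypothetical: the records, the field names, and the term list are illustrative stand-ins, not the actual tool's output or my full query, and a real media-monitoring platform returns far richer data.

```python
from collections import defaultdict

# Hypothetical content items, shaped roughly like what a media-monitoring
# tool might return: a source platform, the item's text, and an engagement count.
ITEMS = [
    {"platform": "youtube.com",  "text": "WWG1WGA the storm is coming",  "engagement": 5200},
    {"platform": "reddit.com",   "text": "calm before the storm thread", "engagement": 800},
    {"platform": "youtube.com",  "text": "qanon drops explained",        "engagement": 3100},
    {"platform": "bitchute.com", "text": "latest q drop analysis",       "engagement": 450},
    {"platform": "reddit.com",   "text": "unrelated cooking post",       "engagement": 90},
]

# Illustrative subset of monitored terms and hashtags.
QANON_TERMS = ["qanon", "wwg1wga", "q drop", "the storm"]


def matches(text, terms):
    """True if any monitored term appears in the item's text (case-insensitive)."""
    lowered = text.lower()
    return any(term in lowered for term in terms)


def engagement_by_platform(items, terms):
    """Total engagement of matching items, grouped by source platform."""
    totals = defaultdict(int)
    for item in items:
        if matches(item["text"], terms):
            totals[item["platform"]] += item["engagement"]
    return dict(totals)


# Rank platforms by total engagement with matching content, highest first.
ranked = sorted(engagement_by_platform(ITEMS, QANON_TERMS).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked)
```

On this toy data, YouTube items dominate the ranking; the real analysis works the same way, just over hundreds of thousands of sources and with more careful term matching.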

There was more engagement with strong QAnon terms on YouTube than the leading QAnon website.

Additionally, the search results demonstrated that the five most popular content items, with the highest level of engagement on social media, containing QAnon terms, all originated from YouTube.


YouTube claims that as a result of its January 2019 anti-QAnon measures, "the number of views that come from non-subscribed recommendations to prominent Q-related channels dropped by over 80% since January 2019." However, engagement analysis covering November 2019 to November 2020 contradicts that claim, given the high volume of QAnon content and channels still being shared months after January 2019. Moreover, many videos that originated on YouTube continued to be shared on other platforms. Content-sharing analysis also revealed that although Reddit announced the removal of QAnon pages in 2018, a considerable amount of QAnon content continued to circulate on Reddit through 2019 and 2020.

Platform-specific analysis of Facebook and Twitter similarly demonstrates that QAnon content remained on those platforms weeks after the ban announcements. Though Facebook's ban addressed QAnon hashtags, pages, and likes, QAnon pictures, slogans, and content shared on the personal accounts of conspiracy adherents remain on the platform. Notably, Facebook has also failed to ban QAnon posts from celebrity influencer Shemane Nugent, wife of far-right rock star Ted Nugent. Shemane Nugent has 212,800 Facebook followers and 159,500 likes, and her QAnon posts average one to two thousand reactions and hundreds of shares.

Facebook post by user Rick Lap, featuring a black-and-white QAnon image of American soldiers captioned "calm before the storm"
Rick Lap, Facebook, published 11/06/2020
Facebook post by Kevin Sorensen: a white rabbit pointing to a pocket watch and the letter Q
Kevin Sorensen, Facebook, published 11/09/2020

Months after Twitter's July 2020 announcement that it was restricting QAnon activity, open-source analysis revealed that despite QAnon-specific hashtags being banned from the platform, several QAnon-devoted accounts remain active on Twitter, many with tens of thousands of followers.

Twitter account "QAnon Report," featuring the slogan WWG1WGA ("Where we go one, we go all") and a masked figure with an American flag
QAnon Report, Twitter, accessed 11/2020

The truth (or lack thereof) hurts.

I agree with a point some decentralization advocates have recently made: the challenge of navigating the public squares of the internet today is not free speech; it is discerning the truth, or the absence thereof. The Capitol Hill rioters of January 6, 2021 were repeatedly exposed to a host of conspiracy theories, including QAnon, anti-vaccination, and 5G. Gateway platforms like Reddit, YouTube, BitChute, and Quora continue to provide a toxic space for 'independent research,' full of unverified, extreme content and false information about the state of the world, governments, health, our neighbors, and the rule of law. While no exposure to this kind of material excuses or justifies the twisted sense of agency that would lead a person to storm the legislature, commit treason, or attack a police officer, much less a fellow citizen, radicalization does not happen without misinformation.

Cleaning up our social media spaces and preventing them from becoming hothouses of radicalization depends, in the long term, on tools and algorithms that promote truth and verifiable information and flag disinformation. To be clear, Reddit and every other social media platform have proven that crowd-sourced popularity cannot be assumed to mean truth. But when it comes to addressing hate, big tech needs to take a long, hard look at itself and its own user-based platforms before casting the cancellation stone. Its gateway platforms are lending legitimacy to a host of conspiracy theories, ideas, and propaganda, leading users down the rabbit hole and contributing to extreme hate in both belief and practice. If popular opinion and the current news cycle distract us into a debate about whether AWS has the right to refuse service to Parler, we will be giving big tech a pass on addressing gateway platforms as major catalysts of hate.

Note: In the spirit of encouraging folks to think discerningly about what is true when they read things on the internet, some lawyer friends have advised me to add a disclaimer to the information you read here. Any reliance on the information provided in this article should be based on the reader's own independent investigation and verification of all research and data contained here. This article is offered as an opinion, based on supplemental research on the topics it covers, and no part of it should be used for any purpose other than information.


Tal-Or Cohen

Lawyer. Feminist. OSINT enthusiast. Belief: Most problems can be solved through education. Challenge: Finding a subject I won’t have an opinion on. Tribe: ✡︎