
A group of activists and researchers shut down Terrorgram at a critical moment. Here’s how

"Our goal was to give communities floored by the events of Jan. 6 a direct, meaningful way of taking action against far-right radicalization, and the collective response was swift and overwhelming."

A Trump-supporting mob ascends the Capitol steps in D.C. on Jan. 6, 2021. (Screenshot via NBC News)
This is a guest post by Gwen Snyder, a Philadelphia movement strategist. It originally appeared on sister site Generocity and is republished here with permission.
For many people, the Capitol coup attempt on Jan. 6 was a wake-up call, the first real indicator that violent far-right extremism in our country had evolved from the province of a fringe few to a militarized movement large enough and bold enough to attack and even momentarily seize control of our national government’s seat of power.

My perspective was a bit different that day.

The attackers in D.C. weren’t all strange faces. These were the men who had visited my apartment building in the night to threaten me and vandalize my West Philly street in 2019, declared a rally for the stated purpose of harassing me in my home last fall, and screamed at me from behind police lines after the November election.

These were the Philadelphia Proud Boys, and their presence in the Capitol that day was no surprise. I’d watched them fundraising for the trip online.

I never set out to be an antifascist researcher, exactly. I’m a grassroots organizer out of the labor movement, and in the period following the 2016 election I had increasingly begun to focus my activism around the work of antiracism. The more engaged I became, the more it became clear to me that one of the most effective uses of my whiteness and the privileges that came with it was to directly counter white supremacy.

One particularly important moment in this realization came in November 2018, when the Philadelphia Proud Boys organized their first (and, thanks to organized antifascist resistance, only) major rally here. Talking to a Black friend, I mentioned my plans to join the counter-protest, asking her if she also planned to attend. Her response was unequivocal: She did not feel safe downtown during that rally, period.

In retrospect, it should have been obvious to me that counter-protest of a white supremacist gang would not necessarily feel like a safe space for a person of color. In that moment, though, it was a revelation: The problem wasn’t just that these racists were spreading a hateful message. It was that their presence literally made marginalized people feel unsafe in their own streets.

I photographed the Proud Boys at the rally that day, clumsily working to identify them through a combination of Facebook-combing and circulating the photos in local forums.

The more involved I became in the work, the more I began to look beneath the hood of overtly violent white supremacy, scouring the pro-terror /pol/ forum on 8chan to try to dissect their internal logic, their insular culture, the jokes and memes and talking points that inevitably made their way to Proud Boy social media.

By summer 2019, I was working hard on two projects. Locally, I was engaged in exposing Philadelphia Proud Boys, holding them publicly accountable, and counter-organizing in ways that routinely forced them to cancel their scheduled public events.

I was also increasingly involved in writing about and analyzing the project of violent and explicit white supremacy as it evolved on 8chan and, increasingly, in even less-monitored or moderated extremism chambers on the chat client Telegram.

It was in doing this work that I accidentally landed a front-row seat to the formation of the self-dubbed Terrorgram, a network of violent Nazi channels on Telegram that openly embraced violent far-right accelerationism, a philosophy that endorses the intentional speeding-up of a civilization-wide collapse that many fascists believe to be inevitable. From this point of view, violence is not just justified but necessary to advance the creation of a new, whites-only Western order.

From the beginning, the influence of this informal, pro-terror Nazi network on broader U.S. white supremacy was apparent and chilling.

Terrorgram became associated directly with explicit terror plots by hardline Nazis within the U.S., but its jokes and visuals also percolated into the relative mainstream of 4chan and MAGA spaces, proving their ability to influence reactionary culture well beyond the capital-N Nazi fringe. The far-right “boogaloo boys” subculture that grabbed headlines in 2020 during the Black Lives Matter uprisings took its name directly from race war memes created and spread on Terrorgram the previous year.

In the aftermath of the coup attempt, many mainstream outlets turned their attention to the platforms the rioters had used to generate enthusiasm and support for the events of Jan. 6. Parler, a Twitter clone that catered to far-right exiles from more mainstream social media networks, came under particular scrutiny, leading Amazon to withdraw its web hosting services.

Parler’s users, violently emboldened but digitally displaced, sought a new home for their inflammatory rhetoric. Tens of thousands flocked to the platform well known for its tolerance of violent extremism: Telegram.

Channels in the Terrorgram network crowed at this development. They described this MAGA influx to their preferred platform as a golden opportunity to further radicalize former Parler users and recruit them to more explicitly violent, more overtly white supremacist channels and subcultures already organized in this digital space.

This drive towards recruitment troubled many who watch these channels, myself included. Pre-inaugural desperation and the thrill of seeing compatriots invading the Capitol had created a situation where these Parler exiles were particularly primed for further violent radicalization.

How could we counter this escalating threat?

The question of how to shut down Terrorgram has long been a head-scratcher in antifascist circles. Telegram is a self-funded project by Pavel Durov, a Russian billionaire who considers his platforming of violent fascist chats to be a principled stand against censorship. [Editor’s note: In mid-January, Telegram did remove some extremist channels.] Still, we knew Durov had announced plans to use advertisers to monetize Telegram, and the shutdown of one major Terrorgram channel signaled that perhaps the company had already begun to fear the possibility of succeeding Parler as a focus of post-coup controversy — potentially severely dampening its ability to attract advertisers in the future.

Long before Jan. 6, I had been working to build support for an effort to force Telegram into shutting down its terror channels by putting pressure on Apple and Google to issue an ultimatum to the company: Abide by their app stores’ terms of service by removing Terrorgram, or risk wholesale removal from U.S. iPhone and Android app stores.

As deplatformed Parler users flocked to Telegram, I began to float the idea of launching such an online campaign with trusted researchers and activists.

It would have two prongs: First, we would wage a Twitter campaign targeting Apple and Google demanding an app store ultimatum; second, we would encourage mass reporting of Terrorgram channels directly to Telegram through its in-app reporting feature.

The first would threaten Telegram’s ability to grow its user base, a key concern for advertisers; the second would remove Telegram’s ability to plausibly claim lack of knowledge of the Nazi terror channels. With hundreds of Twitter users publicly announcing they’d reported these chats directly to the company, neither Telegram nor the app stores would be able to pretend that this network operated outside the awareness of its platformer.

This was a strategy I had discussed with potential partners for months, but the events in the Capitol created a moment ripe for putting the plan into action.

It began with a few of us, and grew as our reach expanded. Our goal was to give communities floored by the events of Jan. 6 a direct, meaningful way of taking action against far-right radicalization, and the collective response was swift and overwhelming.

Over the course of the next two weeks, Telegram would shutter dozens of the worst-offending Nazi terror channels, leaving the Terrorgram network in disarray. By the time the network began to reassemble itself, its peak opportunity for new user recruitment had passed.

Our January Terrorgram Takedown campaign didn’t erase terrorist Nazi channels permanently from Telegram’s network. But it very effectively mobilized communities to take action and threw a Nazi terrorist network into disarray at the very moment it was primed to use the Capitol coup attempt, the shutdown of Parler, and the end of the Trump presidency as a means to radicalize a new micro-generation of potentially violent extremists.

It created a proof of concept for future campaigns against Terrorgram, giving us a strategic window into the governance of Telegram and into which pressure points will be effective when the company does follow through on its plans for advertising and monetization.

The Terrorgram Takedown campaign followed a fairly simple checklist of actions, a format that can be useful for justice campaigns even outside the realm of antifascism and deplatforming.

First, long before the coup, we had researched our target (Telegram), identified its self-interest (future revenue through advertising), and figured out a way to disrupt that self-interest (advertiser fear and limitation of market reach by the threat of app store exclusion).

We had already built a strategy, fostered relationships with potential campaign partners, and created buy-in around the nuts and bolts of the campaign.

When the terrible events in D.C. came to pass, we were already positioned with an action plan.

Every effective direct action campaign aims not only to achieve its external goal, but to give its constituents a sense of their own power.

I believe that more than just shutting down a terror propaganda network at a critical time, our campaign got people who felt helpless against a tide of online far-right radicalization to realize their own power.

It’s that knowledge that makes future wins possible, and it’s that realization of the power of the organized that will drive the action we need to eventually take down Terrorgram, Nazis, and ultimately, white supremacy itself, once and for all.
