
‘Videos promoting violence’ spread on TikTok in lead-up to Capitol riot, DHS reveals – but feds didn’t understand app

Rioters loyal to then-President Donald Trump outside the U.S. Capitol on Jan. 6, 2021, in Washington. (AP)

US counter-terror agents may not be paying enough attention to TikTok’s role in fomenting domestic terrorism because they do not know how it works, federal officials have warned.

A bulletin to law enforcement agencies from the Department of Homeland Security (DHS) this April said that extremist groups had used TikTok to “recruit adherents, promote violence, and disseminate tactical guidance”, including in the run-up to the storming of the US Capitol on 6 January.

“Prior to the 6 January US Capitol Complex breach, multiple videos promoting violence were on the platform, including one from a user who posted a video encouraging attendees to bring firearms to Washington, DC. Carrying firearms is illegal in areas of protest activity in DC, and promoting such activity is in violation of TikTok’s terms of service, according to reliable open source press reporting,” the DHS document, obtained by the transparency campaign group Property of the People, says. The organisation shared a copy of the report with Politico, which first reported the details.


The DHS memo also describes other videos of concern posted over the course of 2020, as the presidential election churned and political divisions deepened across the US.

“In early to mid-2020, three identified TikTok users shared separate videos discussing how to sabotage railroad tracks; methods to interfere with the US National Guard during riots; and how to access the White House via tunnels, presumably for use by individuals seeking unlawful access,” the report says.

Yet some agencies, it said, had “limited awareness” of how the Chinese video-sharing app works and might not know that extremist groups were “exploiting standard features of the platform” to spread their messages.

The document, according to the report, gives an insight into the US government’s struggle to keep pace with the ever-shifting world of social media.

It explains how far-right groups and Islamic fundamentalists have attempted to evade TikTok’s automated moderation systems by embedding web links and messages inside their videos or linking to messaging channels on Telegram.


“TikTok’s application layout and algorithms can unintentionally aid individuals’ efforts to promote violent extremist content,” the bulletin says, noting the app’s heavy use of AI to select videos that its users might enjoy even from people they do not follow.

“A user’s account may have zero followers but could have substantial viewership on some videos, which could aid violent extremist TikTok users in evading TikTok’s content moderation efforts.”

A spokesperson for TikTok said: “There is absolutely no place for violent extremism or hate speech on TikTok, and we work aggressively to remove any such content and ban individuals that violate our Community Guidelines.”

The DHS bulletin echoes the findings of the Election Integrity Partnership, a panel of disinformation experts who investigated the role of social networks and online influencers in the 2020 election and the Capitol riot.

Their report argued that TikTok served as a repository for false claims that had first appeared on other platforms such as Facebook and Twitter, sometimes after they had been removed elsewhere.

Many people used the app’s green screen feature to turn images or videos from other platforms into a background which they then stood in front of and discussed.

In July, an investigation by the Wall Street Journal found that TikTok’s algorithms could rapidly send users down “rabbit holes” of videos about depression or eating disorders based on their previous activity.