Threads already has a hate speech problem, civil rights groups warn

Carrying over Instagram's Community Guidelines might not be enough.
By Chase DiBenedetto
Civil rights groups flag early proliferation of hate speech on the new Meta platform. Credit: Jaap Arriens / NurPhoto via Getty Images

The list of ways Twitter could be better is long. Many users think the platform should trash its unwelcome subscription models. Others call out CEO Elon Musk's tanking of accessibility tools for profit. And, apart from the vocal few who see it as a form of free speech, many think the proliferation of hate and disinformation needs to be addressed stat. 

It might make sense, then, to build these concerns into the launch of what could be Twitter's most successful rival. But the first week of Meta's new, text-based community forum Threads suggests that hasn't been done sufficiently, according to advocates and civil rights groups.

In addition to missing accessibility and other features at launch, the new social platform is already home to the same kinds of hate speech and extremist accounts that have soured Twitter's reputation, and it has no visible Threads-specific conduct or community policies outlining how the platform will address the problem, advocates warn.

In a letter released by 24 civil rights, digital justice, and pro-democracy organizations (including nonprofit watchdog group Media Matters for America, the Center for Countering Digital Hate, and GLAAD), the platform's parent company is criticized for taking a step backward on creating a safer digital environment for users:

Rather than strengthen your policies, Threads has taken actions doing the opposite, by purposefully not extending Instagram's fact-checking program to the platform and capitulating to bad actors, and by removing a policy to warn users when they are attempting to follow a serial misinformer. Without clear guardrails against future incitement of violence, it is unclear if Meta is prepared to protect users from high-profile purveyors of election disinformation who violate the platform's written policies. To date, the platform remains without even the most basic tools for researchers to be able to analyze activity on Threads. Finally, Meta rolled out Threads at the same time that you have been laying off content moderators and civic engagement teams meant to curb the spread of disinformation on the platform.

Prior to the July 5 Threads launch, Meta reportedly fired members of a mis- and disinformation team hired to combat election misinformation, part of a larger group tasked with countering disinformation campaigns online. 

The letter also noted "neo-Nazi rhetoric, election lies, COVID and climate change denialism, and more toxicity" on the new platform, including accounts posting "bigoted slurs, election denial, COVID-19 conspiracies, targeted harassment of and denial of trans individuals' existence, misogyny, and more." According to a July report from the Anti-Defamation League (ADL), Facebook, Meta's flagship platform, is where users most frequently report experiencing hate and harassment. In addition, Instagram and Facebook both received failing grades in GLAAD's 2023 Social Media Safety Index, while Twitter was named least safe.


In response to "concerning initial observations" within days of Threads' launch, the ADL is monitoring the platform's policies on hate speech, protection, and privacy. The organization pointed to Threads' blocked accounts policy as a positive, user-forward move by the tech giant: accounts a user has previously blocked on Instagram are automatically blocked on Threads as well.

However, the organization also highlighted instances of Threads allegedly exposing vulnerable targets to hate and harassment, including by displaying personal information such as hidden legal names, which could pose future problems for at-risk users.

At Threads' launch, accounts known for routinely spreading misinformation were reportedly flagged preemptively by the platform, and many right-wing figures shared their dissatisfaction with the site's policy of warning fellow users about those accounts' histories. The warnings appeared to be removed not long after, and Mashable was unable to replicate the profile flags. Instagram's Community Guidelines currently read, "In some cases, we allow content for public awareness which would otherwise go against our Community Guidelines — if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm and we look to international human rights standards to make these judgments."

As of this story's publication, Threads has yet to publish its own on-site community guidelines or conduct policy, stating at launch that the platform would "enforce Instagram’s Community Guidelines on content and interactions in the app." Threads' Terms of Use can be found in Instagram's Help Center and state, "When using the Threads Service, all content that you upload or share must comply with the Instagram Community Guidelines as the service is part of Instagram." The Instagram Community Guidelines, in turn, link to Facebook Community Standards on hate speech. Currently, when trying to report abuse or spam on Threads, the platform redirects users to the Instagram Help page for "How do I report a post or profile on Instagram?"

In response to Mashable's request for comment, and in a statement to Media Matters for America, a Meta spokesperson said: "Our industry leading integrity enforcement tools and human review are wired into Threads. Like all of our apps, hate speech policies apply. Additionally, we match misinformation ratings from independent fact checkers to content across our other apps, including Threads. We are considering additional ways to address misinformation in future updates."

The advocates' letter also includes three urgent recommendations for Threads:

  • Implement strong policies unique to Threads that meet the needs of a rapidly growing text-based platform, including strong policies against hate speech to protect marginalized communities. 

  • Prioritize safety and equity by taking a proactive, human-centered approach to preventing machine learning bias and other AI malfeasance.

  • Implement governance and leadership practices to engage regularly with civil society, including transparent and accessible data and methods for researchers to analyze Threads' business models, content and moderation practices.

"For the safety of brands and users, Threads must implement guardrails that stem extremism, hate, and anti-democratic lies," the letter reads. "Doing so isn't just good for people: it's good for business."


Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.

