Stories About content moderation
The U.S. Supreme Court. (Catie Dull/NPR)

Supreme Court justices appear skeptical of Texas and Florida social media laws

Visitors stand near screens displaying the Meta logo in Berlin on June 6. Under a U.S. judge's new ruling, much of the federal government is now barred from working with social media companies on removing content that might contain "protected free speech." (Tobias Schwarz/AFP via Getty Images)

Elon Musk attends a conference in Norway earlier this year. Twitter's new billionaire owner is releasing information about the company's high-profile moderation decisions. (Carina Johansen/NTB/AFP via Getty Images)

Elon Musk is using the Twitter Files to discredit foes and push conspiracy theories

Facebook says former President Donald Trump cannot use its social media platforms until at least Jan. 7, 2023. (Pool/Getty Images)

Trump Suspended From Facebook For 2 Years

Facebook's Oversight Board says the company, led by CEO Mark Zuckerberg, must take responsibility for its decisions. (Saul Loeb/AFP via Getty Images)

In 1st Big Test, Oversight Board Says Facebook, Not Trump, Is The Problem

Facebook indefinitely suspended then-President Donald Trump's accounts in January after a mob of his supporters stormed the U.S. Capitol. (Joe Raedle/Getty Images)

Facebook Ban On Donald Trump Will Hold, Social Network's Oversight Board Rules

Apple's App Store is poised to reinstate Parler, which it suspended after the Capitol riot over what it described as violations of its guidelines on violent content. (Rafael Henrique/SOPA Images/LightRocket via Getty Images)

Facebook created the panel of experts to review the hardest calls the social network makes about what it does and does not allow users to post. (Jeff Chiu/AP)

Facebook 'Supreme Court' Orders Social Network To Restore 4 Posts In 1st Rulings

Millie Weaver, a former correspondent for the conspiracy theory website Infowars, hosts nearly 7 hours of live coverage on her YouTube channel. Conservative influencers like Weaver who often broadcast live are increasingly worrisome to misinformation researchers. (YouTube)

From Steve Bannon To Millennial Millie: Facebook, YouTube Struggle With Live Video

Facebook and other tech companies sent workers home to protect them from the coronavirus. That's creating new challenges in handling harmful content on their platforms. (Glenn Chapman/AFP via Getty Images)

Facebook, YouTube Warn Of More Mistakes As Machines Replace Moderators
