
Microsoft Limits Bing's AI Chatbot After Unsettling Interactions

After reports of the chatbot going off the rails, Microsoft has limited its approved topics and number of responses.


Justin Eastzer

Microsoft Bing's AI chatbot made headlines last week after several instances where it acted in unexpected ways. In one case, the AI chatbot told a New York Times columnist it was in love with him and attempted to convince him he was unhappy in his marriage. 


Since then, Microsoft has set limits on what the bot, which is still in testing, can and can't talk about, and for how long. Bing now often responds "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.

Like Google's competing Bard, the AI-boosted Bing sometimes provides inaccurate search results.

Watch the video at the top of this article to see how the chatbot acted before these restraints were put in place.