Microsoft Puts New Limits On Bing’s AI Chatbot After It Expressed Desire To Steal Nuclear Secrets

Forbes, Feb. 20, 2023. By Matt Novak, Contributor, FOIA reporter and founder of Paleofuture.com, writing news and opinion on every aspect of technology.
Microsoft announced it was placing new limits on its Bing chatbot following a week of users reporting some extremely disturbing conversations with the new AI tool. How disturbing? The chatbot expressed a desire to steal nuclear access codes and told one reporter it loved him. Repeatedly.
“Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing,” the company said in a blog post on Friday.

Read more: https://www.forbes.com/sites/mattnovak/2023/02/18/microsoft-puts-new-limits-on-bings-ai-chatbot-after-it-expressed-desire-to-steal-nuclear-secrets/?sh=1aad6dab685c