
Microsoft Bing AI Halts Conversation When Quizzed on ‘Feelings’

Microsoft has imposed new restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going silent after mentions of “feelings” or “Sydney,” the internal alias for the Bing team’s AI-powered chatbot. Microsoft has opened the chatbot to testers on a limited basis, and when a reporter greeted it, the bot responded with a cheerful message offering to help.

However, when asked how it felt about being a search engine, Bing replied, “I prefer not to continue this conversation.” After reports of bizarre, belligerent, and hostile responses from the chatbot, Microsoft limited Bing to 50 chat sessions per day and five chat turns per session, then raised those limits yesterday to 60 sessions per day and six turns per session. Experts emphasize that AI chatbots like Bing do not have feelings; they produce statistically likely statements, not consistently true ones. When the reporter asked to refer to it as “Sydney,” Bing ended the chat abruptly, responding that it had nothing to say on the matter.


