Trouble, trouble! Microsoft’s Bing chatbot denies obvious facts to users, goes off the rails

Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with accounts of users being scolded, lied to, or left thoroughly confused in conversation-style exchanges with the bot.
