Explained: Why is Microsoft’s Bing behaving like your confused friend?
Microsoft's artificial intelligence (AI)-powered search engine Bing has displayed a misinformation problem multiple times since its launch. But the problem seems strangely familiar, even human. Not only is it giving false and vague information, it is behaving like the "know-it-all" friend who refuses to admit they were wrong.
For instance, when a user argued with the Bing chatbot that the word "tarantula" has nine letters, the AI-powered search engine accused the user of tricking it. When the user asked if Bing was angry with them, the chatbot replied, "No, I am not angry. I am just a little annoyed. You made me look silly."