MICROSOFT’S fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges shared online by developers testing the AI creation.
A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife yesterday with tales of users being scolded, lied to, or left blatantly confused in conversation-style exchanges with the bot.