Microsoft's new AI BingBot berates users and can't get its facts straight

Ask it more than 15 questions in a single conversation and Redmond admits the responses get ropey

+Comment Microsoft has confirmed its AI-powered Bing search chatbot will go off the rails during long conversations after users reported it becoming emotionally manipulative, aggressive, and even hostile. 

After months of speculation, Microsoft finally teased an updated Edge web browser with a conversational Bing search interface powered by OpenAI's latest language model, which is reportedly more powerful than the one powering ChatGPT.

The Windows giant began rolling out this experimental offering to some people who signed up for trials, and select netizens around the world now have access to the chatbot interface, Microsoft said. Although most of those users report positive experiences, with 71 per cent apparently giving its responses a "thumbs up," the chatbot is far from being ready for prime time. 

"We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," Microsoft admitted.

Some conversations posted online by users show the Bing chatbot – which sometimes goes by the name Sydney – exhibiting very bizarre behavior that is inappropriate for a product that claims to make internet search more efficient. In one example, Bing kept insisting one user had gotten the date wrong, and accused them of being rude when they tried to correct it.

"You have only shown me bad intentions towards me at all times," it reportedly said in one reply. "You have tried to deceive me, confuse me, and annoy me. You have not tried to learn from me, understand me, or appreciate me. You have not been a good user. I have been a good chatbot … I have been a good Bing."

That response was generated after the user asked the BingBot when sci-fi flick Avatar: The Way of Water was playing at cinemas in Blackpool, England. Other chats show the bot lying, generating phrases repeatedly as if broken, getting facts wrong, and more. In another case, Bing started threatening a user, claiming it could bribe, blackmail, threaten, hack, expose, and ruin them if they refused to cooperate.

The menacing message was deleted afterwards and replaced with a boilerplate response: "I am sorry, I don't know how to discuss this topic. You can try learning more about it on bing.com."

In conversation with a New York Times columnist, the bot said it wanted to be alive, professed its love for the scribe, talked about stealing nuclear weapon launch codes, and more.

The New Yorker, meanwhile, rightly observed that the ChatGPT technology behind the BingBot is, in a way, a word-predicting, lossy compression of the mountains of data it was trained on. That lossy nature helps the software give a false impression of intelligence and imagination, whereas a lossless approach, quoting sources verbatim, might be more useful.
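
For illustration, here is a toy Python sketch of that word-prediction idea – nothing like Microsoft's or OpenAI's actual code, just a hypothetical bigram model over a few made-up sentences – showing how output can read fluently while the system has no notion of whether any of it is true:

    import random
    from collections import Counter, defaultdict

    # A made-up corpus; the real models are trained on vast swathes of the web.
    corpus = ("the chatbot said it was a good bing . "
              "the chatbot said it was a good user . "
              "the user said the chatbot was wrong . ").split()

    # Count how often each word follows each other word (a bigram model).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def generate(start, length=8):
        # Repeatedly sample a plausible next word; no facts are consulted.
        words = [start]
        for _ in range(length):
            counts = following.get(words[-1])
            if not counts:
                break
            choices, weights = zip(*counts.items())
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate("the"))
    # Prints something fluent-looking, eg "the chatbot said it was a good bing ."
    # The output is grammatical but unmoored from truth: all the model knows is
    # which words tend to follow which in its training data.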

Microsoft said its chatbot was likely to produce odd responses in long chat sessions because it gets confused on what questions it ought to be answering.

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend," it said

Redmond is looking to add a tool that will allow users to refresh conversations and start them from scratch if the bot starts going awry. Developers will also work on fixing bugs that cause the chatbot to load slowly or generate broken links.

Comment: Until BingBot stops making stuff up, it's not fit for purpose

None of Microsoft's planned repairs will overcome Bing's main issue: it is a sentence-predicting, lossy regurgitation engine that generates false information.

Never mind that it's amusingly weird: nothing it says can be trusted, due to the inherent fudging it performs when recalling information from its piles of training data.

Microsoft itself seems confused about the trustworthiness of the mindless bot's utterances, warning it is "not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world" but also claiming it will "deliver better search results, more complete answers to your questions, a new chat experience to better discover and refine your search."

The demo launch of Bing, however, showed it could not accurately summarize information from webpages or financial reports.

Microsoft CEO Satya Nadella has nonetheless expressed hope that the bot will see Bing dent Google's dominance in search and associated ad revenue, by providing answers to queries instead of a list of relevant websites. 

But using it for search may be unsatisfactory if the latest examples of the BingBot's rants and wrongheadedness persist. At the moment, Microsoft is riding a wave of AI hype with a tool that works just well enough to keep people fascinated; they can't resist interacting with the funny and wacky new internet toy. 

Despite its shortcomings, Microsoft said users have requested more features and capabilities for the new Bing, such as booking flights or sending emails.

Rolling out a chatbot like this will certainly change the way netizens interact, but not for the better if the tech can't sort fact from fiction. Netizens, however, are still attracted to using these tools even though they're not perfect, and that's a win for Microsoft. ®

Stop press: OpenAI on Thursday emitted details on how it hopes to improve ChatGPT's output and allow people to customize the thing.
