Microsoft wants to see what Bing's new AI is capable of. The Redmond tech giant just added this feature to the Edge browser a week ago.

According to testers, the chatbot displays bizarre behavior when answering certain questions, a sign that it is still far from being polished software.

Bing Chatbot's Bizarre Behavior

Microsoft recently shared an update on how Bing's AI answers questions on various topics, such as entertainment and sports.

As expected from a newly released product, Microsoft's dedicated AI for the Edge browser came out unpolished. Some users think the company still needs to improve how the chatbot responds to different questions.

According to a report by Engadget, the AI has been displaying "unhinged behavior," producing responses that come across as bizarre.

For instance, it spat out false information about "Avatar: The Way of Water." Bing's AI chatbot called the user "unreasonable and stubborn" and repeatedly insisted that the movie had not yet been released, apparently believing the year was still 2022.

Although the user told Bing that the information was erroneous, the AI refused to accept the correction.

Regarding this series of strange responses, Microsoft wrote in a recent post that Bing's AI is flawed and that using it in the search engine could be a "bad idea" for users in some situations.

"Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," the tech firm said.

These errors may result from users asking question after question, to the point where the chatbot loses track of what it was telling the user.

Related Article: AI Photo Apps Continue To Fall In Popularity Amid ChatGPT's Emergence

How Can Microsoft Engineers Fix This Chatbot Behavior?

Microsoft has proposed a fix for Bing AI's behavior: the company plans to introduce a tool that resets the conversation's context in the search bar.


Microsoft acknowledges the complexity of the model, noting that it tends to respond in a tone that mirrors the prompts it receives and that this behavior usually surfaces only after extended prompting. With that in mind, the company plans to give users more control over the AI.

Even though Bing's AI has been found to produce strange responses, the report also cites areas where it performs well.

For instance, the chatbot provides timely data for live sports events, and it can also offer advice on how to improve financial reports.

At the moment, the Bing chatbot trial is still ongoing. Testers hope that through this experiment, Microsoft can make further improvements so the product becomes more usable in the years to come.

Meanwhile, The Korea Herald reported that ChatGPT might not be well suited to identifying Korean names. The program tends to confuse the identities of Yoon Suk Yeol, the country's president, and Lee Jae-Myung, his archrival.

Elsewhere, CNBC wrote that Google employees can help train Bard by supplying good example answers to problematic queries, a way of fixing how the AI replies to simple questions.

Read Also: ChatGPT's Alter-ego: Here's How DAN Works and Why It Was Created

Joseph Henry
