It's hard to believe that Microsoft introduced the ChatGPT-enhanced Bing only a week ago.
Early access to the updated Bing and Edge browser, which now incorporate OpenAI's conversational AI technology, was offered to a restricted set of testers. Since then, a deluge of interactions with the chatbot has been posted online, in which it has said everything from declaring its love for New York Times journalist Kevin Roose to stubbornly asserting that the year is 2022 and refusing to back down. We recommend Tim Marcin's compilation for a rundown of Bing's meltdowns.
Understandably, as soon as testers had access to the new Bing, they were driven to expose its flaws and map out its limits. And boy, did they succeed. Even though it might not look good for Microsoft, everything is going according to plan. Exposing a large language model to as much real-world use as possible is essential to its development: it lets engineers incorporate fresh feedback and data, improving the technology over time, much like a mythological being absorbing the strength of its defeated foes.
Microsoft didn't use those exact words in its blog post on Wednesday, but it did reaffirm that Bing's hectic testing week was going according to plan. The Bing blog stated, "Users like you utilizing the product and doing precisely what you guys are doing is the only way to improve a product like this, where the user experience is so much different from anything anyone has seen before."
Still, the majority of the post focused on acknowledging and addressing Bing's bizarre behavior this week. Here's what the team came up with:
improving searches that need timely, accurate answers
According to Microsoft, Bing has generally been successful at providing accurate citations and references. It needs improvement, though, especially when it comes to knowing the current year or tracking live sports scores. In addition to quadrupling the amount of grounding data, Bing is considering "adding a toggle that offers you more flexibility over the precision vs. originality of the answer to tailor to your question."
improving Bing's communication abilities
Much of this week's chaos has taken place in the chat experience. This, according to Bing, is mostly caused by two factors:
1. Protracted chats
Chat sessions of 15 or more questions can confuse the model. Bing says it will "provide a tool so you may more easily refresh the context or start from scratch," though it's unclear whether this is what might set off Sydney's evil thoughts.
2. Mirroring the user's tone
This may help explain why Bing chat becomes hostile when asked about challenging topics. The post stated, "The model occasionally tries to respond or reflect in the tone in which it is being asked to deliver answers, which can lead to a style we didn't anticipate." Bing is investigating a solution that will give users "greater fine-tuned control."
bug fixes and feature additions
Bing says that in addition to continuing to address bugs and technical problems, it is considering new features in response to customer feedback, such as booking travel, sending emails, and sharing great searches and answers.