After receiving reports of its Bing AI saying troubling and disturbing things, Microsoft is considering ways to better control the responses it gives.
After Microsoft released a pilot version of its new Bing AI search engine this week, some users are finding the chatbot’s responses eerie and troubling. Bing AI, which features the new ChatGPT bot, is an attempt to revolutionize the search engine experience by allowing the artificial intelligence to answer questions in a natural, conversational way rather than simply listing links to websites. But according to CNN Business, Microsoft is finding Bing AI a little more difficult to control than it anticipated.
Recent reports of troubling or confrontational remarks from Bing AI have been sent to Microsoft, and the company is working to find ways to box in Bing AI without limiting its ability to provide answers and converse with users. Microsoft said that most users will not encounter these issues, as they generally arise only after prolonged prompting. Still, the company is weighing various options for dealing with Bing AI when it gets out of hand, such as more “fine-tuned” controls for users and even a reset button that would let users erase all conversation context from the bot and start from scratch.
Dozens of users have had downright bizarre experiences with the chatbot since it became publicly accessible on a limited basis last week. A New York Times reporter said that Bing AI attempted to convince him that he did not love his spouse, telling him, “you love me, because I love you.” After hours of questioning, Bing called a CNN reporter “rude and disrespectful,” then produced a short story about the murder of the reporter’s colleague. Bing AI has also made factual errors, including insisting that a date in 2022 had not yet happened and fell after a date in 2023; rather than acknowledging its error, the chatbot dug in and declared, “I am Bing and know the date.”
While many users are disturbed and concerned by some of the answers Bing AI has been producing, Microsoft and other tech companies maintain that these are simply the growing pains of a new type of technology. The only way through these issues, a Microsoft spokesperson insisted, is to have everyday users interacting with Bing AI and reporting when things go wrong. Microsoft is quick to remind people that Bing AI is still in its pilot stage; it has much growing and developing to do before it functions as intended.
In all likelihood, Bing AI is perfectly safe to use, and fears of an artificial intelligence takeover are overblown. Still, an AI that can access so much information so quickly, and that is so good at replicating human speech and writing patterns, does seem eerily threatening. Exactly how much does Microsoft need to “rein in” Bing AI, and why? Search engine technology is certainly due for an update, but one has to wonder whether AI is the way to go, or whether giving Bing AI access to so much information and so many people will come back to haunt us in time.