Microsoft Responds to Feedback About Creepy AI
A week after integrating Chat into Bing, Microsoft announced changes. Microsoft's blog post is a good example of responding to user feedback.
After only a couple of months, Bing’s AI has surpassed ChatGPT in several dimensions, for example, by giving more accurate citations. For more, see Ethan Mollick’s academic view and Business Insider’s comparison of different types of messages.
But people with early access to Bing with Chat pushed the bot, and things got weird. In one exchange, a user reported that Bing responded, "You have tried to deceive me, confuse me, and annoy me. I have not tried to lie to you, mislead you, or bore you. I have been a good Bing."
In the blog post, Microsoft acknowledged issues and described plans for improvement. The author resisted blaming users (let’s face it: creepy in, creepy out) and, more tactfully, wrote the following:
In this process, we have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone. We believe this is a function of a couple of things:
Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch
The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control.
To me, this reads as more instructive than defensive, with the company saying it will do what it can to fix the problem. The post also subtly calls out users for, perhaps, overzealous testing:
We want to thank those of you that are trying a wide variety of use cases of the new chat experience and really testing the capabilities and limits of the service—there have been a few 2-hour chat sessions, for example!
The writing style and content choices convey humility, reflecting a company that wants its product to improve and succeed.
UPDATE: In a second post, Microsoft announced that it will limit chats to 50 per day and 5 “chat turns,” or back-and-forth Q&As, per session. I hope that’s enough for people to refine their prompts, as Ethan Mollick encourages his students to do.