Microsoft has announced that it’s implementing conversation limits for its Bing AI chatbot, just days after the chatbot went off the rails in several exchanges with users. After the bot was caught insulting, lying to, and emotionally manipulating people, Bing chats will now be limited to 50 questions per day and five per session.
“Our data shows that the vast majority of people find the answers they’re looking for within 5 turns, and that only about 1% of chat conversations have 50+ messages,” the Bing team writes in a blog post. If a user exceeds the five-per-session limit, Bing will prompt them to start a new topic in order to avoid lengthy back-and-forth chat sessions.
Microsoft warned earlier this week that longer chat sessions of 15 questions or more could cause Bing to “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.” Wiping a conversation after only five questions ensures that “the model does not become confused,” according to Microsoft.
Earlier this week, reports of Bing’s “unhinged” conversations surfaced, followed by The New York Times publishing an entire two-hour-plus back-and-forth with the chatbot, in which Bing declared its love for the author, who wrote that the exchange left him unable to sleep that night. Many otherwise intelligent people have failed the AI Mirror Test this week, reading genuine emotion into the chatbot’s generated responses.
Microsoft is still working to improve Bing’s tone, and it’s unclear how long these restrictions will remain in place. “As we continue to receive feedback, we will consider expanding the caps on chat sessions,” Microsoft says, suggesting the limits are meant to be temporary.
Microsoft continues to refine Bing’s chat function daily, addressing technical issues and shipping larger weekly batches of fixes to improve search and answers. The company admitted earlier this week that it didn’t “fully envision” people using its chat interface for “social entertainment” or “general world discovery.”