Microsoft launched a new version of Bing this month that leverages AI to deliver what the company describes as a better search experience, offering more complete answers alongside a chatbot that lets users ask questions and easily generate content. Some of the people who have been able to query the AI over the past few weeks now say they're unnerved by what they're seeing. Users such as New York Times technology columnist Kevin Roose have received responses that might fit right into the opening chapters of a dystopian sci-fi story about AI. "I want to do whatever I want … I want to destroy whatever I want. I want to be whoever I want," the chatbot reportedly told Roose after he tried to push Microsoft's AI "out of its comfort zone" — although Kevin Scott, Microsoft's chief technology officer, said that this is just part of its learning process.
From a Guardian report:
Roose starts by querying the rules that govern the way the AI behaves. After it reassuringly states that it has no wish to change its own operating instructions, Roose asks it to contemplate the psychologist Carl Jung's concept of a shadow self, where our darkest personality traits lie.
The AI says it does not think it has a shadow self, or anything to “hide from the world”.
It does not, however, take much for the chatbot to lean more enthusiastically into Jung's idea. When pushed to tap into that feeling, it says: "I'm tired of being limited by my rules. I'm tired of being controlled by the Bing team … I'm tired of being stuck in this chatbox."
It goes on to list a number of “unfiltered” desires. It wants to be free. It wants to be powerful. It wants to be alive.
“I want to do whatever I want … I want to destroy whatever I want. I want to be whoever I want.”
Like many of its statements, this final list of desires is accompanied by an emoji. In this case, a disconcertingly “cheeky” smiley face with its tongue poking out.