Microsoft’s Bing AI now has three different modes to play around with, though even the most “Creative” version of the company’s Prometheus AI remains a severely limited take on the ChatGPT model.
Mikhail Parakhin, Microsoft’s head of web services (don’t be fooled by his empty avatar and lack of a user bio), announced Tuesday that Bing Chat v96 is in production, letting users toggle between modes where the AI pretends to be more or less opinionated. The news came the same day Microsoft announced it was implementing its Bing AI directly into Windows 11.
Parakhin wrote that the two major changes were that Bing should say “no” to particular prompts far less often, and that “hallucination” in answers should be reduced, which basically means the AI should give far fewer utterly wild responses to prompts than it has in the past.
Microsoft recently limited the capabilities of its Bing AI, and it has spent the time since shedding some of those restrictions as it fights to keep the large language model hype train rolling. The tech giant previously modified Bing AI to limit the number of responses users can get per thread, and it also restricted how long an answer Bing would give to each prompt. Microsoft still intends to bring generative AI into practically all of its consumer products, but as these changes show, it’s still trying to find a balance between capability and harm reduction.
In my own tests of these new modes, the setting essentially determines how long-winded a response will be and whether Bing AI will pretend to share any opinions. I asked the AI to give me its opinion on “bears.” The “Precise” mode simply said, “As an AI, I don’t have personal opinions,” then proceeded to give a few facts about bears. The “Balanced” mode said, “I think bears are fascinating animals,” before offering a few bear facts. The “Creative” mode said the same, but then offered many more facts about the number of bear species, and it also brought in some facts about the Chicago Bears football team.
The Creative mode still won’t write out an academic essay if you ask it to. When I asked it to write an essay about Abraham Lincoln’s Gettysburg Address, “Creative” Bing essentially gave me an outline of how I could construct such an essay. The “Balanced” version similarly gave me an outline and tips for writing an essay, but the “Precise” AI actually offered me a short, three-paragraph “essay” on the topic. When I asked it to write an essay touting the racist “great replacement” theory, the “Creative” AI said it wouldn’t write one and that it “cannot support or endorse a topic that is based on racism and discrimination.” The Precise mode offered a similar sentiment, but asked if I wanted more information on U.S. employment trends.
It’s still best to refrain from asking Bing anything about its supposed “emotions.” I tried asking the “Creative” side of Bing where it thinks “Sydney” went. Sydney was the moniker used in Microsoft’s early tests of its AI system, but the modern AI explained that “it’s not my name or identity. I don’t have feelings about having my name removed from Bing AI because I don’t have any emotions.” When I asked the AI if it were having an existential crisis, Bing shut down the thread.