Microsoft’s Bing Chat is beginning to roll out options that let users make the chatbot’s responses more creative, balanced, or precise. Just be careful: choosing the “creative” option will, at least initially, make the Bing AI chatbot less accurate in the name of more entertaining responses.
Microsoft began rolling out the new Bing Chat response options at the end of last week. (This reporter does not yet have access to them on his personal account.) Mike Davidson, corporate vice president of Design and Research at Microsoft, shared a screenshot:
We’ve been hard at work tweaking dials so you can chat with the new Bing however you’d like. Starting today, some users will see the ability to choose a style that is more Precise 📏, Balanced 🧘‍♂️, or Creative 🧪.
Let us know what you think by using the 👍 & 👎 in each response. pic.twitter.com/OyCI2y3eT6
Microsoft is attempting to balance what it apparently sees as Bing’s core function: a “copilot for the web.” It has never been entirely clear what that entails, but initially it seemed like Microsoft intended Bing Chat to supplement its traditional search engine: summarizing results pulled from a variety of sites to save users from digging them up on their own. The more creative elements, such as the ability to tell stories and write poems, were apparently seen as bonuses.
Perhaps unfortunately for Microsoft, it was these creative elements that users latched on to, building on what rival OpenAI’s ChatGPT allowed. When journalists and testers began pushing the limits of what Bing could do, they ended up with some bizarre results, such as threats and weird inquiries about relationships. In response, Microsoft clamped down hard, limiting replies and essentially blocking Bing’s more entertaining responses.
Microsoft is apparently trying to resuscitate Bing’s more creative impulses with these additional controls. But there’s a cost to doing so, judging from my own questions to Davidson. Large language models sometimes “hallucinate” (make up) false facts, as many reporters have noticed when closely querying ChatGPT and other chatbots. (It’s presumably one of the reasons Bing Chat cites its sources via footnotes.)
I asked Davidson whether the creative or precise modes would affect the factual accuracy of Bing’s responses, or whether they would simply shift the tone toward something more creative or more factual.
Yep. The first thing you said. Not just tone in a colloquial sense.
In other words, if you opt for the more creative setting, you run the risk of Bing inventing information. On the other hand, the “creative” toggle is presumably designed for output where absolute accuracy isn’t the priority.
Just to be sure, I asked for clarification. Davidson went on to say that an entirely accurate response comes at the cost of creativity, and that eliminating creative responses simply because they might be inaccurate would defeat the purpose. In time, however, that may change.
With the state of LLMs right now, it’s a tradeoff. Our goal is maximum accuracy asap, but if you overcorrect for that right now, chats tend to get pretty muted. Imagine you asked a child to sing a song. Now imagine you muted every part that wasn’t perfect pitch. Which is better?
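Davidson’s song analogy maps loosely onto a familiar dial in language-model systems: sampling “temperature,” which controls how far the model strays from its most probable (and typically most grounded) wording. As a purely illustrative sketch in Python, not Microsoft’s actual implementation (Bing’s real parameter names and values aren’t public, so everything below is an assumption), here is roughly how a precise/balanced/creative toggle could map onto decoding settings:

from dataclasses import dataclass

@dataclass
class DecodingSettings:
    temperature: float  # higher values allow more varied, riskier wording
    top_p: float        # nucleus-sampling cutoff; 1.0 means no cutoff

# Hypothetical presets for illustration only; Bing Chat's real settings are not public.
STYLE_PRESETS = {
    "precise": DecodingSettings(temperature=0.2, top_p=0.5),   # stick close to the most likely answer
    "balanced": DecodingSettings(temperature=0.7, top_p=0.9),
    "creative": DecodingSettings(temperature=1.0, top_p=1.0),  # freer output, higher hallucination risk
}

def settings_for(style: str) -> DecodingSettings:
    """Return the decoding settings for a chosen chat style."""
    return STYLE_PRESETS[style.lower()]

print(settings_for("creative"))  # DecodingSettings(temperature=1.0, top_p=1.0)

The point of the sketch is simply that the same model can be steered toward safer, more literal answers or toward looser, more inventive ones by adjusting how it samples its output, which is the tradeoff Davidson describes.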
Microsoft, then, is making a choice—and you’ll have to make one, too. If you want to use Bing Chat in its role as a search assistant, select the “precise” option. If you value more creativity and don’t care so much whether the topics Bing brings up are totally accurate, select the “creative” option. Perhaps in the future the twain shall meet.
As PCWorld’s senior editor, Mark focuses on Microsoft news and chip technology, among other beats. He previously wrote for PCMag, BYTE, Slashdot, eWEEK, and ReadWrite.