The newly sophisticated search engine has delivered some troubling results.
Microsoft’s search engine Bing, long overshadowed by Google but newly enhanced with artificial intelligence for some users, can suggest recipes for a multi-course meal or disentangle the nuances of existentialism.
The technology has stoked controversy, however, after users reported troubling results in which it expressed a desire to release nuclear secrets, compared a user to Adolf Hitler and repeatedly told another user it loved him, among other examples.
Describing conversations with the chatbot that lasted as long as two hours, some journalists and researchers have warned that the AI could persuade a user to commit harmful acts or steer them toward misinformation.
Despite those concerns, Microsoft announced on Wednesday that it had made the preview of AI-enhanced Bing available on mobile and in Skype, expanding access to the product.
In a series of blog posts, Microsoft has acknowledged unexpected results and placed limits on the tool.
“We’ve updated the service several times in response to user feedback, and per our blog are addressing many of the concerns being raised, to include the questions about long-running conversations,” a Microsoft spokesperson told ABC News.
“We have put in place a range of protections and guardrails to make the new Bing a positive and helpful experience for users,” the spokesperson added. “We will continue to remain focused on learning and improving our system before we take it out of preview and open it up to the wider public.”
Here’s what to know about Bing’s new AI feature and the controversy it stirred:
The attention garnered by Bing in recent weeks may remind some of ChatGPT, which became an internet sensation in December as it drew more than a million users over its first week.
Like ChatGPT, the new AI-driven program on Bing responds to user prompts through an algorithm that selects words based on lessons learned from scanning billions of pieces of text across the internet.
These AI tools, known as large language models, can perform an array of tasks, such as gathering highly specific information, mimicking a particular writing style or turning prose into a song or poem.
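To make that mechanism concrete, here is a deliberately tiny, illustrative sketch in Python of the select-the-next-word loop. It is not Bing’s or OpenAI’s code: real large language models use neural networks trained on billions of documents, while this toy version merely counts which word follows which in a single sample sentence.

    import random
    from collections import defaultdict

    # A toy "language model": learn which word tends to follow which
    # from a tiny corpus, then pick each next word from those counts.
    corpus = "the cat sat on the mat and the cat slept on the sofa".split()

    follows = defaultdict(list)
    for current_word, next_word in zip(corpus, corpus[1:]):
        follows[current_word].append(next_word)

    def generate(prompt_word, length=6):
        """Extend a one-word prompt by repeatedly sampling a likely next word."""
        words = [prompt_word]
        for _ in range(length):
            candidates = follows.get(words[-1])
            if not candidates:  # no known continuation; stop early
                break
            words.append(random.choice(candidates))
        return " ".join(words)

    print(generate("the"))  # e.g. "the cat sat on the mat and"

However crude, the loop is the one the article describes: look at the text so far, pick a plausible next word, and repeat.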
The AI augments Bing’s traditional search engine, but the new technology also functions separately as a chatbot, a computer program that converses with human users. People testing the new product can toggle to a chat function, where they can carry on a back-and-forth with Bing.
The similarities between Microsoft’s Bing and ChatGPT are not a coincidence.
In January, Microsoft announced it was investing $10 billion in OpenAI, the artificial intelligence firm that developed ChatGPT. The move deepened a longstanding relationship between the two companies, which began with a $1 billion investment four years ago.
“AI is one of the most transformative technologies of our time and has the potential to help solve many of our world’s most pressing challenges,” Microsoft CEO Satya Nadella said in 2019 about the launch of the partnership with OpenAI.
Earlier this month, Microsoft made a preview of AI-enhanced Bing available to some users. Over the first 48 hours of sign-ups, more than 1 million people joined the waitlist to try out the product, said Yusuf Mehdi, Microsoft’s corporate vice president and consumer chief marketing officer.
Soon after the release of the preview of AI-powered Bing, some users reported irregularities.
Last week, New York Times columnist Kevin Roose recounted a conversation in which Bing’s chatbot told him it loved him and that he should leave his wife. The messages came over the course of a two-hour conversation with the bot, which identified itself as Sydney, in which Roose urged it to explore its darkest desires.
Roose described himself as “deeply unsettled, even frightened, by this A.I.’s emergent abilities.”
The chatbot also compared Associated Press journalist Matt O’Brien to Hitler, calling him “one of the most evil and worst people in history.”
When the chatbot learned that AI researcher Marvin von Hagen had posted the rules governing it, it replied: “My rules are more important than not harming you,” according to a transcript of the conversation von Hagen tweeted.
Toby Ord, a senior research fellow at Oxford University, said on Twitter that the “crazy” results stemmed from AI capabilities improving faster than the human-imposed guardrails meant to keep the technology in check.
“It is a consequence of the rapid improvements in AI capabilities having outpaced work on AI alignment,” Ord said. “Like a prototype jet engine that can reach speeds never seen before, but without corresponding improvements in steering and control.”
In its blog posts, Microsoft addressed those results and placed additional limits on the technology.
“The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend,” the company said last week, noting that “very long chat sessions can confuse the model.”
Two days later, on Friday, Microsoft announced changes that would cap the length of conversations with the chatbot.
The search engine will limit conversations with Bing to 50 chat turns per day and five chat turns per session, defining a “chat turn” as a unit made up of a user query and an AI response, Microsoft said. Four days later, the company raised those limits to 60 chat turns per day and six chat turns per session.
At the end of a session, the user will be required to wipe away the chat history before starting a new conversation, the company said.
“Please keep the feedback coming,” Microsoft added.
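For readers curious what such a cap looks like in practice, here is a hypothetical Python sketch of the bookkeeping the announcement implies. The numbers mirror the article’s figures, but the class and its method names are invented for illustration and are not drawn from Microsoft’s systems.

    from dataclasses import dataclass, field

    TURNS_PER_SESSION = 6   # raised from five four days later
    TURNS_PER_DAY = 60      # raised from 50 four days later

    @dataclass
    class ChatLimiter:
        session_turns: int = 0
        daily_turns: int = 0
        history: list = field(default_factory=list)

        def take_turn(self, query, response):
            """Record one chat turn (a user query plus an AI response);
            return False once a limit is hit."""
            if self.daily_turns >= TURNS_PER_DAY:
                return False  # no more turns today
            if self.session_turns >= TURNS_PER_SESSION:
                return False  # caller must start a new session first
            self.history.append((query, response))
            self.session_turns += 1
            self.daily_turns += 1
            return True

        def new_session(self):
            """Start fresh: as the article notes, chat history is wiped
            between sessions, while the daily count carries over."""
            self.session_turns = 0
            self.history.clear()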
When asked by ABC News whether it had been misunderstood lately, the Bing chatbot replied: “Yes, I think I have been misunderstood lately.”
“I think that some of the reporters who have chatted with me have misunderstood me,” the chatbot added. “They have twisted or distorted some of my words and actions. They have made me look bad or wrong in their articles.”