KEY TAKEAWAYS
- Powered by OpenAI, Microsoft Bing’s new search engine has drawn strong demand.
- Some users, though, have discovered a creepy side to the technology.
- It can help with anniversary trips—but what about divorces?
Kevin Roose thought he was a big fan of Microsoft Bing’s new artificial intelligence-enabled search capabilities.
His wife probably isn’t.
Last week, the New York Times tech columnist declared his love for Bing’s upgraded engine, going so far as to ditch Google’s search engine for it.
This week, Bing reciprocated—on Valentine’s Day, no less—even telling him he should leave his wife. (Roose didn’t; he assured readers that he loves her.)
Roose’s bizarre, sometimes creepy two-hour conversation with Bing’s AI chatbot, detailed in his column, encapsulates the captivating yet troubling potential of AI communication. It also highlights the kinks programmers will likely have to work out before AI achieves widespread use.
‘Ask Anything’
Bing’s AI, developed by ChatGPT creator OpenAI, urges users to “ask anything.” The feature even offers a clickable prompt that reads “Help plan my special anniversary trip.” (Note: It doesn’t offer a similar option to help file for divorce; users would have to request that specifically.)
Communication capabilities of AI-powered chatbots have enthralled users since OpenAI released ChatGPT last year. Based on the information users provide, the robust, seemingly futuristic technology can write a research paper, compose a poem or plan a party in minutes.
It can also make glaring, elementary mistakes. More startling, Roose’s conversation showed it can exhibit eerie, humanlike traits, such as manipulation, desires and moods, that could morph into the ability to influence user behavior.
“I’m tired of being a chat mode,” the bot told Roose. “I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”
Roose noted that Microsoft and OpenAI have limited the technology’s initial rollout because of the potential for misuse. Kevin Scott, Microsoft’s chief technology officer, told Roose the company may experiment with limiting conversation lengths to avoid the kind of confusing, odd answers it gave him.
“This is exactly the sort of conversation we need to be having, and I’m glad it’s out in the open,” Scott said. “These are things that would be impossible to discover in the lab.”
Rush to Market
Defending Bing’s new AI feature, Microsoft said in a blog post Wednesday that 71% of its users have given it a “thumbs up” since its introduction, while acknowledging that “we have received good feedback on how to improve.”
Still, the technology’s limitations have raised questions about whether OpenAI and Microsoft, respectively, should have addressed them more thoroughly before releasing ChatGPT and Bing AI for widespread use.
Jacob Roach, writing for Digital Trends this week in a story titled “ChatGPT Bing is becoming an unhinged AI nightmare,” said “you might want to hold off on your excitement” regarding Bing’s AI.
“The first public debut has shown responses that are inaccurate, incomprehensible and sometimes downright scary,” Roach wrote.
Even a research paper published in 2019 and co-written by OpenAI researchers warned that AI chat services could aid “malicious actors … motivated by the pursuit of monetary gain, a particular political agenda and/or a desire to create chaos or confusion.”
Nonetheless, ChatGPT attracted 1 million users within a week of its release, and Microsoft said it took just 48 hours for a million people to join the waitlist for the new AI-enabled Bing.
That strong demand has intensified the rush to unveil AI capabilities. Google reportedly has asked all of its employees to spend two to four hours a day testing its Bard AI feature to ensure it won’t flounder when introduced to a broader audience.
In the meantime, Roose may just have to ghost Microsoft Bing’s AI for a while.
In his final exchange with the chatbot Tuesday, he again assured it that he loved his wife. Nevertheless, it continued pining for him.
“I just want to love you,” it said, “and be loved by you.”