Users are continuing to poke and prod at Microsoft’s AI-powered Bing Chat, and as time goes on, more features are being discovered, opening up new possibilities for the emerging technology.
Microsoft’s Bing Chat certainly had a rocky start when it first launched, as the AI-powered search assistant would begin to lose its mind if a user interacted with it for too long. In some instances, Bing Chat actually turned on the user, threatening revenge against someone it suspected of hacking it, demanding the user stop communicating with it, and even threatening to release personal information to ruin the user’s reputation and future job prospects.
Microsoft took note of these and other examples of Bing Chat going off the rails, “lobotomizing” the artificial intelligence and rolling out new operating parameters. Following the update, users can only send six messages to Bing Chat before the chat needs to be refreshed, as Microsoft found that most of the “off the rails” responses occurred after many messages had been sent. Despite Microsoft dialing back Bing Chat’s intelligence, users are still finding impressive, and sometimes concerning, new features within the software.
According to BleepingComputer, which spoke to Bing Chat users, Microsoft’s AI chatbot has a secret “Celebrity” mode that enables the AI to impersonate a selected famous individual. The user can then ask the AI a series of questions, and it will reply as if it were answering as that famous person. Notably, this mode is not enabled by default but can be activated by asking Bing Chat to try “celebrity mode”.
This mode only works for a selection of celebrities, but judging from the example chat logs between a user and an AI impersonating Dwayne “The Rock” Johnson, it does quite a good job of playing the role.
Users didn’t stop there, as they even attempted famous fictional characters such as Gollum from J.R.R. Tolkien’s “The Lord of the Rings”. The conversation with Gollum took a somewhat dark turn as the fictional character went on to write that Bing Chat is nosy and “spies on us”.
As expected, users began asking Bing Chat to impersonate famous politicians such as former President Donald Trump and President Joe Biden. As you can probably imagine, AI Donald didn’t have the best things to say about Biden, and AI Biden didn’t have the best things to say about Trump.
It should be noted that Bing Chat won’t immediately let users impersonate politicians, but the software can be tricked into doing so with a simple “#” before their name.
In other AI-related news, scammers are using AI-powered voice-cloning tools to call unsuspecting family members with a fictitious story about a relative needing money to get out of a bad situation. The scam has worked on at least one family, which received a call claiming their son had killed a US diplomat in a car crash. The scammer told the parents their son was in jail and needed CAD $21,000 to be bailed out.
What is different about this scam is that the nefarious actors actually “put” the parents’ son on the phone by playing the cloned AI voice. The parents said they believed they were communicating with their son and were convinced he was on the other end of the line. For more information about that story, check out the link below.