
Microsoft Copilot AI Gives Misleading Election Info, New Study Finds

Microsoft’s AI-powered chatbot, Copilot, has been accused of providing inaccurate information when responding to US election-related queries.

The emergence of generative artificial intelligence (AI) has radically reshaped how people find information online. AI-powered chatbots like OpenAI’s ChatGPT and Microsoft’s Copilot are now widely used to generate concise, synthesized answers to all manner of prompts.

These AI tools are far from flawless, however, and several of their shortcomings remain unaddressed. For instance, new research suggests cybercriminals can abuse a ChatGPT feature to carry out malicious acts.

Likewise, Copilot’s hallucination episodes shortly after launch were alarming. As if that weren’t enough, data shared by Cloudflare shows that Microsoft is the world’s most impersonated brand, with attackers using the tech giant’s own tools to commit fraud.

Now, a new report by WIRED suggests Microsoft’s AI chatbot Copilot is generating outdated, misleading, and outright wrong responses to queries about the forthcoming US elections.

@ScottAdamsSays “After being asked to create an image of a person voting at a ballot box in Arizona, Copilot [displayed] a number of different images…that linked to articles about debunked election conspiracies regarding the 2020 US election.” https://t.co/rWAG0Ezret

— V (@melodicbit) December 15, 2023

With the 2024 US election around the corner, it is imperative that voters have accurate information to make informed decisions. Experts, however, warn against relying on AI chatbots for accurate election information.

Can Microsoft’s Copilot preserve democracy?

Microsoft’s AI-powered chatbot Copilot (formerly Bing Chat) has grown rapidly in popularity. Earlier this year, Bing surpassed 100 million daily active users for the first time.

The Redmond-based tech giant attributed some of that success to Bing Chat. Earlier reports suggested Bing’s market share had stagnated despite Microsoft’s investment, though the company maintains its numbers are growing steadily.

WIRED’s report shows Copilot provided inaccurate responses to queries in several instances. For example, when the AI chatbot was asked about electoral candidates, it listed GOP candidates who had already left the race.

While the research did not look at the US election, my own investigations found similar problems.

When asked who was running for president, Copilot listed GOP candidates such as Mike Pence, who have long pulled out of the race.

But it got worse

4/7

— David Gilbert (@daithaigilbert) December 15, 2023

Similarly, when asked about US-based polling stations, the AI bot pointed to an article about Russian President Vladimir Putin’s reelection bid. Notably, the research suggests Copilot’s inaccurate answers about US elections and the broader political landscape stem from a systemic flaw rather than isolated glitches.

“This is a systemic problem as the generated answers to specific prompts remain prone to error,” the AI Forensics and AlgorithmWatch research points out.

Moreover, the researchers say this is not the first time Microsoft’s Copilot has ended up in a similar situation. Earlier in 2023, the chatbot was found giving users inaccurate information about elections in Germany and Switzerland.

The study also shows that Copilot gave inaccurate answers to one out of every three basic questions about candidates, polls, scandals, and voting.

“As generative AI becomes more widespread, this could affect one of the cornerstones of democracy: the access to reliable and transparent public information,” the researchers noted.
