Microsoft Bing AI chatbot gives misleading election info, data
A study from two Europe-based nonprofits has found that Microsoft’s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources. 

The study, released by AI Forensics and AlgorithmWatch on Dec. 15, found that Bing’s AI chatbot gave wrong answers 30% of the time to basic questions about political elections in Germany and Switzerland. The inaccurate answers concerned candidate information, polls, scandals, and voting.

It also produced inaccurate responses to questions about the 2024 presidential elections in the United States.

Bing’s AI chatbot was chosen for the study because it was one of the first AI chatbots to cite sources in its answers, but the researchers said the inaccuracies are not limited to Bing. They reportedly conducted preliminary tests on GPT-4 and found discrepancies there as well.

The nonprofits clarified that the false information has not influenced the outcome of any election, though it could contribute to public confusion and misinformation:

“As generative AI becomes more widespread, this could affect one of the cornerstones of democracy: the access to reliable and transparent public information.”

Additionally, the study found that the safeguards built into the AI chatbot were “unevenly” distributed and caused it to provide evasive answers 40% of the time. 

Related: Even the Pope has something to say about artificial intelligence

According to a Wall Street Journal report on the topic, Microsoft responded to the findings and said it plans to correct the issues before the 2024 U.S. elections. A Microsoft spokesperson encouraged users to always verify the accuracy of information obtained from AI chatbots.

In October, U.S. senators proposed a bill that would penalize creators of unauthorized AI replicas of actual humans, living or dead.

In November, Meta, the parent company of Facebook and Instagram, banned political advertisers from using its generative AI ad-creation tools as a precaution ahead of the upcoming elections.

Magazine: ‘AI has killed the industry’: EasyTranslate boss on adapting to change



Source: https://cointelegraph.com/news/microsoft-bing-ai-chatbot-gives-misleading-election-info-data
