
Secretaries of state urge Elon Musk to stop AI bot from spreading election lies

Top election officials from five states sent a letter on Monday calling on billionaire Elon Musk to stop the AI chatbot he developed from spreading false election information on X (formerly Twitter).

The letter, signed by the secretaries of state of Minnesota, Pennsylvania, Michigan, Washington and New Mexico, calls on Musk to “immediately make changes” to Grok, X’s AI search assistant, “to ensure voters have accurate information in this crucial election year.”

Musk unveiled Grok at X last November, calling the chatbot an unfiltered alternative to large language models like ChatGPT. He had derisively labeled companies like OpenAI and Google as “woke” for implementing guardrails that make their tools approach sensitive and controversial topics more cautiously.

“Please don’t use it if you hate humor!” said xAI, Musk’s artificial intelligence company, at the time of Grok’s unveiling.

However, the secretaries expressed concern about Grok’s role in spreading lies related to the 2024 presidential election, citing an instance last month in which the chatbot spread false information about election deadlines just hours after President Joe Biden announced he would not run for re-election.

Grok’s post falsely claimed that Vice President Kamala Harris, the presumptive Democratic nominee, had missed the deadline to appear on the November ballot in Alabama, Indiana, Michigan, Minnesota, New Mexico, Ohio, Pennsylvania, Texas and Washington, several of which are considered key swing states that could greatly influence the race.

“This is false,” the letter says of Grok’s claim. “In all nine states, the opposite is true: the ballots are not locked, and upcoming ballot deadlines would allow for changes to the candidates listed on the ballot for the offices of President and Vice President of the United States.”


Grok is currently only available to X Premium and Premium+ subscribers, and the bot carries a disclaimer asking users to verify the information it generates. However, according to the secretaries, Grok’s false claim about ballot deadlines was captured and shared in more publicly accessible posts that reached millions of people. The chatbot’s misinformation was not corrected until 10 days later.

“As tens of millions of voters across the United States seek essential information about voting in this important election year, X has a responsibility to ensure that all voters who use your platform have access to guidance that contains true and accurate information about their constitutional right to vote,” said the letter, first seen by The Washington Post.

The secretaries of state asked X to implement a policy that directs Grok users who ask questions about the U.S. election to “CanIVote.org,” which officials said is a nonpartisan resource of professional election officials from both the Democratic and Republican parties. OpenAI partnered with secretaries of state this year to provide more accurate election information, and ChatGPT is already programmed to direct its users to the site with questions about the election.

A spokesperson for X did not respond to HuffPost’s request for comment; the company’s press address returned only an automated reply: “Busy at the moment, please check back later.”

The company’s delayed response to Grok’s misinformation was “the equivalent of a shrug of the shoulders,” said Minnesota Secretary of State Steve Simon, who wrote the letter.

“It’s important that social media companies, especially those with global reach, correct their own mistakes — like in the case of the AI chatbot Grok, which simply misunderstood the rules,” Simon told the Post. “Speaking up now will hopefully reduce the risk that a social media company will refuse or delay correcting its own mistakes between now and the November election.”
