
Elon Musk’s Grok-2 and Grok-2 Mini create scandalous deepfakes


  • Less than a week after the launch of Grok-2 and Grok-2 Mini, Musk’s latest AI tools made controversial headlines for generating scandalous images.
  • From Barack Obama taking drugs to Mickey Mouse with a gun, the unregulated nature of this tool seems dangerous.
  • Elon Musk seems to know what his tools do, but he just doesn’t care.


Elon Musk recently announced his new image-generation tools Grok-2 and Grok-2 Mini, but they made headlines for all the wrong reasons: the creation of offensive images.

Musk has always been known for not following the rules. Even when he founded his AI company xAI and released its first AI model, Grok, he did not impose the same strict guidelines that govern other AI tools from OpenAI and Google.

He says he wants to avoid political interference, but that also means there are no safeguards to prevent his tools from going awry. And that’s exactly what happened this time.

Grok-2 and Grok-2 Mini seem to have no content restrictions at all. After the tools launched, some users shared images they had created with them, and the results were horrifying, to say the least.

Notably, the scandalous images are not limited to politicians: one showed Mickey Mouse carrying a gun. It is clear that these image generators can be used to create hyper-realistic deepfakes of real people, and the tool does not appear to respect copyright either.

This may all be fun and games for many people, but let’s not forget that it can take a very dark and dangerous turn very quickly. We’ve already seen people use AI to create explicit images of celebrities despite safeguards being in place. Imagine what malicious actors could do with an AI image generator that has no restrictions at all.

What does Musk have to say about this?

Elon Musk doesn’t seem to understand the gravity of the situation. I say this because on Wednesday he shared a post on X where he said, “Grok is the most entertaining AI in the world!”

He also pointed to a thread on X in which a user showed off different prompts that produce striking results. Needless to say, the images in that thread were once again deepfakes of well-known figures.

And when a user tweeted that the trolling the image generator enabled was “epic,” Musk simply retweeted it, showing that he knows exactly what the tool is capable of and likely supports it.

Controversy over the spread of misinformation during the unrest in Britain

Elon Musk and his social media platform X have also been accused of spreading misinformation during the ongoing unrest in the UK. I cover this in detail in my recent article on the UK unrest and the Online Safety Act.

Unlike other social media platforms that have rules about posting content on sensitive topics, Elon Musk seems to give everyone free rein.

When users on X started posting false information about the defendants in the British stabbings, nothing stopped them. The tweets went viral, fuelling anti-immigration sentiment and contributing to widespread violence.

Worse still, Elon Musk sat silently on the sidelines the whole time, enjoying the show and tweeting things like “Civil war is inevitable.” British authorities heavily criticized his conduct, but once again, Musk does not seem to care.
