
The crowds at Kamala Harris’ rally are not generated by AI. Here’s how you can tell

Suffice it to say that this mountain of evidence from direct sources outweighs marked-up images from conservative commentators like Chuck Callesto and Dinesh D’Souza, both of whom have been caught spreading election misinformation in the past.

When it comes to allegations of AI hoaxes, the more diverse the sources of information, the better. While a single source can easily create a plausible-looking picture of an event, multiple independent sources showing the same event from different angles are much less likely to be complicit in the same hoax. Photos that match video evidence are even better, especially since creating convincing long-form videos of people or complex scenes remains a challenge for many AI tools.

It’s also important to track down the original source of the alleged AI image you’re looking at. It’s incredibly easy for a social media user to create an AI-generated image, claim it came from a news report or live footage of an event, and then use obvious flaws in that fake image as “proof” that the event itself was fake. Links to original images posted on a source’s own website or verified social media account are much more reliable than screenshots, which could have come from anywhere (and/or been altered by anyone).

Telltale signs

While it’s useful to track down original and/or corroborating sources during a major news event like a presidential rally, confirming the authenticity of images and videos from a single source can be more difficult. Tools like the Winston AI Image Detector or IsItAI.com claim to use machine-learning models to determine whether an image is AI-generated. But while detection techniques are constantly evolving, these types of tools are generally based on unproven methods whose reliability has not been demonstrated in any comprehensive study, making false positives and false negatives a real risk.

Hany Farid, a professor at UC Berkeley, wrote on LinkedIn that two models from GetReal Labs showed “no evidence of artificial intelligence generation” in the Harris rally photos posted by Trump. Farid went on to cite specific parts of the image that suggest its authenticity.

“The text on the signs and the plane shows none of the usual signs of generative AI,” Farid writes. “While the lack of evidence of manipulation is not proof that the image is genuine, we find no evidence that this image was generated by AI or digitally altered.”

And even if parts of a photo appear to show nonsensical signs of AI manipulation (such as the misshapen hands produced by some AI image models), keep in mind that some apparent optical illusions may have a simple explanation. The BBC points out that the lack of a reflection of the crowd on the plane in some photos of the Harris rally could be caused by a large, empty tarmac area between the plane and the crowd, as seen in reverse angles of the scene. Simply circling odd-looking things in a photo with a red marker is not, in and of itself, strong evidence of AI manipulation.
