A YouTube video with nearly 2 million views and 8,000 comments chronicles a salacious true crime saga in Littleton that “revealed a hidden world of secrets and lies that shook the town to its core,” according to the video’s description.
The only problem?
The “gripping true crime story” of a Littleton real estate agent who was murdered by his stepson after they had a “secret gay love affair” is apparently pure fiction. The 25-minute video was likely created and narrated by a generative artificial intelligence program.
“This doesn’t surprise me,” says Casey Fiesler, an associate professor at the University of Colorado Boulder who researches and teaches technology ethics, Internet law and policy, and online communities.
Fiesler said she has seen similar AI-generated content created with the intent of spreading false conspiracies.
“True crime makes sense as a genre for this, just like conspiracy theories, because people watch that kind of content,” said Fiesler. “The motivation is money.”
The YouTube channel True Crime Case Files has been producing visually and thematically similar “true crime” content for eight months, all of which relies on still images rather than actual news footage. On July 30, the channel released a video about Richard Engelbert, an alleged real estate agent who was killed by his stepson Harrison Engelbert in a “grisly murder” in 2014.
According to the video, Harrison Engelbert was sentenced to 25 years in prison without the possibility of parole for premeditated murder following a police investigation and a jury trial. The community, the video says, is “deeply shocked.”
But there is no evidence that this actually happened. The video says the case has received local and national media coverage, but Google searches turn up no reports from 2014. Local law enforcement says they have found no records of the alleged case. And the Colorado Department of Corrections does not list an inmate named Harrison Engelbert.
Eric Ross, spokesman for the 18th Judicial District Attorney’s Office, said the story appears to be fabricated because a search of Colorado court records turned up none of the names.
Sergeant Krista Schmit of the Littleton Police Department said the department did not investigate the crime described in the video and that none of the names mentioned in the clip are people the department has interacted with.
And yet, in comment after comment beneath the video, viewers are expressing their shock, outrage and disgust at the heinousness of this fabricated crime.
“People believing something just because they see it on the Internet is a growing problem,” said Fiesler.
Fiesler estimated that this one video could easily have earned tens of thousands of dollars for the creator, whose contact information is not listed.
When she watched the video, she immediately noticed the warning signs of AI.
The narration sounded synthetic, she said. The photos of the people used in the video – all studio-style portraits – had an “eerie” feel to them. And running the photos through Google’s reverse image search turned up no matches.
The facts in the video are also incorrect.
The stilted narration first places the alleged murder on Bleak Street in Littleton – a street that does not exist in the city – but later refers to the location as Oak Street. The narrator’s pronunciation of “Engelbert” keeps shifting. The video also says that Richard Engelbert’s wife, Wendy Engelbert, was a school principal and had to give up her candidacy for “superintendent of schools,” which is not an elected office in Colorado.
The other names used in the video do not appear in Google searches: District Attorney Laura Mitchell, Detective James Cattle, neighbor Luby Johnson-Guntz.
What Google searches reveal, however, are dozens of fake YouTube videos and TikToks, as well as websites that summarize the original fake video.
Creating misinformation is nothing new, but with the advent of AI-generated content, Fiesler said, technology is giving more people the ability to create and spread false content.
“Generative AI has in some ways democratized these types of bad actors, in the sense that the more people are able to create this kind of content, the more of it we’re going to see,” Fiesler said.