Wyoming reporter uses artificial intelligence to create false quotes and stories

HELENA, Mont. (AP) — Quotes from Wyoming’s governor and a local prosecutor were the first things that struck Powell Tribune reporter CJ Baker as odd. Then some of the wording in the articles struck him as almost robotic.

However, the clearest indication that a reporter at a rival news outlet was using generative artificial intelligence to write his stories came in a June 26 article about comedian Larry the Cable Guy being chosen as grand marshal of the Cody Stampede Parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved characters,” the Cody Enterprise reported. “This structure ensures that the most important information is presented first, allowing readers to grasp the key points more quickly.”

After doing some digging, Baker, who has worked as a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old newcomer to journalism. According to Baker, Pelczar admitted to using AI in his stories before resigning from the Enterprise.

The publisher and editor of the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and promised to take steps to ensure such a thing never happens again. In an editorial published Monday, Enterprise editor Chris Bacon said he “did not notice” the AI copy and the false quotes.

“It doesn’t matter that the misquotes were obviously the mistake of a rookie reporter in a hurry who trusted the AI. That was my job,” Bacon wrote. He apologized for “allowing the AI to insert words into the stories that were never said.”

Journalists have had their careers derailed for inventing quotes or facts in stories long before AI came along. But this latest scandal illustrates the potential pitfalls and dangers that AI poses for many industries, including journalism, as chatbots can spit out fake, albeit somewhat plausible, articles with just a few prompts.

AI has found a role in journalism, including in automating certain tasks. Some newsrooms, including The Associated Press, use AI to free up reporters’ time for more important tasks, but most AP staff are not allowed to use generative AI to create publishable content.

The AP has been using technology to produce financial reporting articles since 2014 and, more recently, some sports stories. It is also experimenting with an AI tool to translate some articles from English to Spanish. At the end of each article there is a note explaining the role of technology in its production.

Being open from the outset about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were supposedly written by reporters who didn’t actually exist. After the story broke, SI said it was severing ties with the company that had produced the articles for its site. Still, the incident damaged the reputation of the once-influential publication.

In his Powell Tribune article breaking the news about Pelczar’s use of AI, Baker wrote that he had an awkward but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Of course, I never intentionally tried to misquote anyone” and promised to “correct them, apologize, and say they were false statements,” Baker wrote, noting that Pelczar insisted his errors should not reflect on his editors at the Cody Enterprise.

After the meeting, the Enterprise began a comprehensive review of all the articles Pelczar had written for the paper during the two months he had worked there. They found seven articles that contained AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing more articles.

“These are very credible quotes,” Bacon said, noting that the people he spoke with while reviewing Pelczar’s articles said the quotes sounded like something they might say, but that they had never actually spoken with Pelczar.

Baker reported that seven people told him they had been quoted in stories written by Pelczar, but had not spoken to him.

Pelczar did not respond to an AP message, left at a number he provided, seeking to discuss the incident. Bacon said Pelczar also declined to discuss the matter with another Wyoming newspaper that reached out to him.

Baker, who regularly reads the Enterprise because it is a rival paper, told the AP that a combination of phrases and quotes in Pelczar’s stories aroused his suspicions.

Pelczar’s story about a shooting in Yellowstone National Park included the sentence: “This incident is a haunting reminder of the unpredictability of human behavior, even in the most tranquil environments.”

Baker said the line sounded like the summaries of his stories that a certain chatbot seemed to generate, adding a sort of “life lesson” at the end.

Another story – about a poaching conviction – used quotes from a conservation official and a prosecutor that sounded like they came from a press release, Baker said. But there was no press release and the agencies involved did not know where the quotes came from, he said.

Two of the stories in question contained fake quotes from Wyoming Governor Mark Gordon, which his staff did not learn about until Baker called them.

“In one case, (Pelczar) wrote an article about a new OSHA rule that included a quote from the governor that was completely fabricated,” Michael Pearlman, a spokesman for the governor, said in an email. “In a second case, he appeared to fabricate part of a quote and then combine it with part of a quote included in a press release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated text appeared in the story about Larry the Cable Guy, which ended with an explanation of the inverted pyramid, the basic technique for writing a breaking news story.

Creating AI-generated stories isn’t difficult. Users could feed an affidavit into an AI program and ask it to write an article about the case that includes quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the leading journalism think tank.

“These generative AI chatbots are programmed to give you an answer, whether that answer is complete nonsense or not,” Mahadevan said.

Megan Barton, the Enterprise’s publisher, wrote an editorial calling AI “the new, advanced form of plagiarism. In the realm of media and writing, plagiarism is something that every media company has had to correct at some point. That’s the ugly part of the job. But a company that is willing to right (or, quite literally, write) those wrongs is a reputable one.”

Barton wrote that the newspaper had learned its lesson, had a system in place to detect AI-generated stories, and would have “longer conversations about why AI-generated stories are not acceptable.”

The Enterprise did not have an AI policy, in part because it seemed obvious that journalists should not use AI to write stories, Bacon said. Poynter has a template that news outlets can use to develop their own AI policies.

Bacon plans to have one set up by the end of the week.

“This will be a topic of discussion before hiring,” he said.
