February 15, 2023

By now most of us have heard about, read about and talked about ChatGPT, the new writing tool powered by artificial intelligence, and how its abilities may (or may not) be the death knell for English classes and content writing. As an educator, I’m looking at how AI affects the teaching of writing; as an editor, I’m looking at what it means for the publication process.

ChatGPT generates clear, readable (if glib) prose: It writes cleanly, with few grammar and punctuation errors and no typos. It writes smoothly, using transitions and progressing more or less logically. Ask it to explain NFTs, or remote sensing in archaeology, and it writes assertively, laying out information clearly.

But facts are where artificial intelligence tools start to break down, and some recent high-profile incidents show that with AI — as with human writers — editing and fact-checking need to be an essential part of the process.

Journalism has toyed with AI for several years, with The Associated Press experimenting with automated stories on the stock market and sports, and McClatchy using automation for “the bulk routine reporting of information topics like high school sports and real estate sales.”

To be fair, these articles have a Mad Libs-like quality anyway. They’re formulaic, plugging in different numbers, dates and people for different events. But AI’s recent use on more complicated topics shows a breakdown in accuracy coupled with red flags like plagiarism and making up “experts.” These developments reinforce the need for human editors to verify information and its sources.

CNET has had to retract or significantly fix numerous AI-generated articles because of factual errors and plagiarism. Even though CNET claimed its AI articles had been fact-checked, the mistakes that got through showed that the fact-checking needed to be far more thorough than it actually was, if indeed it happened at all.

In another example, Men’s Journal recently published an article written using AI that was rife with significant errors, in health content, no less — errors that could have far more serious consequences than mistakes in a high school sports story.

ChatGPT’s creators acknowledge on the website that one of its limitations is that “ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers.” Indeed.

I tried it myself. I asked ChatGPT to write a 300-word news story about a specific basketball game — it made up players’ names and had an opera singer playing for one team. I asked it to write a roundup of new music coming out this week for a college newspaper — it wrote about music from 2018 and 2019.

However, when I asked it to edit a story to follow AP style, it did well. It fixed typos, commas, duplicated first references and quote punctuation. But — and you probably knew there was a “but” coming — it cut a clunky quote rather than paraphrase to retain the information, changed the wording of another quote, and didn’t find inconsistencies or identify places in the story where readers would want more information. Even a halfway decent editor would raise questions and try to fill in the holes.

Editing has never been mostly about moving commas and fixing AP style, but now more than ever, human editors need to double down on fact-checking. It’s not enough to check names and jobs, watch for facts — and numbers — that don’t add up, and look for places where information is missing or incomplete. Editors need to verify that sources not only exist but actually have experience/expertise in what the article claims, and they need to dig deeper to check facts, delving into information that previously they may have trusted the writer on.

Back when Wikipedia first came out, controversy ensued: Some schools banned it, and people worried that its “anyone can edit” setup would lead to wrong or incomplete information. Wikipedia has settled in and turned into a valuable teaching tool, both because of the references listed at the bottom of the page and because of the potential for lessons in critical thinking. I tell students it’s a great place to start, but don’t end there. Use your skills and judgment to gather, evaluate and synthesize multiple sources of information.

AI writing may well turn out to be a similar kind of tool — helpful in multiple ways, useful in teaching. But we still need the skills and judgment of a human.

Lisa McLendon runs the Bremner Editing Center at the University of Kansas journalism school, where she also teaches editing and grammar.