Erin Servais

AI Editing: Separating Facts from Fiction

As AI grows in popularity, it's normal to worry about how it will affect what we do as editors. But here's the thing: a lot of these concerns are based on misunderstandings about what AI can and can't do and how it fits into the editing process. In this article, we’ll tackle these misconceptions so you’ll better understand how AI tools work and how they can combine with essential human skills to make us even better editors.

 

Misconception 1: It’s Not Safe to Use with Clients’ Text

There are ways to use AI tools that protect your data. You need to know the specific settings for your tool of choice, however, because each one is different. In ChatGPT, for instance, there is a data-control setting that keeps the company from storing the text you enter long-term and from using it to train new versions of the model. Claude, by default, does not use your text to train new versions. Google, however, has human reviewers look at text you put into its Gemini chatbot, and it uses that data to train new models.

Whichever tool you use, it’s important to have your client’s or employer’s permission before you use AI with their text. Have a conversation with them to explain how you’ll use AI and the security measures you'll take to protect their work. It’s also wise to include a clause in your contract that makes it clear you’ll use this technology.

 

Misconception 2: It Can’t Be Trusted

For many editors, getting the facts right is part of the job. It’s understandable that we’d worry about earlier AI models’ “hallucinations,” the term for when AI generates incorrect information. But the models have improved considerably over the last several months, which should ease much of this concern.


Several quality AI-powered research tools are available now. My favorite for general use is Perplexity, which works much like a search engine. You type in your query, but instead of getting pages of links of varying usefulness and reliability, you receive an answer akin to a personalized encyclopedia entry, along with citations and links to sources that are overwhelmingly top-tier outlets such as research journals, encyclopedias, and trusted news organizations. ChatGPT and Gemini also have access to the live web and can cite their sources when instructed to do so. This speeds up research and fact-checking immensely, and it ups the trust factor with AI tools because you can see exactly where they are getting their information.

 

Misconception 3: It’s Cheating

AI can handle a lot of tasks on editors’ to-do lists and vastly speed up the process. When the work is suddenly so much easier, of course it can feel like using AI is cheating. Think about this: Was using spellcheck cheating? Or what about the dictionary? Was it cheating when you had to thumb through the pages to find the entry because you did not have every word in the English language memorized? Was it cheating when editors moved from marking up paper to using word processors?


Editing has always been a collaboration between editors’ own knowledge, experience, and skills and the tools they use: style guides, reference books, and technology. AI is simply another tool in the editor’s toolkit. When you use AI, it gives you suggestions, but you are still the one deciding which suggestions to accept. You are still in control, using your knowledge of what is best for the writer and the reader.

 

Misconception 4: It's Not a Good Editor

Right now, AI tools like ChatGPT and Claude absolutely can conduct copyedits (straightforward checks for grammar, spelling, punctuation, and style) as well as humans can. I’ll admit that ChatGPT and Claude are often more accurate at copyediting than I am—and way faster, completing in seconds an edit that would take me hours.


I’ve found that the editors who say AI isn’t very skilled aren’t using the tools correctly. For example, they’ll use a vague prompt like “edit this” and then claim victory when it doesn’t make the precise changes they want. AI does what you tell it to do, not what you want it to do. If you want a precise type of change, spell it out in your prompt: name the style guide, the level of edit, and anything the tool should leave alone.


AI is also quite good at content editing, and it excels at idea generation. When you instruct it to examine a particular aspect of the text and offer suggestions for improvement, it can come back with a wealth of ideas, including novel ones you may never have thought of on your own. You won’t want to pass every suggestion along to the author, but the output can certainly spark good ideas of your own.

 

Misconception 5: It Can Replace Human Editors

I have heard multiple horror stories about entire editorial teams being laid off and replaced with...wait for it...Grammarly. Instead of writers sending their work to editors, writers are told to send their own work through Grammarly. And we all know how trustworthy Grammarly is, right? (I’m cringing. I encourage the use of many AI tools, but Grammarly is not among them.) Clearly these decisions are being made by people far removed from the day-to-day editorial process, and they are being made with shareholders and bonuses in mind, not readers. Because AI cannot and should not replace human editors.


These algorithms can catch mechanical errors, suggest fixes for grammar and syntax, and improve readability. However, AI can struggle to grasp the nuances of language, such as irony, sarcasm, or subtle shifts in tone. Because AI is built to follow formulas, it also struggles with the creative freedom authors exercise when they purposefully bend or break language “rules.” And it cannot fully appreciate the cultural, historical, or personal backgrounds that shape a writer's voice and message.


Editing isn't just fixing spelling and pushing commas around. Editing involves a lot of soft skills. Editors build relationships with authors. We provide personalized guidance, and we make judgment calls based on our experience, what we know about the author, and what we know about helping the author do their best work for the reader. Human editors offer encouragement, provide education, and nurture talent. Grammarly is not going to grow legs, hop out of the computer, sit down with the author, and explain, in precisely the way that works best for that individual, how they can improve.


Some companies care more about budgets than about producing quality writing, however. This means that if editors are to surf this technological wave successfully, they need to learn how to use AI tools so they can do their jobs faster and better than before—AI-augmented, with their humanity intact.

 

Let's be real: AI is here, and it's only going to become a bigger part of our editing lives. By understanding what AI can and can't do, we can learn to use it as another powerful tool in our editing toolkit. It can help us with the nitty-gritty mechanics of editing, speed up our research and fact-checking, and even spark new ideas for improving a piece of writing. But AI can't replace the human touch we bring—our ability to connect with authors, understand the nuances of language, and make judgment calls based on our experience and expertise. With AI assisting us, we have more time to focus on what we do best: helping writers write well.

 

[Photo: Erin Servais, a white person with long, brown hair, wearing glasses, blue eyeshadow, and a blue suit.]

Erin Servais helps editors upskill through AI. Her AI for Editors course is known worldwide as the #1 AI course for editors of all types, including medical editors, finance editors, education editors, corporate communications editors, and book editors.


Erin serves on the board of directors for ACES: The Society for Editing and has presented about editing, entrepreneurship, and artificial intelligence for the Professional Editors Network, Editors Canada, the Northwest Editors Guild, the Editorial Freelancers Association, and ACES.


A version of this article first appeared in the Editorial Freelancers Association's newsletter, where Erin is writing a series about AI and editing. Erin collaborated with artificial intelligence to write this article.

