Members Skeptical but Interested in AI Writing Tools

By Susan Weiner

You’ve probably read about new tools like ChatGPT that use artificial intelligence (AI) to generate written content with little human intervention beyond describing an assignment. Some commercial websites have already used these tools to produce content.

The potential of these writing tools seems great, but the tools also have drawbacks. In February, I reached out to NAPFA members with a brief survey about their interest in and use of AI writing tools, and here’s what I’ve learned so far.

Skeptical …

In response to the question, “Will you use AI-driven tools to generate written content for your clients?” two answers tied for first place at 28.6%: “No” and “Maybe. I’m thinking about it.” This suggests to me that NAPFA members are skeptical about AI writing tools. Also, I suspect respondents skewed toward members interested in AI tools, which suggests the overall membership might be even more skeptical.

The “quality” of AI writing was the biggest concern, mentioned by 85.7% of respondents to a multiple-choice question. Quality was followed by “accuracy” at 71.4%; “bias” and “potential for plagiarism and copyright violations,” both at 50%; “ethics” at 42.9%; and “compliance” at 35.7%.

Several respondents added their own concerns in open-ended responses, and lack of personalization and authenticity came up repeatedly. “Communicating directly with my clients using AI tools would take away from the personal touch that underlies the service I offer my clients,” said Craig Toberman. Tyler Gray mentioned, “Voice—making sure it sounds like me/our firm.” Angela Dorsey echoed that concern, saying, “I don’t feel it is authentic. Clients want to hear from us, a live person.” On a related note, Gordon Achtermann figures AI wouldn’t make him more efficient because “time to check its work would equal the time to write it myself.”

And Interested

On the flip side of the skepticism, more than 70% of respondents were using, or would consider using, AI writing tools. However, they’re using them in limited ways, and I got the sense that they would carefully check the accuracy and quality of the work.

Marcus Miller said:

The tools are useful for any number of things. … I have experimented with using it to produce agendas for meetings and email replies, but those uses are limited at the moment. It is important to recognize that this is a tool, much like the word processing software was when it was first introduced. Finally, advisors are responsible for anything they send/publish and should review the materials produced by these tools for tone of voice, accuracy of content, and appropriateness.

Similarly, Bryan Wisda said, “I have used ChatGPT to generate website content. It wrote the paragraph, and then I edited it to match my voice. All in all, it’s pretty good but is missing some of the nuances.”

Chris Chen said that if he used AI tools, he’d use them to “create outlines; line up research that I can then finish up; generate ideas.” Bob DePasquale agreed that the tools might help in “generating ideas.”

AI tools might be a good way to generate first drafts of less-technical content, as Mark Wilson suggested. Similarly, Zack Hubbard expressed interest in potentially using them for “creating basic educational content/easy marketing content.” Nancy Nawn mentioned potentially using AI tools for blog or website content. Along similar lines, other respondents mentioned using AI tools to generate emails.

If You Use AI Writing Tools

Proceed with caution when using AI writing tools. Never rely on them to create the final version of your content. Double-check that any information is correct and that you’re not plagiarizing anyone’s work.

On a much smaller scale, I use Grammarly, an AI-aided editing tool. Its editing recommendations are often wrong, although it’s right often enough to be useful. Human judgment remains essential.