
Recognizing the growing use of artificial intelligence by financial advisors, the CFP Board this week released a “Generative AI Ethics Guide.”
The guide, which includes a checklist for upholding the CFP Board’s code and standards, focuses on free or paid generative AI platforms, not custom-built tools. It lists common ways certified financial planners might use generative AI in their practices, including gathering client information, conducting initial research, improving communications, creating or refining content and generating brand-building ideas.
The guide emphasizes the importance of using AI as a tool to aid — not replace — professional expertise. AI is “helpful for idea generation, synthesizing info and refining work. CFP professionals must account for AI limitations and risks, including inaccuracies or ‘hallucinations’ [and] are responsible for the final work product,” it states.
The standards that the guide seeks to uphold include confidentiality and privacy, integrity, legality and providing accurate information. The board’s AI guide prescribes caution when it comes to using AI for “work that requires a reasonable understanding of assumptions and outcomes.”
Sara Cortes, CFP Board assistant general counsel, told Financial Planning that the board began work on its AI ethics checklist after publishing a broader technology guide, along with related questionnaires and a checklist.
“CFP professionals are increasingly using generative AI, and we want them to know that they can easily do so in accordance with the CFP Board’s ethical standards,” she said. “We framed the guide as a checklist because that will help CFP professionals focus on key issues.”
Cortes said CFPs have shown openness to using generative AI, “while also being unsure of where to begin, what uses are appropriate and what measures they should take to use the technology in a way that aligns with their ethical obligations.”
Reaction to the CFP Board’s AI ethics guide
Lawrence D. Sprung, founder of Mitlin Financial in Hauppauge, New York, said the guide was forward-thinking and helpful for advisors.
“They have done a good job of outlining things to think about, as well as those that you should not be doing and those that you can use it for as well,” he said. “I think this guide will be a work in progress that will be updated as time and technology evolves. We as CFP professionals need to be in a position to utilize these tools but do it in a way that honors our fiduciary responsibilities and the privacy of the families we serve.”
READ MORE: Advisors are getting more comfortable with AI, with exceptions
William Trout, director of securities and investments at technology data firm Datos Insights, said the guide provides a strong ethical framework for the use of generative AI by financial advisors, while at the same time acknowledging the commercial opportunities presented by AI.
“The guide militates against a shoot-first, check-later approach by asking advisors to think deeply about whether the content they generate using generative AI corresponds to what they know as practitioners,” he said. “By encouraging adherence to firm policies around AI usage, it functions as a compliance safety net for both advisors and the firms they work for. Safeguards for client privacy mentioned in the guide are equally important.”
Leo G. Rydzewski, CFP Board general counsel, said different vendors offer different privacy protection mechanisms.
“That is why we emphasize due diligence to understand the privacy protection mechanisms the vendor has in place,” he said. “The checklist provides topics concerning security and encryption measures to guide this inquiry, including when a CFP professional is inputting any private data.”
READ MORE: AI scams are getting harder to spot. How advisors can help
Noah Damsky, principal at Marina Wealth Advisors in Los Angeles, said his firm uses AI to do tedious, low-skilled work that would be time-consuming for a person. However, he said he doesn’t use AI in ways that can expose client data.
“I won’t expose client personal information into ChatGPT, since I assume it may use my data for other reasons,” he said. “Maybe it won’t, but I don’t trust it enough at this point. I’m cautious as to which software providers can see my client data.”
Looking ahead to future potential guidance
Rydzewski said in the coming months, the CFP Board will form an external AI Working Group, bringing together financial and technology leaders “to provide actionable recommendations to help CFP Board, CFP professionals and the financial planning profession understand the opportunities and risks.”
John O’Connell, founder and CEO of The Oasis Group, an industry technology consulting firm, said the guide in its current form provides sound direction on use cases where AI technologies can save advisors time.
“[It] should include guidance on educating the advisor on the risks of using AI technologies,” he said.
Marcio Silveira, a financial advisor with Silvergreen Sustainable Investments in Arlington, Virginia, said he had reviewed the guide and found it to be a “good starting point, as it covers most of the current AI capabilities available to advisors today.”
“Looking ahead, we can anticipate even more sophisticated AI tools,” he said. “One area that requires further consideration is the use of AI to analyze subtle client cues like tone of voice and facial expressions. While these tools could potentially enhance client understanding, they also raise ethical concerns regarding manipulation, transparency, and data privacy. The guide should offer further clarification on these issues to ensure responsible use and protect client autonomy.”
How the guidelines might be improved
Trout said that while the guide provides a solid overall foundation, its value could be enhanced by distinguishing between critical and recommended practices. It could also offer clearer AI evaluation metrics, more advisor-specific scenarios and protocols for team collaboration, especially around decision-making.
“Additionally, more specific guidance on client communication would help financial advisors more effectively explain their AI use to clients,” he said.