
Survey: Most Advisory Firms Lack AI Safeguards

Ninety-two percent of firms surveyed by the ACA Group and National Society of Compliance Professionals have no policies in place for AI use by third parties and service providers.

An overwhelming majority of advisory firms have not adopted policies and procedures concerning AI use among third parties and service providers, according to results from a survey conducted by compliance firm ACA Group and the National Society of Compliance Professionals.

In all, the survey found 92% of respondents have no policies in place for AI use by third parties and service providers and only 32% have an AI committee or governance group in place. Additionally, nearly seven in 10 firms have not drafted or implemented policies and procedures governing employees’ use of artificial intelligence, while only 18% have a formal testing system for AI tools. 

The results indicated that while there is “widespread interest” in AI throughout the space, there is also a “clear disconnect when it comes to establishing the necessary safeguards,” according to NSCP Executive Director Lisa Crossley.

The survey was conducted online in June and July, with responses from 219 compliance professionals detailing how their firms use AI. About 40% of respondents were from firms with between 11 and 50 employees, with managed assets ranging from $1 billion to $10 billion. 

Though an earlier ACA Group survey this year found that 64% of advisory firms had no plans to introduce AI tools, that survey focused on AI use for client interactions. According to Aaron Pinnick, senior manager of thought leadership at ACA, the current survey covers both internal and external uses of AI.

According to the results from the current survey, 50% of respondents didn’t have any policies and procedures on employee AI use finalized or in process, while 18% responded that they were “in the process of drafting” such policies. 

While 67% of respondents said they were using AI to “increase efficiency in compliance processes,” 68% of AI users reported they’d seen “no impact” on the efficiency of their compliance programs (survey respondents indicated the most common uses for AI were research, marketing, compliance, risk management and operations support).

Compliance professionals at firms reported that the two biggest hurdles to adopting AI tools remained cybersecurity or privacy concerns (45%) and uncertainty around regulations and examinations (42%), with a lack of AI-knowledgeable talent coming in third.

About 50% of respondents said their employee training covered AI cyber risks and “appropriate AI use and data protection.” At the same time, some firms encrypted data and conducted “regular vulnerability and penetration testing” on AI tools. About 44% of firms reported only allowing “non-public” AI tools, while 33% of compliance professionals said they conduct a “privacy impact assessment” on a tool before their firm adopts it.

The survey results come a week after the SEC Examinations Division released its 2025 priorities, which noted that examiners would scrutinize advisors’ integration of AI into operations, including portfolio management, trading, marketing and compliance (as well as their disclosures to investors). Along with a previously reported SEC sweep, it’s the latest indication of regulators’ increasing focus on how advisors use AI in daily practices.
