SEC Conducts Sweep of AI Use by Investment Advisers

Author

Aaron Pinnick, Josh Broaded, Carlo di Florio

Type

Compliance Alert

Topics
  • Compliance
  • Cybersecurity

As part of the U.S. Securities and Exchange Commission’s (SEC’s) recent focus on artificial intelligence (AI), the Division of Examinations has initiated a sweep of investment advisers to examine how firms are using AI-based tools. The SEC has requested information on how firms manage AI-related conflicts of interest, copies of marketing materials that mention AI, business continuity plans for AI system failures, and other AI-related documents.

With this sweep ongoing, firms should be prepared to supply the SEC with documentation of how they manage AI risk, including:

  • An inventory of where and how AI-based tools are used within the firm (a minimal, illustrative sketch of one inventory entry appears after this list).
  • Policies and procedures that govern the use of AI within the firm and in investor interactions.
  • Security controls in place to protect client data used by AI systems.
  • Information on who developed and manages the AI software being used.
  • The sources and providers of data included in AI tools and models.
  • Reports or results of the validation and testing performed on the firm’s AI-based tools.
  • Internal reports of any incidents where AI use created regulatory, ethical, or legal issues.
  • Marketing materials and disclosures that reference the use of AI.
  • A list of governance committees with specific AI-related responsibilities and associated documentation.
  • Business continuity plans in case of AI system failures or errors.
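
As a purely illustrative sketch of the first item above, the record below shows one way an AI-tool inventory entry might be structured. The field names and example values are hypothetical assumptions, not an SEC-prescribed format or a recommendation of any particular approach.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a single AI-tool inventory entry. Field names and
# example values are illustrative assumptions, not a prescribed format.
@dataclass
class AIToolRecord:
    tool_name: str                      # internal or vendor product name
    business_use: str                   # where and how the tool is used in the firm
    developer: str                      # who developed and manages the software
    data_sources: List[str] = field(default_factory=list)  # providers of data feeding the tool
    handles_client_data: bool = False   # flags whether client-data security controls apply
    last_validation_review: str = ""    # date of most recent testing or validation
    owner: str = ""                     # accountable individual or governance committee

# Example (hypothetical) inventory with one entry
inventory = [
    AIToolRecord(
        tool_name="ResearchSummarizer",
        business_use="Drafts research summaries for portfolio managers",
        developer="Third-party vendor",
        data_sources=["Market data feed", "Internal research notes"],
        handles_client_data=False,
        last_validation_review="2024-01-15",
        owner="AI Governance Committee",
    )
]
```

Even a simple structured inventory along these lines can make it easier to pull together the related items on the list, such as data sources, validation results, and ownership, when the SEC requests them.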

Regulatory hurdles for asset managers

Regulators have significant investor protection and market integrity concerns about AI use, including oversight, explainability, algorithmic bias, data quality and model validation, and operational and systemic risk. They have also raised concerns about “AI washing,” where firms overstate their AI capabilities in marketing materials. The SEC is using this sweep to learn how firms are using AI and what governance, risk management, and control frameworks they have in place. The SEC has also proposed a Predictive Data Analytics rule, which seeks to address conflicts of interest that can arise when firms leverage technology, such as AI, in ways that place their own interests ahead of clients’ best interests.

To manage these risks, firms should implement effective governance, risk management, and compliance programs around their use of AI in marketing and investments. These programs should ensure human oversight (through a risk governance and model validation framework), robustness and safety protocols, identification and mitigation of conflicts of interest, privacy and data governance, transparency, bias mitigation, and accountability.

How we help

The SEC, under Chair Gary Gensler, is moving quickly to establish rules for how advisers and financial services firms may use AI, and firms need to keep this in mind as they integrate AI tools into their work. Compliance and cybersecurity leaders should not only prepare the documentation necessary to satisfy the SEC but also take the initiative to educate the firm on the potential legal and regulatory risks associated with the use of AI.

ACA's regulatory compliance, cybersecurity, and privacy consultants can help clients draft policies, train employees, and assess readiness to respond to regulatory inquiries regarding AI. ACA Signature can also help: choose the combination of compliance advisory services, innovative technology, managed services, and cybersecurity that is right for your firm to gain expert insight, guidance, and support as you navigate emerging challenges like AI.

To learn more about the SEC’s recent AI rulemaking and examination sweeps, or how ACA can help you enhance your policies regarding the use of AI, please don’t hesitate to reach out to your consultant or contact us here.
