AI in Finance: The SEC’s Rules and Whistleblowing

The SEC is taking a stand on AI in finance. While the agency is still developing comprehensive regulations, employees who observe fraud or other misconduct involving the use of AI have options to report their concerns to the SEC.

Updated: May 14, 2025


In the world of finance, the relationship between broker-dealers and investment advisors (collectively, “firms”) and their clients should be one of trust and loyalty. However, the rise of sophisticated AI, or Predictive Data Analytics (PDA), has introduced a potential conflict of interest.

These technologies hold the potential to personalize investment advice and help investors optimize decision-making, but the SEC has expressed concerns about how firms might leverage them to prioritize the firms’ own gains over investors’ best interests.

This is the issue the SEC sought to address with a proposed rule on July 26, 2023. Building on existing legal frameworks, the SEC’s proposal aimed to ensure investors’ interests are not harmed by firms’ use of AI.

The proposed rule would require broker-dealers and investment advisors to take a critical look at their use of investor interaction technologies and identify any practices that could create conflicts of interest where the firm’s interests take precedence over those of the investor.

To achieve the necessary transparency, the SEC proposed a two-pronged approach.

First, firms would be required to create written policies and procedures to ensure compliance with the new regulations. Second, they would need to maintain records documenting their efforts to identify and mitigate conflicts of interest. The proposal would give firms flexibility in choosing specific tools to address these risks, provided the chosen methods align with the overall goals of the rule.

The proposing release, which outlines the details of the rule, was published in the Federal Register, triggering a 60-day comment period for public input. The SEC received hundreds of comments on the proposal and has said it is considering them in shaping a potential final rule.

This transparency allows for informed discussion and helps ensure the final rules best serve the interests of investors and other market participants. Continue reading to learn more about this rule and about what whistleblowers can do to report fraud involving artificial intelligence.

The Rise of AI in Financial Advice and the SEC’s Proposed Safeguards

The financial services industry has witnessed a boom in the use of AI. Broker-dealers and investment advisors are leveraging AI for various purposes, including:

  • Market forecasting: Using AI and machine learning to analyze vast amounts of data and predict pricing trends of investment products.
  • Automated financial planning: Tailoring investment strategies and asset allocation to individual client profiles.
  • Risk management: Employing AI algorithms to identify and mitigate potential risks in investment portfolios.

Firms are also using AI for customer interaction functions, such as chatbots. While AI holds immense potential for personalized and data-driven financial advice, the SEC has identified major risks associated with its application.

SEC Concerns and Proposed Rules

SEC Chair Gary Gensler has raised significant concerns about the use of AI to predict investor behavior or to conceal firms placing their own interests ahead of their clients’. Gensler has also cautioned that reliance on standardized datasets and models across multiple firms can lead to a concentration of risk. To address these concerns, the SEC proposed new rules governing the use of PDA by broker-dealers and investment advisors.

A core focus of the proposed rules lies in “black box” technologies. These are AI models where the internal decision-making process remains opaque to the user, or even to the firm implementing the model. This opacity raises concerns about the potential use of biased or corrupted algorithms, leading to intentionally or inadvertently skewed investment recommendations that prioritize firm profits over client interests.

The proposed rules aim to mitigate these risks by requiring firms to:

  • Identify and Mitigate Conflicts: Evaluate their PDAs to identify and eliminate, or neutralize, any conflicts of interest that could disadvantage clients.
  • Implement Written Policies and Procedures: Develop and maintain documented procedures ensuring compliance with the proposed regulations.
  • Enhance Recordkeeping: Maintain comprehensive records on PDA evaluations, including implementation dates, modifications, testing results, and identified conflicts of interest.

These proposed rules are intended to mandate comprehensive evaluation and documentation of all potential conflicts that could arise from the use of AI.

Industry Response and the Road Ahead

The proposed regulations have generated significant commentary from market participants, observers, and even members of Congress. While the SEC emphasizes the importance of protecting investors, industry participants argue that the rules may impede innovation and impose unnecessary restrictions that could translate to higher costs for investors.

Notwithstanding the delay in adopting a final rule on PDAs, the SEC remains committed to addressing AI-related risks. The SEC’s Division of Examinations has already engaged with market participants, inquiring about their AI practices, data models, and incident reporting procedures. This underscores the SEC’s focus on promoting strong compliance oversight and mitigating risks associated with AI-powered financial services.

While final rules for AI use have not been adopted, the SEC’s activities suggest it is exploring how existing regulations can address the risks of AI technology. That task is complicated by AI’s adaptability, which makes problems hard to identify; Gensler has warned that this type of unchecked AI could fuel a future financial meltdown.

Applying Existing Rules to AI

Existing provisions offer a framework for mitigating risks associated with AI models in finance. For example, those deploying predictive AI must be mindful of the data sources the model can access in order to ensure compliance with insider trading laws and regulations. That is, they must know whether the model may be executing trades based on data containing material nonpublic information.

AI-driven outputs can also raise concerns regarding an investment advisor’s fiduciary duty or a broker-dealer’s obligations under Regulation Best Interest (Reg BI) if they produce investment choices or recommendations that advantage a firm’s interests over those of a customer or client.

Reg BI mandates that conflicts be identified and disclosed fully and fairly, while an investment advisor owes a fiduciary duty of loyalty to clients. The SEC can establish the existence of conflicts that may violate these requirements by examining patterns of outcomes, even without access to the complete dataset or the AI model itself.

AI Washing

In addition to these concerns, SEC Chair Gary Gensler has also warned investors and businesses against AI washing, a term often used to describe the overhyped promises of AI-driven products. The term was coined by analogy to ESG greenwashing, which describes an investment product marketed in a misleading way to sound more socially responsible than it actually is.

Many companies now tout their innovative uses of AI, including to perform core functions and create material efficiencies. The concern is that these promises may not match what some companies are delivering. In fact, some lure investors simply by using AI language in their marketing and promotional materials without actually possessing or implementing AI-driven products.

Thus, the SEC is cracking down on inadequate disclosures about a firm’s AI technology.

Recent AI Washing Cases

Delphia (USA) Inc. (2024)

The SEC charged Delphia (USA) Inc., a Toronto-based investment advisor, with making false and misleading statements about its use of AI and machine learning in its investment process.

From 2019 to 2023, Delphia claimed it used AI to analyze client data and predict future market trends. The SEC found these statements to be false, as Delphia did not have the AI capabilities it advertised. Delphia agreed to settle the charges, pay a $225,000 civil penalty, and be censured.

Global Predictions Inc. (2024)

The SEC also charged Global Predictions Inc., a San Francisco-based investment advisor, with making false and misleading claims about its use of AI.

In 2023, Global Predictions claimed to be the “first regulated AI financial advisor” and offered “expert AI-driven forecasts” through its platform. The SEC found these claims to be false.

Additionally, Global Predictions violated other securities laws by falsely advertising tax-loss harvesting services and including an improper clause in its advisory contracts. Global Predictions agreed to settle the charges, pay a $175,000 civil penalty, and be censured.

Complying with AI Rules

As stated previously, the SEC continues to ask advisors about their AI policies, which suggests it wants to ensure advisors are aware of the potential risks of using AI and have plans in place to avoid potential problems.

Existing rules (such as Regulation S-P and Regulation S-ID) require financial firms to protect client information and respond to identity theft. This means any AI models a firm uses that access customer information must be secured and monitored as potential vectors for cyber threats.

It is a two-way street: AI models used by other companies might also access a firm’s online information about customers. If a firm’s own systems have weaknesses, they could allow access to more data than intended, potentially violating the firm’s obligation to protect client information.

Blowing the Whistle on AI Washing

AI-related misconduct within the financial industry can be reported under the SEC Whistleblower Program, which offers potential financial rewards to qualifying individuals. The program incentivizes individuals with knowledge of violations (such as misleading investors through AI washing) to report them, offering qualifying whistleblowers awards ranging from 10% to 30% of the monetary sanctions collected from companies found to be in violation.

To qualify for an award, whistleblowers must provide the SEC with original information about past or ongoing AI misuse that leads to a successful SEC enforcement action and monetary sanctions of more than $1 million. This information could include evidence of biased algorithms, misleading AI marketing claims, or the manipulation of AI models for unfair profits.
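To illustrate the arithmetic behind these thresholds, here is a minimal sketch. The function name and structure are illustrative only, not an official SEC formula; actual award percentages within the 10%–30% band are set by the SEC based on statutory factors.

```python
def estimate_award_range(sanctions_collected: float):
    """Illustrative estimate of the SEC whistleblower award range.

    Awards are available only when monetary sanctions exceed $1 million,
    and range from 10% to 30% of the amount collected.
    """
    if sanctions_collected <= 1_000_000:
        return None  # below the program's eligibility threshold
    return (0.10 * sanctions_collected, 0.30 * sanctions_collected)

# Example: $10 million in collected sanctions
result = estimate_award_range(10_000_000)
if result is not None:
    low, high = result
    print(f"Potential award: ${low:,.0f} to ${high:,.0f}")
    # → Potential award: $1,000,000 to $3,000,000
```

So, for instance, a case producing $10 million in collected sanctions could yield a qualifying whistleblower between $1 million and $3 million.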

The SEC protects the confidentiality of whistleblowers and offers various reporting channels, including a secure online portal, hotline, and in-person meetings at regional offices. It is crucial to consult an attorney specializing in whistleblower protection to navigate the complex legal landscape and maximize the chances of a successful claim. 

Conclusion

The SEC is taking a measured approach to finalizing AI regulations, but its proposed framework and existing rules offer strong evidence of its attention to detecting and preventing harm from the misuse of AI.

The focus is on mitigating conflicts of interest that AI might introduce and ensuring these models prioritize client needs. This means understanding how AI makes decisions and using data responsibly. Existing regulations regarding data security and client best interest also apply.

Potential whistleblowers who become aware of these types of violations should consider consulting with specialized whistleblower attorneys to discuss the best way to notify the SEC of their concerns.

