SEC Considers Ban on Predictive AI Tools in Financial Advisory Apps

The Securities and Exchange Commission proposed a rule in July 2023 that would have required broker-dealers and investment advisers to eliminate conflicts of interest embedded in the AI systems they use to guide client behavior, a more direct intervention than the financial technology industry had anticipated.

The plan covered machine learning models, algorithmic tools, and what the SEC specifically called “black box” systems: software that generates recommendations without being able to explain, even to its own operators, how it arrived at them. The proposal spent two years in the regulatory process.

Key details at a glance:

Proposal: SEC Predictive Data Analytics Rule, first introduced July 2023
Targeted technology: AI, machine learning, algorithmic modeling, and “black box” systems used in investment advice
Core aim: Require firms to eliminate, not just disclose, conflicts where AI prioritizes firm revenue over client interests
Industry response: Significant pushback from broker-dealers, investment advisers, and fintech lobby groups
Withdrawal date: June 2025, formally withdrawn alongside several other SEC initiatives
Reason for withdrawal: Industry opposition and a shift in political administration following the 2024 election
Current enforcement: Individual “AI washing” prosecutions continue; no systemic conflict-of-interest rule in place
International context: The EU AI Act now regulates behavioral manipulation through AI; some US firms are adapting voluntarily
Investor risk: Firms remain legally permitted to use AI systems that optimize for firm profit, with no explicit prohibition
Further analysis: Regulatory coverage at Financial Times Markets Regulation

The SEC then quietly withdrew it in June 2025. The political landscape had changed, the financial sector had fiercely resisted, and the protections intended to shield ordinary investors from AI systems optimized for corporate profit simply vanished.

The issue the SEC had been attempting to resolve was real. When a financial guidance app employs an AI model to generate product recommendations, the model’s objectives are set by the company that built it, and that company’s financial interests do not necessarily coincide with the client’s.

When a system is optimized to maximize engagement or steer users toward higher-margin products, it can do so in ways that are genuinely hard to see from the outside. The SEC’s “black box” characterization was accurate: many of these systems cannot provide a logical justification for the recommendations they make, which makes it nearly impossible to detect bias or self-serving behavior through traditional compliance review.

Instead of burying a disclosure in an unread terms-of-service document, the proposed rule would have forced firms to evaluate those systems and resolve the conflicts before deployment.

It’s difficult to ignore the fact that the withdrawal coincided with a broader relaxation of financial regulation after the 2024 election, with the new administration signaling a lighter-touch approach to technology regulation across most industries.

The industry organizations that opposed the rule argued that existing fiduciary-duty regulations were adequate to protect investors, that the definition of “covered technology” was too expansive, and that compliance costs would be prohibitive.

Significant resources went into supporting those claims, and they prevailed. Whether the existing regulations are truly adequate is less obvious, since they were written before AI-driven advice tools existed at anything like their current scale or sophistication.


In 2026, the SEC’s approach to AI in financial services has narrowed to individual enforcement: prosecuting companies that misrepresent their AI capabilities, as in the “AI washing” cases that produced fines against Global Predictions and Delphia in March 2024. That is real enforcement, and it matters. But it polices the honesty of AI marketing rather than the design of AI incentives.

A company can now deploy an AI system that subtly steers customers toward more expensive products, as long as it doesn’t lie about having the AI in the first place. That is a significant distinction, and it exists in the void left by the rule’s withdrawal. Several US companies are voluntarily adopting the EU AI Act’s stronger limits on behavioral manipulation.

However, voluntary compliance is not mandatory compliance, and the companies most likely to exceed minimum standards are probably not the ones that most needed the rule.

From the outside, the retail investor appears to be essentially back where they were before the proposal: using apps that may or may not be optimized for their benefit, with little ability to tell the difference, and with regulators whose most ambitious attempt to address the problem now formally abandoned.

The question is not whether the SEC will eventually revisit its AI conflict-of-interest rules. It most likely will, especially if a high-profile enforcement action finally links investor harm to an algorithmic recommendation the firm cannot justify. The question is how much ground is ceded in the interim.
