SEC Proposes New Rules for Wall Street’s AI and Data Analytics Practices
Wall Street’s top regulator is proposing limits on how brokerages and money managers use artificial intelligence to interact with their clients.
The U.S. Securities and Exchange Commission on Wednesday approved a proposal aimed at rooting out conflicts of interest that can arise when financial firms adopt the technologies, Chairman Gary Gensler said. The agency also adopted final rules requiring companies to disclose serious cybersecurity incidents within four business days of determining they are material.
The AI proposal is the latest from regulators in Washington, who are concerned about the technology’s power to influence everything from credit decisions to financial stability. Under the proposal, firms would have to assess whether their use of predictive data analytics or artificial intelligence creates conflicts of interest, and then eliminate those conflicts. They would also need to adopt written policies to ensure they stay in compliance with the rule.
“These rules would help protect investors from conflicts of interest and require that regardless of the technology used, companies meet their obligations” to put customers first, Gensler said during the meeting. “This is more than just disclosure. It’s about whether there’s something built into these predictive data analytics that optimizes for our benefit, or something that optimizes” for the benefit of financial firms, he said.
Banks and other financial firms have used artificial intelligence for fraud detection and market surveillance for years. More recently, the focus has shifted to trading recommendations, asset management and lending. The SEC wants to ensure that firms do not put their own interests ahead of their customers’ when recommending products or services.
The proposal goes further than existing requirements that brokers act in their clients’ best interest when making recommendations, an agency official said during a background briefing with reporters on Tuesday.
The plan is open for public comment, which the agency will review before voting on a final version, likely sometime in 2024. The rule would need approval from a majority of the five-member commission to be finalized.
The commission’s two Republicans criticized the proposal as too broad, saying it would require firms to evaluate too many types of technology for potential conflicts.
For example, “a number of commonly used tools, such as a simple electronic calculator or an app that analyzes an investor’s future retirement assets by, for example, changing a broad asset allocation between stocks, bonds and cash, could qualify,” Commissioner Mark Uyeda said. The proposal’s “vagueness” and compliance challenges “could cause companies to shy away from innovation,” he said.
Full Court Press
In recent weeks, regulators have made it clear that they are stepping up oversight of AI.
Rohit Chopra, director of the Consumer Financial Protection Bureau, has said new restrictions on the use of artificial intelligence in lending are coming. Michael Barr, the Federal Reserve’s vice chair for supervision, said lenders need to make sure such tools don’t amplify bias and discrimination in credit decisions.
The Federal Trade Commission has already launched an investigation into Microsoft Corp.-backed OpenAI Inc., maker of ChatGPT, to determine whether the chatbot poses risks to consumers’ reputations and data. The Washington Post was the first to report on the investigation.
President Joe Biden said on July 21 that his administration would take new executive action in the coming weeks to establish a framework for “responsible innovation” in technology.
Since taking over the SEC in 2021, Gensler has raised concerns about the potential for artificial intelligence to use data to target individual investors and encourage them to change their behavior when trading, investing or opening financial accounts.
Last week, he called the tools “the most transformative technology of our time” but warned that concentrating the technology in just a few companies or a few foundational data sets poses a risk that could lead to financial market instability down the road.
Cyber Notices
The SEC also adopted rules Wednesday requiring companies to disclose significant cybersecurity breaches.
The final rule retains the proposed version’s requirement that incidents be disclosed within four business days of a determination that they are “material” to a company’s operations or financial condition. It does, however, give companies more leeway to delay disclosure if the US attorney general determines that disclosing the incident would jeopardize public safety or national security.
Industry groups such as the Business Roundtable have warned that a four-day timeline would provide valuable information about a company’s operations to bad actors.
A separate proposal on the SEC’s agenda would allow online-only investment advisers to register with the commission. The current exemption applies to approximately 200 investment advisers, according to the agency’s estimate.