Introduction
Artificial intelligence (AI) is rapidly changing the way we handle personal finances. In fact, a reported one-third of Canadians are already using AI tools to manage their own finances, and over 60% of Gen Zers believe that AI can help them make more informed financial or investment decisions. This is an exciting technological progression, as AI can lower the barriers to, and cost of, qualified investment advice, and investment advisers can use it to serve their clients better.
This edition of the McCarthy Tétrault AI Insights Series will explore how advisers and investors are using AI, and how Canadian securities regulators are reacting to the use of AI in this context compared to approaches seen in the USA and Europe. At the end, we will consider: (1) the applicability of the adviser registration requirement to AI companies, and (2) the extent to which AI tools can be integrated into self-directed online trading platforms.
How Advisers are Using AI to Provide Advice
In addition to individuals using AI to manage their own finances, advisers are increasingly using AI in their own practices: it can increase efficiency, improve customer relationships, and, most notably, optimize investment returns.
Increasing Efficiency
Advisers might use AI to increase efficiency, such as automating pre- and post-trade processes (e.g., generation of trade reports and submissions to regulators), or incorporating generative AI into initial drafts of documents using tools like Microsoft 365 Copilot, which can lead to lower transaction costs.
Improving Customer Relationships
Advisers can use AI tools to improve customer service and find new clients, using AI CRM platforms like Salesforce. For example, RBC US Wealth Management has rolled out an AI CRM tool which uses "demographics like age, income level, and inflows and outflows of capital" to optimize their services. It can be anticipated that AI will enhance client interaction processes and permit investment advisers to better collect and utilize customer data.
Optimizing Investment Returns
Most notably, AI can be a powerful tool for portfolio construction, investment selection and, ultimately, optimizing investment returns. For instance, advisers can use AI algorithms to research a broad universe of investment options, fine-tune their trading strategies, or analyze alternative data like social media posts or satellite images to make better-informed decisions.
One study found that GPT-4 can produce results on par with professionally managed benchmark portfolios. When given an investor's risk tolerance, risk capacity, and sustainability preferences, GPT-4 was able to design a portfolio suited for that investor and suggest specific investment products. However, it cannot conduct risk profiling on its own, and while it can outline certain products (e.g., stock tickers), it cannot actually implement the portfolio (i.e., it cannot open a brokerage account, buy or sell securities, or rebalance portfolios). For these reasons, the study suggested that GPT-4 only be used by advisers – to improve their services, not replace them.
How Canadian Securities Regulators are Reacting to the Use of AI in the Dispensation of Investment Advice
Canadian regulators "haven't yet proposed specific rules targeting firms' AI usage." That said, they have acknowledged AI as a growing part of the investment industry, and appear to be embracing the technology. For instance, the Ontario Securities Commission (OSC) predicted that AI will be a major factor influencing the capital markets landscape in its Strategic Plan for 2024–2030, and released a report titled "Artificial Intelligence in Capital Markets: Exploring Use Cases in Ontario" which explores the myriad AI use cases in the investment industry. Likewise, the Canadian Investment Regulatory Organization (CIRO) released a study titled "Enabling the Evolution of Advice in Canada" that found increased interest among compliance executives in adopting AI "capabilities over the next three years," and that 67% of millennials want computer-generated recommendations as part of their financial management.
Thus far, Canadian regulators have provided information about the use of AI by advisers and investors, but have said little about their regulatory stance on AI in this context. The first guidance we received was in the brief "AI Governance" section of the OSC's "Exploring Use Cases" report published in October 2023, which acknowledged concerns that "traditional governance approaches are insufficient in addressing [the] unique risks [of AI], such as lack of transparency, heavy reliance on different types of data, quality of data and bias in model selection." As a result, the OSC noted that AI governance frameworks are being developed to prioritize "AI principles such as unbiasedness, performance, transparency, explainability and resilience."
In September 2024, the OSC released a study titled "Artificial Intelligence and Retail Investing: Use Cases and Experimental Research," which found that participants who received investment advice from a human using AI ("blended" advice) were more likely to adhere to that advice than those who received advice solely from a human or an AI program. Although these results were not statistically significant, the study made another important observation: there was no material difference in adherence between the human and AI suggestions. This suggests that many Canadians are comfortable using AI for their finances – a point further underscored by the finding that 90% "of those engaging with AI applications are using the information to inform their financial decisions to at least a moderate extent."
In a companion study, also released in September 2024, titled "Artificial Intelligence and Retail Investing: Scams and Effective Countermeasures," the OSC found that participants "exposed to AI-enhanced scams invested significantly more in fraudulent opportunities than those exposed to conventional scams." This is troubling given AI's capability to: (i) "turbocharge" common investment schemes by increasing their reach, efficiency, and effectiveness, (ii) develop new scams, and (iii) make false claims of "AI-enhanced investment opportunities."
For the OSC, the willingness to depend on AI for investment advice – amplified by the rapid scalability of AI programs and scams – "reinforces the need for a regulatory framework that ensures that the outputs of AI models are accurate and appropriate for retail investors." They further acknowledged the "need to ensure that algorithms are based on high quality data, that factors contributing to bias are proactively addressed, and that these applications prioritize the best interests of investors rather than the firms who develop them." To that end, we expect the CSA will publish guidance that responds to these risks.
For now, the OSC has cited CSA Staff Notice 31-342 Guidance for Portfolio Managers Regarding Online Advice ("Staff Notice 31-342"), issued in 2015 in response to the rise of online advisers (commonly known as robo-advisers), which provided regulatory guidance that investment decisions generated by algorithms must be overseen by humans. This "human in the loop" principle is prevalent in discussions of regulatory frameworks for AI in various contexts, including investment advice. We expect that any CSA guidance issued in relation to AI will reinforce the requirement for some human oversight.
Registrants should also be guided by their core duties owed to clients when considering how to incorporate AI into their investment advisory processes, including the obligation to address all material conflicts in the best interests of the client,1 and the statutory duty of care and duty of loyalty applicable to registered dealers and advisers in Ontario.2 Similar concepts are informing approaches being taken in other jurisdictions.
A Look Across the Border: What are the USA and Europe Doing?
USA
The Securities and Exchange Commission (SEC) has begun taking action to regulate AI use. SEC Chair Gary Gensler has commented that, without quick regulatory intervention, AI will cause a "nearly unavoidable" financial crisis, and the SEC's Division of Examinations has identified AI as an examination focus for 2024.
In July 2023, the SEC proposed new rules to "eliminate, or neutralize the effect of, certain conflicts of interest associated with broker-dealers' or investment advisers' interactions with investors through these firms' use of technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors or outcomes." However, some have criticized the proposed rules as overly broad, as their application would go far beyond AI and affect both individual and institutional investors. They have also been called too onerous, because they go beyond the "full and fair" disclosure regime typically seen in American securities law. Nevertheless, the OSC briefly referenced the SEC's proposed rule in its most recent report.
Whether or not new rules come into force, the SEC is willing to take action under existing mechanisms. For example, in March 2024 it charged two investment advisers with "AI washing" (making misleading statements about their use of AI), and its examinations division is reportedly conducting a sweep to obtain information from investment advisers about their algorithmic models, AI-related marketing documents, compliance training, and third-party providers.
Academic commentators have also offered views on the use of AI by investment advisers. A 2020 article in the Harvard Law School Forum on Corporate Governance suggested that investment advisers' fiduciary duties raise unique considerations with the use of AI. For example, an adviser's duty of loyalty may require them to disclose their "specific uses of AI to manage client accounts," and their duty of care may require them to periodically review the AI's performance and monitor the alignment of the AI's recommendations with clients' objectives and preferences.
Europe
The European Securities and Markets Authority – the EU's financial markets regulator and supervisor – has acknowledged the strong potential for AI applications to improve investment strategies and customer service, along with risks such as "algorithmic biases, data quality issues, and (potential) lack of transparency." That said, it has taken the position that existing legal rules are sufficient to deal with AI for now. Accordingly, firms using AI are expected to comply with existing MiFID II requirements, "particularly when it comes to organisational aspects, conduct of business, and their regulatory obligation to act in the best interest of the client."
Likewise, the UK Financial Conduct Authority has issued an AI update, noting that "many risks related to AI are not necessarily unique to AI itself and can therefore be mitigated within existing legislative and/or regulatory frameworks."
Applicability of the Adviser Registration Requirement to AI Companies
As with other software tools that have the ability to generate investment advice, it is likely that some types of AI software would, if accessible directly to investors, trigger the adviser registration requirement.
Section 25(3) of the Securities Act (Ontario) (the "OSA") and similar provisions of the securities legislation in force in other Canadian jurisdictions provide that anyone engaging in the business of, or holding themselves out as engaging in the business of, advising others with respect to investing in, buying, or selling securities must be registered. Thus, the operator of an AI tool that advises individuals with respect to investing in or buying or selling securities would likely need to register as an adviser under securities laws in each Canadian jurisdiction where the AI tool is available.
While the "advising generally" exemption set out in section 34(1) of the OSA and section 8.25(2) of NI 31-103 provides that the provision of general advice that is not tailored to anyone's specific needs does not trigger registration, most generative AI tools are built to provide highly customized, tailored responses based on specific user inputs.
In practice, if an online user can open an AI program, enter their information, and get advice tailored to that information, the AI software operator would likely need to register. Similarly, anyone who uses these AI tools to provide investment advice would also need to register as an adviser. This result would be consistent with regulators' historical stance toward "robo-advisers" set out in Staff Notice 31-342: "[t]here is no 'online advice' exemption from the normal conditions of registration" for portfolio managers and the registration requirements are "technology neutral."
Having regard to the regulatory risk and liability exposure which can arise from the provision of investment advice, many general-purpose AI services (such as ChatGPT) are programmed to avoid the dispensation of investment advice and encourage users to consult with a registered adviser. Tools like "chatbots" could easily cross the line into specific investment advice if not carefully programmed. With that in mind, AI companies may want to ensure that their tools – especially chatbots – only offer general information that is not tailored to a specific user.
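By way of illustration only, a guardrail of this kind might be sketched as follows. The keyword list, refusal message, and function name below are hypothetical assumptions for the sake of the example – they are not drawn from any regulator's guidance, and a production system would rely on a trained intent classifier and human review rather than simple substring matching:

```python
# Illustrative sketch: a pre-screening layer an AI chatbot operator might place
# in front of a language model so that responses stay on the "general
# information" side of the adviser-registration line. All names and keyword
# choices here are hypothetical assumptions, not regulatory requirements.

# Phrases suggesting the user is seeking advice tailored to their own situation.
TAILORED_SIGNALS = [
    "my portfolio", "should i buy", "should i sell", "my rrsp",
    "my tfsa", "my risk tolerance", "recommend a stock", "which stock",
]

REFUSAL = (
    "I can only provide general financial information, not advice tailored "
    "to your situation. Please consult a registered adviser."
)

def screen_prompt(prompt):
    """Return (allowed, refusal_message) for a user prompt.

    If the prompt appears to request tailored investment advice, block it
    and return the refusal message; otherwise allow it through.
    """
    lowered = prompt.lower()
    for signal in TAILORED_SIGNALS:
        if signal in lowered:
            return False, REFUSAL
    return True, None

if __name__ == "__main__":
    # A tailored request is blocked; a general question passes through.
    print(screen_prompt("Should I buy shares of XYZ for my TFSA?")[0])
    print(screen_prompt("What is a dividend?")[0])
```

Even a crude filter like this illustrates the design goal: the tool answers general educational questions but declines requests that would amount to advice tailored to a specific user.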
Integration of AI Tools in Self-Directed Trading Platforms
In Canada, self-directed trading platforms (known as "order-execution-only" or "OEO" platforms) are operated by investment dealers that are registered under securities laws and are members of CIRO. OEO platforms are exempt from the requirement to assess the suitability of client trades, subject to the "Recommendation Prohibition" prescribed in IIROC Rule 3400 and CIRO GN-3400-21-003.
CIRO broadly defines a "recommendation" as any communication or statement that could, depending on the context or circumstances, reasonably be expected to influence an investment decision regarding a security. CIRO also provides guidance on which types of investment tools OEO platforms can offer while adhering to the Recommendation Prohibition. For instance, tools offering historical or factual information, general pricing incentives (e.g., commission-free ETF trades), or hyperlinks to third-party websites are less likely to be considered "recommendations." Conversely, trading tools suggesting specific trades, making price predictions, or facilitating the purchase of certain products over others will constitute a recommendation – a conclusion that cannot be circumvented by a disclaimer.
Perhaps most importantly for AI tools, CIRO noted that "chat rooms" will not inherently violate the Recommendation Prohibition, but easily could give rise to a recommendation if a representative "influence[s] a person(s) to make an investment decision" through their discussions. CIRO has also made it clear that a recommendation may be generated by anything, including computers (and thus, AI). OEO dealers should be mindful of the Recommendation Prohibition when considering which types of AI tools may be integrated into their platforms.
Having regard to the efficiencies that AI tools can bring to the investment process, CIRO may also consider revisiting the Recommendation Prohibition as it relates to AI, so that Canadian OEO clients can enjoy the benefits of these new tools. In particular, CIRO uses a "push vs. pull" analysis to determine whether research constitutes a recommendation. Under this analysis, OEO users must "pull" research manually, rather than benefitting from tools that "push" research to them on the basis of their prior research or trading activities. CIRO has noted that the relative weight placed on this analysis "will depend on the tool and the facts and circumstances." In the context of AI, it may make sense to eliminate this distinction so that these tools can work effectively.
The Way Forward
At this point in time, Canadian registered firms and AI companies should apply existing rules and guidance to determine the appropriate use of AI in the dispensation of investment advice. The CSA's principle of "technological neutrality" advanced in Staff Notice 31-342 should be instructive when applying the business trigger for adviser registration to AI software. Similarly, registrants can consider their core client duties regarding conflict of interest management, loyalty and care when establishing disclosure, monitoring and supervision practices related to their AI-enhanced services.
Fortunately, given the CSA's enthusiasm toward AI – and its acknowledgment of AI's risks – we expect that it is only a matter of time before detailed guidance arrives. In the meantime, please reach out to Lori Stein, Sean Sadler, or another member of our Securities Regulation & Investment Products group if you would like to discuss these matters further.
Footnotes
1 Section 13.2 of National Instrument 31-103 Registration Requirements, Exemptions and Ongoing Registrant Obligations (NI 31-103).
2 OSC Rule 31-505 Conditions of Registration.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.