Bob Lai, a personal finance blogger, sometimes uses ChatGPT, Gemini and Claude to support his investment research, but draws a firm line when it comes to decision-making.
Lai, who invests in dividend-paying stocks and index ETFs, uses artificial intelligence (AI) to analyze stocks, spot gaps in his research and get a “second opinion” on the stability of his dividend income. However, he says he always makes the final call. “I want to make sure a human (i.e. me) is behind the final decision-making process,” he said.
According to a 2025 Canadian Investor Study published by Broadridge Financial Solutions, 88 per cent of investors say they’re likely to act on the information generative AI provides. Among investors using AI, 21 per cent are millennials, 18 per cent are Gen Z, eight per cent are Gen X, and three per cent are baby boomers.
While investors are experimenting with generative AI tools to support their investing goals, experts warn an over-reliance on these systems could be risky.
AI can be useful as a research and summary tool, but the idea that investors can delegate important financial decisions to it — at least in its current state — is misguided, says Jason Pereira, senior partner and financial planner at Woodgate Financial.
One of the key problems with using AI is that investors may not know how to write prompts in a way that gets them the answers they need, he says.
“They’ll overly simplify what they think is important, but not necessarily know what to ask,” Pereira said.
For example, a prompt might say, “I have $5,000 to invest. What’s the best option to reduce my tax bill?” What’s missing from this question is context, such as the investor’s marginal tax rate, risk tolerance and income level, among other factors, all of which would heavily influence the answer.
“The trick is really to make sure you frame the questions in as many details as possible,” Lai said, referring to the use of AI tools in analyzing stocks and how companies are performing. “I have found that a good way to get more insight is by asking follow-up questions based on the initial answers.”
Pereira, who uses AI regularly, says he frequently catches it making mistakes, taking shortcuts and leaving tasks incomplete.
“My analogy for this is it’s a really smart lazy intern,” he said. “The more complicated the tasks you give it, the more likely it is to quit and tell you it’s done — without telling you that it hasn’t actually finished the project.”
Those new to both AI and investing are likely to have “unknown unknowns” — things they don’t even realize they don’t know. So when AI gives an answer, it’s hard to tell whether it’s correct unless you have specific expertise in that area, Pereira says.