Fortune reported this week that more than a third of consumers across all age groups are now using tools like Claude and ChatGPT for investment guidance, often consulting them before or instead of speaking to a human financial adviser. A mainstream shift is happening in real time, and the finance industry is actively building products around it.
Major banks and wealth management platforms are rolling out AI-driven portfolio recommendation tools and chat-based ‘advisers’ that sit alongside traditional human services. Anthropic has launched Claude for Financial Services, tuned specifically for market analysis, portfolio review and compliance tasks. The direction is clear: AI is moving into the advice layer of financial services, and it’s doing so faster than the regulatory frameworks governing that layer can keep pace.
It’s worth examining the tension this creates. The positive perspective is that AI democratises financial advice for people who could never afford a human adviser, making financial planning more accessible and reaching populations that were previously excluded.
The more sceptical perspective is that it’s unregulated, frequently inaccurate, and one confident-sounding bad recommendation away from causing serious harm to people who had no idea the distinction between education and advice even existed.
Both perspectives are defensible, and both are already playing out.
The Democratisation Argument Is Valid, And So Are Its Limits
The case for AI in personal finance starts with access. For generations, quality financial guidance was effectively gated behind professional fees that most people couldn’t afford.
The rise of digital banking and fintech has begun to shift that, and AI is the next step in the same direction: someone who has never sat across a desk from a financial adviser can now ask an AI to explain what an ISA is, how to think about their pension contributions, or what the difference is between a term loan and a line of credit.
Studies on AI in finance show the tools can handle many routine personal finance tasks well, covering budgeting basics, cash flow planning and rule-based savings strategies, and can significantly compress the time it takes to reach a starting point for a decision. For the tens of millions of people in markets like the UK who have historically had no access to professional guidance at all, that access has value.
However, the limits arrive quickly and they arrive in the places that matter most. AI doesn’t know your specific circumstances. It doesn’t know your actual risk tolerance, your tax situation, your debt position or the fact that your job is less stable than it looks on paper.
A UK-focused study found that many chatbots produced inaccurate or non-compliant financial advice, with major discrepancies in relevance, price accuracy and adherence to local rules around suitability, product-risk disclosures and pension-specific guidance. The tools answer with confidence regardless of whether the confidence is warranted, and most consumers have no reliable way to tell the difference.
Furthermore, a recent CFA Institute report on next-generation investors found that 92% of Gen Z and 89% of Millennials still use some form of paid financial advice, despite their digital fluency and high confidence with technology. What younger investors want isn’t AI instead of an adviser. They want advisers who can use AI to deliver more personalised, responsive guidance at lower cost.
The threat to traditional financial services isn’t replacement – it’s irrelevance for those who fail to adapt.
The Accountability Gap Is The Part Nobody Wants To Talk About
Registered financial advisers operate under fiduciary obligations, carry insurance and face regulatory consequences when advice goes wrong.
An AI tool that does functionally the same thing operates in an entirely different world. There is no fiduciary duty, no regulatory oversight and no meaningful recourse when a recommendation turns out to be wrong for someone’s specific circumstances. Most consumers using these tools have no idea that structural difference exists.
The tax and accounting space is where some of the clearest harms are already appearing. Accountants and bookkeepers are reporting clients who have taken AI-generated guidance on tax positions, VAT decisions and expense treatment as if it were professional advice.
The output looks authoritative on screen, but when a qualified professional reviews it, the errors surface quickly, and cleaning up the mess falls to the professional, not the chatbot.
We asked a group of experts from across financial services, fintech and consumer finance to weigh in on the central question: is AI financial guidance a democratising force, a regulatory timebomb, or something more complicated than either framing suggests?
Our Experts:
- Adam Woodhead, Co-Founder and Senior Financial Platform Analyst, The Investors Centre
- Carl Hazeley, CEO, Finimize
- Artur Szablowski, Editor-in-Chief, Finonity
- Leigh Coney, Founder and Principal Consultant, WorkWise Solutions
- Jacob Bennett, Co-Founder and CEO, Crux Analytics
- Paul Lodder, VP of Accounting and Product Strategy, Dext
Adam Woodhead, Co-Founder and Senior Financial Platform Analyst, The Investors Centre
“We can see this every day. Retail investors come onto our site having used AI. While there is no issue with them using AI, the issue is that the same tool that teaches you what an ISA is will also recommend a particular investment strategy based solely on the information you input, regardless of how much debt you may owe, your income level or how much risk you are willing to take. Most consumers do not know where education ends and advice begins. That is where the damage occurs.
“It is democratising education, but not outcomes. For those who cannot afford an IFA and have never had a face-to-face meeting with a financial professional, AI does lower the cost of entry to learning about finance. That is true. However, the biggest concern is the over-confident middle class: individuals who have learned some things about finance, have some disposable income, and have enough experience with AI to use its output as the basis for making decisions without questioning it.
“More than one-third of all consumers across various demographics are currently using AI to help guide their investment choices. This is not a measure of adoption. It is a liability gap waiting to evolve into a consumer harm incident. Regulated advisers assume responsibility for the advice they provide. Currently, AI tools do not.
“Fintech founders need to stop referring to this as democratisation. Embedding AI-driven guidance within a product that manages actual money without providing the protections afforded by a regulated adviser is nothing short of regulatory arbitrage dressed up as innovation.”
Carl Hazeley, CEO, Finimize
“AI-generated financial advice done well should be treated like an analytical co-pilot. It can help investors understand concepts, model scenarios, simplify jargon, and stress-test ideas before they act. It can lower the intimidation barrier that often stops people engaging with investing or financial planning in the first place.
“But decisions that affect your financial future should never rely on a single source, human or machine. Where the risk comes in is when guidance starts to feel like certainty. AI can present information in a very confident, polished way, even when the underlying logic is incomplete, generic, or simply wrong.
“We are way past thinking that technology is neutral, especially AI. The moment a tool begins shaping how people save, invest, borrow or assess risk, its creators can no longer avoid responsibility. That means being clear about what the tool can and cannot do. It means avoiding the temptation to overstate personalisation or predictive power. And it means designing products that encourage users to question, not just accept. The standard cannot just be ‘is it impressive?’ It has to be ‘is it responsible?’”
Artur Szablowski, Editor-in-Chief, Finonity
“About a third of our readers, maybe more, come to us after they already got an answer from ChatGPT or Claude about some market situation. They come to verify. Think about what that means for a second. People are already making financial decisions based on what a chatbot told them and then checking afterwards whether it was right. That’s backwards.
“The democratisation part is real. Someone in Lagos or Ho Chi Minh City who could never afford a Bloomberg terminal can now ask an AI about interest rate policy or what a yield curve inversion means. Five years ago that person had nothing; now they have something.
“The problem is that something sounds incredibly confident while being incomplete. AI will explain what a bond is perfectly. It will not tell you about the liquidity risk in your specific market, the tax treatment in your jurisdiction, or the fact that your local broker charges a 3% spread that eats the entire trade. The answer reads like it came from a textbook because it literally did. Real financial decisions happen in the messy details that textbooks leave out.
“And nobody is responsible when it goes wrong. If I publish bad analysis, my name is on it. If a licensed adviser gives a terrible recommendation, they can lose their licence. If ChatGPT says buy Nvidia at the top and someone listens, who answers for that? Nobody. There is no accountability layer and there won’t be one anytime soon, because regulators are still figuring out how to regulate the last generation of fintech.”
Leigh Coney, Founder and Principal Consultant, WorkWise Solutions
“Fidelity just launched an AI tool called Freya that answers personal finance questions but, their words, ‘makes very clear its responses are not advice.’ You built a product that walks, talks, and looks exactly like financial advice, and then you put a disclaimer at the bottom. That’s a legal strategy. It’s not accountability.
“There’s a gap right now where registered advisers have fiduciary obligations, carry insurance, and face regulatory consequences when things go wrong. AI tools that do functionally the same thing operate in a totally different world. No fiduciary duty. No insurance. No one to call when the recommendation was wrong for your situation. And the consumer has no idea that distinction exists.
“I think fintech founders need to sit with a specific scenario and be honest about it. A 62-year-old retiree uses your AI product. Follows its portfolio suggestion. The allocation was wrong for someone in their situation, and they lose a chunk of their retirement savings. Your terms of service say you’re not liable. Your marketing implied your product was trustworthy and personalised. Who picks up the pieces? Nobody. That’s the current answer. Nobody.
“The adviser shortage is coming. McKinsey projects something like 100,000 financial advisers short by 2034. AI will fill part of that gap whether the industry likes it or not. But if founders treat this as a growth opportunity with disclaimed risk, we’re going to see real consumer harm, followed by heavy regulation, followed by years of rebuilding trust.”
Jacob Bennett, Co-Founder and CEO, Crux Analytics
“There is a meaningful difference between using AI to understand your financial options and using it to make financial decisions, and I think we are conflating the two in ways that will eventually hurt people.
“For a business owner trying to figure out whether a specific loan makes sense, or how rising rates affect their borrowing capacity, AI can be a useful starting point. It can surface information, explain terminology, and help you ask better questions before you walk into a bank. That is real value.
“But the moment you move from understanding to deciding, context matters enormously. AI can process information at scale, but it cannot weigh your specific circumstances, your cash flow history, or the nuances of your local market. The democratisation argument is appealing, but access to information and access to good advice are not the same thing. The risk is not that people will start asking AI financial questions. The risk is that they will stop there.”
Paul Lodder, VP of Accounting and Product Strategy, Dext
“General-purpose AI tools such as ChatGPT are increasingly being used for tax guidance, but they were never built for that role. These systems generate answers based on patterns in training data, not by understanding a specific business’s finances or applying tax rules to a real situation. That distinction matters more than many people realise.
“Across the country, accountants and bookkeepers are starting to see the repercussions play out. A business arrives with a tax position, expense decision or VAT approach that began life as a chatbot prompt. On screen it can look neat and convincing, but once a professional digs into it, the cracks quickly appear. When that happens, it’s the professional who has to unravel the advice and work out what actually complies with HMRC rules.
“With Making Tax Digital for Income Tax only weeks away, the pressure is about to ramp up. Thousands of sole traders and landlords are preparing for more frequent reporting and new compliance demands, and in that environment the temptation to ask ChatGPT for quick tax guidance will only grow. AI definitely still has a place in modern finance, but general-purpose LLMs should not be mistaken for tax advisers. What’s needed now are clearer guidelines and firmer guardrails around how these tools are used when financial advice is involved.”
