The FCA’s AI Regulatory Framework in 2026: What Principles-Based Really Means

Deliberate permissiveness — and deliberate ambiguity

The FCA's approach to artificial intelligence in UK financial services is deliberate, carefully argued, and — for compliance teams — somewhat frustrating in its lack of specificity.

In December 2025, FCA CEO Nikhil Rathi reaffirmed that the regulator "will not introduce AI-specific rules." (Lexology) Instead, the FCA is relying on its existing framework — principally Consumer Duty, the Senior Managers and Certification Regime (SM&CR), and operational resilience requirements — to govern AI use. The regulator believes, not without justification, that technology-neutral rules are more durable than technology-specific ones, and that the principles already on the books are sufficient to address the risks AI presents.

For firms, this creates both opportunity and genuine uncertainty. The opportunity: there is no prescriptive list of prohibited AI applications, no mandatory approval process for AI tools, and no AI-specific reporting requirements. The uncertainty: when something goes wrong, firms must demonstrate that their existing obligations were met — without the comfort of a specific rule they can point to as evidence of compliance.


The Mills Review: a three-stage map of AI in retail finance

In January 2026, Executive Director Sheldon Mills launched a long-term review of AI in retail financial services — the most significant FCA initiative in this space since Consumer Duty itself. (FCA)

The review's framing is particularly useful for understanding where AI is heading and what regulatory risk looks like at each stage. Mills identified three categories of AI application in financial services:

1. Assistive AI — Tools that support human advisers in tasks like report writing, compliance checking, or client communication. This category is already widespread. Products like AdvisoryAI and Aveni Assist fall here. The human remains the decision-maker; AI produces inputs. Regulatory risk is lowest.

2. Advisory AI — Systems capable of generating personalised guidance for consumers, potentially without human sign-off on each recommendation. This is the emerging frontier. Robo-advisor platforms with personalised portfolio construction capabilities sit here. Regulatory risk is moderate to high, depending on whether the output constitutes regulated advice.

3. Autonomous AI — Fully automated systems making and executing regulated decisions without human intervention. This is the horizon case. It is not yet widespread, but the Better.com mortgage engine, which reportedly approved loans "in seconds" through ChatGPT (National Mortgage Professional), suggests the timeline may be shorter than expected.

Recommendations from the Mills Review are due in summer 2026. (FCA) For firms in the Advisory and Autonomous categories, those recommendations will be highly consequential.


How Consumer Duty applies to AI

Consumer Duty is the most practically important regulatory framework for AI in UK financial services, and it creates obligations that AI can help firms meet — but also new risks if AI is implemented poorly.

The Duty requires firms to deliver good outcomes for retail customers across four areas: products and services, price and value, consumer understanding, and consumer support. For AI applications:

Products and services: AI systems used in product design or suitability assessment must not embed biases that systematically disadvantage any customer segment. Algorithmic bias in credit scoring or investment risk profiling is a live FCA concern.

Consumer understanding: AI-generated communications — including suitability reports, simplified advice summaries, or chatbot responses — must meet the same clarity standards as human-produced ones. Hallucinations or confident but incorrect AI outputs are an obvious failure mode here.

Consumer support: AI tools used in client communication must be able to identify and appropriately escalate situations involving vulnerable customers. The FCA and ICO are preparing joint guidance on AI and vulnerable customers, expected in early 2026. (Voyc AI)

Monitoring: Consumer Duty requires firms to monitor customer outcomes on an ongoing basis. AI-powered conversation intelligence tools — capable of analysing every client touchpoint — make 100% coverage feasible where manual review typically sampled around 5% of interactions. (IFA Magazine) The irony is that deploying AI at scale creates a monitoring burden that only AI can cost-effectively meet.
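The shift from manual sampling to full coverage can be illustrated with a minimal sketch. Everything here is hypothetical — the `Interaction` record, the `RISK_MARKERS` keyword list, and the `monitor` function are stand-ins for what a real conversation-intelligence pipeline would do with a trained model rather than keyword matching:

```python
import random
from dataclasses import dataclass

@dataclass
class Interaction:
    client_id: str
    transcript: str

# Hypothetical outcome-risk markers; a real tool would use a trained model.
RISK_MARKERS = ("i don't understand", "can't afford", "complaint")

def flag_interaction(interaction: Interaction) -> bool:
    """Return True if the transcript contains an outcome-risk marker."""
    text = interaction.transcript.lower()
    return any(marker in text for marker in RISK_MARKERS)

def monitor(interactions, sample_rate=1.0, seed=0):
    """Review a random sample of interactions.

    sample_rate=0.05 approximates historical manual sampling;
    sample_rate=1.0 is the full coverage AI tooling makes feasible.
    """
    rng = random.Random(seed)
    reviewed = [i for i in interactions if rng.random() < sample_rate]
    flagged = [i.client_id for i in reviewed if flag_interaction(i)]
    return len(reviewed), flagged

interactions = [
    Interaction("C1", "Thanks, that all makes sense."),
    Interaction("C2", "I don't understand these charges at all."),
    Interaction("C3", "I want to raise a complaint about the delay."),
]

reviewed, flagged = monitor(interactions, sample_rate=1.0)
print(reviewed, flagged)  # full coverage reviews all 3; C2 and C3 are flagged
```

The point of the sketch is the `sample_rate` parameter: at 0.05, most poor-outcome conversations are simply never seen, which is the gap full-coverage tooling closes.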


SM&CR and the accountability gap

The Senior Managers and Certification Regime raises a critical question for AI deployment: who is the senior manager accountable for AI decisions?

SM&CR requires named individuals to hold prescribed responsibilities for key business functions. When an AI system contributes to a regulated outcome — a suitability determination, a credit decision, a client communication — there must be a human senior manager who owns that system's outputs. This cannot be delegated to the technology itself.

For firms using third-party AI tools, this creates an important procurement question. If the AI vendor's system produces a flawed output that leads to poor client outcomes, the senior manager responsible for that tool's deployment remains accountable to the FCA. Vendor contracts must clearly allocate responsibility, data access rights must be maintained for audit purposes, and firms must be able to override or escalate away from AI recommendations in real time.
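The governance pattern described above — a named senior manager owning every AI output, with an override path and an auditable record — can be sketched as a thin wrapper around a vendor model. This is an illustration, not an implementation: `accountable_decision`, the `vendor_model` stand-in, the escalation `rule`, and the "SMF24 holder" label are all invented for the example:

```python
import datetime
from typing import Callable

AUDIT_LOG = []  # in practice, an append-only store retained for FCA audit

def accountable_decision(
    ai_recommend: Callable[[dict], str],
    case: dict,
    senior_manager: str,
    needs_human: Callable[[dict, str], bool],
) -> str:
    """Run a vendor AI recommendation under a named senior manager,
    with a human-escalation path and an audit entry for every output."""
    recommendation = ai_recommend(case)
    escalated = needs_human(case, recommendation)
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "case_id": case["id"],
        "recommendation": recommendation,
        "owner": senior_manager,   # accountability sits with a person, not the tool
        "escalated_to_human": escalated,
    })
    return "pending-human-review" if escalated else recommendation

# Illustrative stand-ins for a vendor model and an escalation rule.
vendor_model = lambda case: "approve" if case["score"] > 600 else "decline"
rule = lambda case, rec: rec == "decline" or case.get("vulnerable", False)

print(accountable_decision(vendor_model, {"id": "A1", "score": 710}, "SMF24 holder", rule))
# -> approve
print(accountable_decision(vendor_model, {"id": "A2", "score": 540}, "SMF24 holder", rule))
# -> pending-human-review
```

The design choice worth noting is that the audit entry is written before any decision is returned, so the trail survives even when the AI output is overridden — which is what makes the record usable in a regulatory review.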

BCLP's analysis of the FCA's approach describes the challenge as "turning principles into practice" — a gap that the Mills Review recommendations are expected to begin closing. (BCLP)


Operational resilience: the risk firms are underestimating

The FCA's operational resilience framework — which came into full effect in March 2025 — requires firms to identify their important business services and demonstrate they can remain within impact tolerances during severe disruption.

AI systems that become critical to core operations — suitability report production, client communication, compliance monitoring — become important business services under this framework. If the AI platform goes down, the firm must have a resilience plan. If the AI vendor is acquired or shut down, the firm must be able to continue operating. Concentration risk in AI tool adoption is a real regulatory concern that most firms have not yet properly stress-tested.


What the FCA's stance means for different firm types

The practical effect of principles-based AI regulation differs significantly by firm size and complexity:

Large firms with legal and compliance teams are well-placed to interpret principles-based regulation and build bespoke governance frameworks. They have the resource to conduct AI audits, maintain oversight logs, and adapt rapidly when guidance evolves.

SME advice firms face a harder challenge. Without dedicated compliance resource, interpreting how Consumer Duty applies to a specific AI tool, or how to structure SM&CR accountability for AI decisions, requires external guidance they often cannot easily access or afford. YouGov research found that 49% of non-AI-adopters in B2B professional services cite data privacy concerns as the top barrier. (YouGov) For many SMEs, this is really a proxy for regulatory uncertainty — they do not know what is safe.


Upcoming regulatory milestones: the 2026 calendar

  • Early 2026: FCA-ICO joint guidance on AI and vulnerable customers
  • April 2026: SRA-commissioned research on AI and legal professional conduct (relevant for dual-regulated firms)
  • Summer 2026: Mills Review recommendations published
  • Ongoing: AI Airlock regulatory sandbox (MHRA Phase 2 — relevant for health-adjacent firms)

Firms across financial services should treat the Mills Review recommendations as the next significant compliance planning trigger. Even if the FCA does not create AI-specific rules, the Review's framing of what responsible AI use looks like in an Advisory or Autonomous context will define best practice — and, by extension, FCA expectations.


Key statistics at a glance

  • FCA confirmed no AI-specific rules — existing Consumer Duty, SM&CR frameworks apply (Lexology)
  • Sheldon Mills Review launched January 2026, recommendations due summer 2026 (FCA)
  • Three FCA AI categories: Assistive, Advisory, Autonomous (FCA)
  • AI enables 100% Consumer Duty monitoring vs previous 5% manual sampling (IFA Magazine)
  • 49% of SME non-adopters cite data privacy / regulatory uncertainty as the primary barrier (YouGov)
  • FCA-ICO joint guidance on AI and vulnerable customers expected early 2026 (Voyc AI)
