EU AI Act for FinTech:
Obligations, Deadlines
and How to Prepare

If your company uses artificial intelligence to assess credit, manage financial risk, or automate decisions affecting customers, the EU AI Act applies to you directly. Here's everything you need to know to be ready by August 2026 — with no surprises.

Why the AI Act is urgent for FinTechs

The financial sector is among the most exposed to Regulation EU 2024/1689 — commonly known as the AI Act — for one precise reason: it uses artificial intelligence to make decisions that directly impact people's lives. Granting or denying a loan, calculating insolvency risk, determining a customer's insurance premium. These are high-impact decisions, and the AI Act treats them as such.

The most common mistake we see in FinTechs is thinking the problem only concerns those who build AI algorithms from scratch. That's wrong. If you use third-party software for credit scoring, you are still subject to obligations — as a deployer. And the obligations from August 2026 are anything but trivial.

⚠️ Warning

The 2 August 2026 deadline is non-negotiable. Companies that arrive unprepared risk significant fines and, above all, being unable to continue operating their AI systems until they achieve compliance.

Who is affected: provider, deployer and the fine line between them

The AI Act distinguishes two fundamental roles, with very different obligations:

| Role | Who they are | FinTech examples | Obligation level |
|---|---|---|---|
| Provider | Develops and markets the AI system | Startup selling a credit scoring engine to banks | 🔴 Very high |
| Deployer | Uses a third-party AI system in their business | Bank or FinTech using purchased credit scoring software | 🟠 High |

When a deployer becomes a provider

This is the trap many FinTechs fall into. Under Art. 25, you are reclassified as a provider — with all the heavier obligations — if you:

- put your name or trademark on a high-risk AI system already placed on the market;
- make a substantial modification to a high-risk AI system already placed on the market; or
- modify the intended purpose of an AI system in a way that makes it high-risk.

ℹ️ Practical case

A FinTech buys a scoring model from a provider, then re-trains it on their own customer data to improve accuracy. This fine-tuning is very likely a "substantial modification" that transforms the FinTech from deployer to provider, with all the resulting certification obligations.

Which FinTech AI systems are high-risk

Annex III of Regulation EU 2024/1689 explicitly lists high-risk AI systems in the financial sector. Here they are with concrete examples:

| AI system | Concrete examples | Classification |
|---|---|---|
| Credit scoring | Algorithms to assess creditworthiness of individuals, BNPL scoring, mortgage pre-approval | 🔴 High risk |
| Creditworthiness assessment | Systems to determine repayment capacity, automated debt-to-income analysis | 🔴 High risk |
| Insurance risk scoring | Personalised insurance pricing algorithms, real-time premium calculation | 🔴 High risk |
| Fraud detection | Real-time fraud detection on transactions | 🟡 Depends on use — verify |
| Algorithmic trading | Automated order execution, HFT algorithms | 🟡 Depends — verify with DORA |
| Customer service chatbots | AI answering questions on financial products | 🟢 Limited risk (transparency only) |
| Document processing | OCR and AI for processing bank statements, payslips | 🟢 Minimal risk |
⚠️ Watch out for fraud detection

Fraud detection systems are not automatically high-risk, but may become so if they produce automated decisions that restrict access to financial services (e.g. automatic account blocking). Each case must be assessed individually.
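The triage logic in the table above can be sketched as a simple lookup. This is an illustrative simplification for an internal first pass, not legal classification: the use-case names, tier labels and the `RISK_BY_USE_CASE` mapping are our own, not terms from the Regulation.

```python
# Indicative mapping of FinTech AI use cases to AI Act risk tiers,
# mirroring the table above. For triage only: "depends" and "unknown"
# always require case-by-case legal assessment.
RISK_BY_USE_CASE = {
    "credit_scoring": "high",
    "creditworthiness_assessment": "high",
    "insurance_risk_scoring": "high",
    "fraud_detection": "depends",      # high if it restricts service access
    "algorithmic_trading": "depends",  # check interplay with DORA
    "customer_chatbot": "limited",     # transparency obligations only
    "document_processing": "minimal",
}

def triage(use_case: str) -> str:
    """Return the indicative risk tier, or flag for manual review."""
    return RISK_BY_USE_CASE.get(use_case, "unknown: review manually")

print(triage("credit_scoring"))   # high
print(triage("fraud_detection"))  # depends
```

A real inventory would attach this tier to each system in your AI register, then route anything marked "high" or "depends" to a formal assessment.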

Deadlines: what is already in force today

Many FinTechs think the AI Act is "something for 2026." In reality some deadlines have already passed.

| Date | What enters into force | Status |
|---|---|---|
| 2 Feb 2025 | AI Literacy mandatory for all staff using AI. Prohibited practices banned. | ✅ Already in force |
| 2 Aug 2025 | GPAI model obligations. Penalties active. EU governance regime operational. | ✅ Already in force |
| 2 Aug 2026 | Full enforcement for high-risk systems: credit scoring, creditworthiness, risk scoring. | ⏳ ~6 months away |
| 2 Aug 2027 | AI systems already on the market before August 2025 (legacy systems). | ⏳ ~18 months away |
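The milestones in the table can be tracked programmatically. A minimal sketch (the milestone labels and the `status` helper are our own, only the dates come from the Regulation):

```python
from datetime import date

# Key AI Act milestones from the table above.
DEADLINES = {
    "AI literacy + prohibited practices": date(2025, 2, 2),
    "GPAI obligations + penalties": date(2025, 8, 2),
    "High-risk full enforcement": date(2026, 8, 2),
    "Legacy systems (pre-Aug 2025)": date(2027, 8, 2),
}

def status(today: date) -> dict:
    """Label each milestone as already in force, or show days remaining."""
    return {
        name: "in force" if d <= today else f"{(d - today).days} days left"
        for name, d in DEADLINES.items()
    }

for name, s in status(date(2026, 2, 1)).items():
    print(f"{name}: {s}")
```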
⚠️ Already behind?

If your FinTech has not yet launched an AI Literacy programme for employees using AI systems, you have been out of compliance since the 2 February 2025 deadline. This is the first obligation to fix — and the simplest to implement.

Concrete obligations for FinTechs

Obligations already in force (for all)

AI Literacy — Art. 4: every company must ensure that staff using AI systems have the necessary competence to understand the capabilities, limitations and risks of those systems. This is not deep technical training: it is operational awareness. It must be calibrated by role — different for those using the tool operationally, for managers, and for the C-suite.

Obligations from August 2026 for deployers (using third-party systems)

Under Art. 26–27, deployers of high-risk AI systems must, among other things:

- Use the system in accordance with the provider's instructions for use.
- Assign human oversight to people with the necessary competence, training and authority.
- Ensure input data is relevant and sufficiently representative for the system's intended purpose.
- Retain the logs automatically generated by the system, for at least six months where under their control.
- Inform customers when they are subject to decisions made or supported by a high-risk AI system.
- Carry out a Fundamental Rights Impact Assessment (FRIA) where required, notably for credit scoring and insurance risk pricing.
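One recurring deployer duty is retaining the logs of automated decisions. A minimal sketch of an append-only decision audit log with a retention check; all field names here are our own illustration, nothing in this snippet is mandated wording or format from the Regulation:

```python
import json
from datetime import datetime, timedelta, timezone

# Minimum retention window: at least six months (illustrative constant).
RETENTION = timedelta(days=183)

def log_decision(path: str, system_id: str, decision: str, reviewer: str) -> dict:
    """Append one automated-decision record to a JSON-lines audit log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system_id": system_id,     # which AI system produced the output
        "decision": decision,       # e.g. "loan_denied"
        "human_reviewer": reviewer, # who exercised oversight
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record

def past_retention(record: dict, now: datetime) -> bool:
    """True once a record is older than the minimum retention window."""
    ts = datetime.fromisoformat(record["timestamp"])
    return now - ts > RETENTION
```

In practice you would write these records to tamper-evident storage and align the retention policy with DORA's audit-trail requirements rather than a local file.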

Obligations from August 2026 for providers (who develop AI): the complete list

Continuous risk management system: not a static document — an active process throughout the entire lifecycle of the system.

Complete technical documentation: model architecture, training data, performance metrics, known limitations, intended and unintended use cases.

Data quality and governance: datasets must be documented, representative, and free of unjustified bias.

Pre-deployment conformity assessment: for high-risk systems, a formal compliance process before placing the system on the market (self-assessment or third party).

Registration in EU database: mandatory before commercialisation.

Post-market monitoring: systematic collection of data on real-world system performance, with documented update plans.

CE marking and EU declaration of conformity: both required for high-risk systems before placing them on the market.

Penalties

The AI Act provides a proportionate penalty system with three tiers:

| Violation | Maximum penalty | % of global turnover |
|---|---|---|
| Prohibited practices (Art. 5) | €35,000,000 | 7% |
| High-risk system obligations | €15,000,000 | 3% |
| False information to authorities | €7,500,000 | 1% |

The higher value between the fixed amount and the percentage applies. For SMEs, the lower value always applies. A FinTech with €3M turnover faces up to €90,000 for non-compliance on high-risk systems — not catastrophic, but enough to make compliance a rational investment.
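The dual-cap rule can be expressed as a small helper that reproduces the worked example above (the function name and signature are our own illustration):

```python
def applicable_fine(tier_cap_eur: float, tier_pct: float,
                    global_turnover_eur: float, is_sme: bool = False) -> float:
    """Maximum fine under the AI Act's dual cap.

    Normally the HIGHER of the fixed cap and the turnover percentage
    applies; for SMEs the LOWER of the two applies instead.
    """
    pct_amount = tier_pct * global_turnover_eur
    return min(tier_cap_eur, pct_amount) if is_sme else max(tier_cap_eur, pct_amount)

# Worked example from the text: an SME FinTech with €3M turnover,
# high-risk tier (€15M cap / 3% of turnover).
print(applicable_fine(15_000_000, 0.03, 3_000_000, is_sme=True))  # 90000.0
```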

ℹ️ DORA and AI Act together

FinTechs must consider the AI Act alongside the Digital Operational Resilience Act (DORA), fully applicable since January 2025. The two regulations overlap on audit trails, incident reporting and ICT risk management. Integrated compliance is more efficient than two separate tracks.

Checklist: are you already compliant?

Use this checklist for a quick initial self-assessment. It does not replace a professional audit, but gives you an immediate picture of your exposure level.

✅ Obligations already in force (Feb 2025)

☐ AI Literacy programme launched, calibrated by role (operational staff, managers, C-suite)
☐ Verified that none of your systems fall under the prohibited practices of Art. 5

⏳ To prepare for August 2026 (deployer)

☐ Inventory of all AI systems in use, each with a risk classification
☐ Conformity documentation collected from each high-risk system's vendor
☐ Human oversight roles assigned and trained
☐ Log retention process in place
☐ Customer disclosure for automated decisions
☐ FRIA completed where required

⏳ To prepare for August 2026 (provider)

☐ Continuous risk management system active across the lifecycle
☐ Technical documentation complete
☐ Data governance and bias checks documented
☐ Conformity assessment completed
☐ Registration in the EU database
☐ CE marking and EU declaration of conformity
☐ Post-market monitoring plan in place

Not sure where to start?

We offer a free 30-minute initial assessment. In a single call you'll understand your exposure level and the first concrete steps to take.

Book your free assessment

FAQ

Does the AI Act apply to FinTechs using foreign cloud providers?

Yes. The AI Act applies to anyone operating in the European market or whose AI systems produce effects on people in the EU — regardless of where the technology provider is located. Using AWS, Azure or Google Cloud does not exempt you from compliance.

If I use ChatGPT or Claude for internal assistance, am I subject to the AI Act?

For non-critical internal uses (drafting text, summarising documents, customer service support) probably not — or with very limited obligations. But if you use these models to support decisions affecting customers (e.g. generating credit recommendations), the situation changes. Each case must be assessed in its specific context.

My software vendor said they handle compliance. Should I still be concerned?

Yes, at least in part. The vendor (provider) has their own obligations, but the deployer — you — has independent obligations of your own: human oversight, input data quality, log retention, customer disclosure, FRIA. You cannot delegate everything to the supplier.

How much does compliance cost?

It depends on the complexity of your AI systems. For an SME FinTech with 1–3 high-risk AI systems, a full compliance journey (audit, documentation, training, implementation) typically falls between €5,000 and €20,000. A one-off cost, compared to potential fines of tens of thousands of euros.

What is the difference between GDPR and the AI Act?

The GDPR regulates the processing of personal data. The AI Act regulates AI systems as such — regardless of whether they use personal data. The two regulations overlap (a high-risk AI system often also processes personal data) but have distinct obligations and different supervisory authorities. Two separate compliance tracks are needed, even if coordinated.


This article is for informational purposes only and does not constitute legal advice. For specific business decisions, consult a qualified professional. Source: Reg. EU 2024/1689. Updated February 2026.