How we have helped real organisations understand their exposure to the EU AI Act and take action ahead of the deadline.
An international startup accelerator, with European offices in Italy, Germany, and the Netherlands, was using an AI chatbot to answer questions from founders during the application process. The system was classified as limited risk. Then came a request to add an automated candidate scoring feature. That single addition would have shifted the entire system into the EU AI Act's high-risk category, triggering a completely different set of technical and legal obligations.
A representative scenario: a European FinTech SME deploying a third-party credit scoring system. The most common misconception here is that AI Act obligations rest solely with the software vendor. In fact, the deployer has its own distinct set of obligations, including human oversight, log retention, a Fundamental Rights Impact Assessment (FRIA), and client disclosure, entirely independent of the provider's.
Tell us about your situation. The first call is free.
Book a free call →