Free Compliance Assessment

Is your AI deployment EU AI Act ready?

12 questions. Under 5 minutes. Receive a personalized compliance gap report with article-by-article analysis and a remediation roadmap.

No account required · PDF delivered instantly · Built for GC & CCO teams

Ready to assess your compliance posture?

This assessment covers the key provisions of the EU AI Act applicable to regulated-industry deployments: Articles 9, 13, 14, 15, 16–17, and Annex III.

Answer 12 questions about your AI deployment
Get instant article-by-article gap analysis
Receive your PDF report by email immediately
Question 1 of 12

What is your primary industry vertical?

Select the category that best describes your organization's regulated activities.

Question 2 of 12

What is your primary AI use case?

The use case determines which Annex III categories apply to your deployment.

Question 3 of 12

Does your organization operate within the EU?

The EU AI Act applies to any AI system deployed in the EU, regardless of where the provider is incorporated.

Question 4 of 12

How is your AI model deployed?

Deployment architecture affects conformity assessment obligations and traceability requirements.

Question 5 of 12

Have you self-assessed your system as high-risk under Annex III?

Annex III lists specific high-risk categories, including employment decisions, access to essential services, law enforcement, and the administration of justice.

Question 6 of 12

How do you currently verify AI outputs before they reach end users?

Output verification is central to Articles 9 and 14 of the EU AI Act for high-risk systems.

Question 7 of 12

How do you log AI-generated outputs and errors?

Articles 16 and 17 require providers of high-risk AI systems to keep technical documentation, retain automatically generated logs, and maintain a quality management system.

Question 8 of 12

How do you ground AI outputs in source material?

Source grounding is critical for Article 15 accuracy and robustness requirements, and for professional liability in legal and financial contexts.

Question 9 of 12

How frequently do you conduct adversarial testing of your AI system?

Article 9 requires a systematic risk management process including testing for foreseeable misuse and adversarial conditions.

Question 10 of 12

What is your Article 14 human oversight model?

Article 14 requires that high-risk AI systems be designed so that humans can effectively oversee them, and intervene in or interrupt their operation, while the systems are in use.

Question 11 of 12

What Article 13 transparency disclosures do you currently provide?

Article 13 requires high-risk AI systems to be transparent so deployers understand capabilities, limitations, and appropriate use.

Question 12 of 12 — Final step

Where should we send your compliance report?

Your personalized PDF report will be emailed immediately. We will not share your contact details with third parties.

Something went wrong. Please try again or contact us at compliance@octomind-9fce.polsia.app

Your report is on its way

We've generated your personalized EU AI Act compliance gap report and sent it to your email. Check your inbox; it should arrive within 60 seconds.

Download PDF Report

Ready to close these gaps with a Sturna pilot?

Book a 30-minute discovery call →

Not legal advice. This assessment is for informational purposes only. Consult qualified legal counsel for compliance determinations under Regulation (EU) 2024/1689 (EU AI Act). Sturna is an AI verification infrastructure provider, not a law firm.