# EU AI Act

The EU AI Act (Regulation (EU) 2024/1689) establishes risk-based regulatory requirements for AI systems in the European Union.
## Overview

The EU AI Act categorizes AI systems into four risk tiers (a small classification sketch follows the list):

- Unacceptable risk: prohibited outright
- High risk: subject to strict requirements
- Limited risk: subject to transparency obligations
- Minimal risk: no specific requirements
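As a rough illustration of how these tiers might be represented in tooling, here is a minimal Python sketch. The `RiskTier` enum and `required_obligations` helper are illustrative names only, not part of GLACIS or of the Act itself.

```python
from enum import Enum


class RiskTier(Enum):
    """The four risk tiers defined by the EU AI Act."""
    UNACCEPTABLE = "unacceptable"   # prohibited practices (Article 5)
    HIGH = "high"                   # strict requirements (Articles 8-15)
    LIMITED = "limited"             # transparency obligations (Article 50)
    MINIMAL = "minimal"             # no specific requirements


def required_obligations(tier: RiskTier) -> str:
    """Return a short description of what each tier demands."""
    return {
        RiskTier.UNACCEPTABLE: "Prohibited; may not be placed on the EU market.",
        RiskTier.HIGH: "Must meet the high-risk requirements (risk management, "
                       "data governance, documentation, logging, transparency, "
                       "human oversight).",
        RiskTier.LIMITED: "Must meet transparency obligations (e.g. disclosing "
                          "that the user is interacting with an AI system).",
        RiskTier.MINIMAL: "No specific obligations under the Act.",
    }[tier]


print(required_obligations(RiskTier.HIGH))
```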
## High-Risk Requirements

GLACIS helps demonstrate compliance with the requirements that apply to high-risk AI systems (an illustrative evidence-tracking sketch follows the table):

| Article | Requirement | GLACIS Support |
|---|---|---|
| 9 | Risk management | Impact assessments |
| 10 | Data governance | Data quality controls |
| 11 | Technical documentation | System registry |
| 12 | Record-keeping | Attestation logs |
| 13 | Transparency | User documentation |
| 14 | Human oversight | Oversight controls |
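One way to operationalize this table is as a machine-readable mapping from articles to evidence sources, so that gaps can be flagged automatically. This is a minimal sketch that mirrors the rows above; the snake_case source names and the `missing_evidence` helper are illustrative, not an actual GLACIS data model.

```python
# EU AI Act article -> (requirement, GLACIS evidence source), mirroring the table above.
HIGH_RISK_EVIDENCE = {
    9:  ("Risk management",         "impact_assessments"),
    10: ("Data governance",         "data_quality_controls"),
    11: ("Technical documentation", "system_registry"),
    12: ("Record-keeping",          "attestation_logs"),
    13: ("Transparency",            "user_documentation"),
    14: ("Human oversight",         "oversight_controls"),
}


def missing_evidence(collected: set[str]) -> list[int]:
    """Return the articles whose mapped evidence source has not been collected yet."""
    return [
        article
        for article, (_, source) in HIGH_RISK_EVIDENCE.items()
        if source not in collected
    ]


# Example: evidence gathered so far covers Articles 9 and 12 only.
print(missing_evidence({"impact_assessments", "attestation_logs"}))  # -> [10, 11, 13, 14]
```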
## Mapping to ISO 42001

The table below maps EU AI Act articles to their closest ISO/IEC 42001 Annex A controls (a combined crosswalk sketch follows the table):

| EU AI Act Article | ISO 42001 Control | GLACIS Feature |
|---|---|---|
| Article 9 | A.5 | Impact assessments |
| Article 10 | A.7 | Data management |
| Article 11 | A.4 | AI system registry |
| Article 12 | A.6.2.6 | Attestation service |
| Article 14 | A.9.3 | Human oversight |
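Where both frameworks are tracked at once, the two tables can be merged into a single crosswalk so that one attestation is tagged with an EU AI Act article and its ISO 42001 control together. The structure below is a hedged sketch built from the rows above, not a GLACIS export format.

```python
from typing import NamedTuple


class CrosswalkRow(NamedTuple):
    eu_ai_act_article: int
    iso_42001_control: str
    glacis_feature: str


# Crosswalk taken directly from the mapping table above.
CROSSWALK = [
    CrosswalkRow(9,  "A.5",     "Impact assessments"),
    CrosswalkRow(10, "A.7",     "Data management"),
    CrosswalkRow(11, "A.4",     "AI system registry"),
    CrosswalkRow(12, "A.6.2.6", "Attestation service"),
    CrosswalkRow(14, "A.9.3",   "Human oversight"),
]


def iso_control_for(article: int) -> str | None:
    """Look up the ISO 42001 control mapped to an EU AI Act article, if any."""
    return next(
        (row.iso_42001_control for row in CROSSWALK if row.eu_ai_act_article == article),
        None,
    )


print(iso_control_for(12))  # -> "A.6.2.6"
```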
## Using GLACIS for the EU AI Act

1. Run the Certification Wizard and select EU AI Act.
2. Identify your high-risk AI systems.
3. Map each system to the applicable requirements.
4. Collect evidence via attestations.
5. Generate compliance reports (a standalone sketch of this workflow follows the list).
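To make the workflow concrete, the sketch below models it as a small self-contained progress tracker. None of this is GLACIS code; the class and method names are invented for illustration, and the real Certification Wizard handles these steps inside the product.

```python
from dataclasses import dataclass, field


@dataclass
class CertificationRun:
    """Illustrative tracker for the five steps above; not a GLACIS API."""
    framework: str = "EU AI Act"
    high_risk_systems: list[str] = field(default_factory=list)
    # (system, article) -> evidence artifact
    evidence: dict[tuple[str, int], str] = field(default_factory=dict)

    def register_high_risk(self, system: str) -> None:
        # Step 2: identify high-risk AI systems.
        self.high_risk_systems.append(system)

    def attest(self, system: str, article: int, artifact: str) -> None:
        # Step 4: collect evidence via attestations against specific articles.
        self.evidence[(system, article)] = artifact

    def report(self) -> dict:
        # Step 5: summarize coverage for the compliance report.
        covered = {article for (_, article) in self.evidence}
        return {
            "framework": self.framework,
            "systems": self.high_risk_systems,
            "articles_with_evidence": sorted(covered),
        }


run = CertificationRun()                                          # Step 1: start an EU AI Act run
run.register_high_risk("credit_scoring_model")                    # Step 2
run.attest("credit_scoring_model", 9, "impact_assessment.pdf")    # Step 4
run.attest("credit_scoring_model", 12, "runtime_logs.jsonl")      # Step 4
print(run.report())                                               # Step 5
```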