Concept · Governance · AI · Product · Security

AI Act

EU regulatory framework for AI systems that sets requirements for risk assessment, transparency and governance.

Classification

  • Established
  • High
  • Organizational
  • Intermediate

Technical context

  • Internal QA and audit tools
  • Ticketing and governance platforms
  • Data pipelines and data catalogs

Principles & goals

  • Risk-based approach: stricter rules for higher-risk systems.
  • Transparency: traceability for users and auditors.
  • Accountability: clear roles for providers and users.

Use cases & scenarios

Trade-offs

  • Missing or incorrect documentation may lead to sanctions.
  • Overly complex compliance processes may hinder innovation.
  • Inconsistent interpretation across authorities complicates planning.

Mitigations

  • Early involvement of legal, compliance and engineering in the product lifecycle.
  • Automated documentation of data and model decisions.
  • Regular audits and feedback loops to adapt measures.
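The "automated documentation" mitigation above can be sketched as a minimal model-card generator. The fields and example values below are illustrative assumptions, not a format mandated by the AI Act:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ModelCard:
    """Minimal, illustrative record of a model, its data lineage and decisions."""
    model_name: str
    version: str
    intended_use: str
    training_data_sources: list   # provenance: where the training data came from
    risk_class: str               # e.g. "high" per an internal assessment
    decisions: list = field(default_factory=list)

    def log_decision(self, summary: str, rationale: str) -> None:
        # Append an auditable record of a design or data decision.
        self.decisions.append({"summary": summary, "rationale": rationale})

    def to_json(self) -> str:
        # Serialize the card so it can be versioned alongside the model.
        return json.dumps(asdict(self), indent=2)

card = ModelCard(
    model_name="credit-scoring",
    version="1.2.0",
    intended_use="Internal credit risk estimation",
    training_data_sources=["loans_2019_2023.parquet"],
    risk_class="high",
)
card.log_decision("Dropped postcode feature", "Proxy for protected attributes")
print(card.to_json())
```

Emitting such a card from the training pipeline itself, rather than writing it by hand afterwards, is what keeps the documentation in sync with the model.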

I/O & resources

  • Product and system descriptions
  • Data schemas and provenance
  • Model and training documentation
  • Compliance reports and technical documentation
  • Governance policies and role descriptions
  • Monitoring and audit plans

Description

The AI Act is the EU regulatory framework for governing AI systems. It defines risk categories, requirements for transparency, oversight and governance, and obligations for providers and users. The concept helps organisations assess compliance risks and implement legal requirements across products, services and development lifecycles.

Benefits

  • Improved protection of citizens' rights and safety.
  • Clarity for providers regarding obligations and evidence paths.
  • Promotion of trustworthy innovation in the market.

Limitations

  • Regional scope: primarily focused on the EU market.
  • Ambiguities in technical requirements may exist.
  • Operational implementation can be resource-intensive for small providers.

Metrics

  • Share of compliant products

    Percentage of products meeting the requirements.

  • Time to compliance implementation

    Average time from identification to implementation of required measures.

  • Number of documented audits

    Count of successfully completed internal or external audits.
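The three metrics above can be computed from simple per-product compliance records. The record fields here are assumptions for illustration, not a prescribed schema:

```python
from statistics import mean

# Hypothetical per-product compliance records (fields are illustrative).
products = [
    {"name": "chatbot", "compliant": True, "days_to_compliance": 45, "audits": 2},
    {"name": "scoring", "compliant": False, "days_to_compliance": None, "audits": 1},
    {"name": "search", "compliant": True, "days_to_compliance": 30, "audits": 3},
]

# Share of compliant products (percentage).
share_compliant = 100 * sum(p["compliant"] for p in products) / len(products)

# Time to compliance implementation (mean days, measured products only).
time_to_compliance = mean(
    p["days_to_compliance"] for p in products if p["days_to_compliance"] is not None
)

# Number of documented audits (total completed audits).
documented_audits = sum(p["audits"] for p in products)

print(f"{share_compliant:.1f}% compliant, "
      f"{time_to_compliance:.1f} days to compliance, "
      f"{documented_audits} audits")
```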

EU proposal text (example application)

Excerpt from the official proposal illustrating requirements.

Compliance checklist for product teams

Practical example of a team checklist to meet requirements.

Governance template for providers

Template for internal policies to implement obligations.

Implementation steps

  1. Perform initial gap assessment against AI Act requirements.
  2. Determine risk classes for products and processes.
  3. Introduce documentation and monitoring processes.
  4. Adapt contracts and supplier requirements.
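Step 2 above (determining risk classes) can be sketched as a rule-based triage. The AI Act's four tiers (prohibited practices, high-risk, limited/transparency risk, minimal risk) are real, but the attribute names and rules below are simplified assumptions, not the legal test:

```python
# Simplified, illustrative triage against the AI Act's four risk tiers.
# Real classification depends on the Act's annexes and on legal analysis.

PROHIBITED_PRACTICES = {"social_scoring", "subliminal_manipulation"}
HIGH_RISK_DOMAINS = {"credit_scoring", "recruitment", "critical_infrastructure"}

def classify_risk(practice: str, domain: str, interacts_with_humans: bool) -> str:
    if practice in PROHIBITED_PRACTICES:
        return "prohibited"
    if domain in HIGH_RISK_DOMAINS:
        return "high-risk"
    if interacts_with_humans:
        # e.g. chatbots: transparency obligations apply.
        return "limited-risk"
    return "minimal-risk"

print(classify_risk("recommendation", "recruitment", True))  # high-risk
```

In practice the output of such a triage is only a starting point for the documented legal assessment, never a substitute for it.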

⚠️ Technical debt & bottlenecks

  • Lack of automated traceability in the CI/CD pipeline.
  • Legacy models in use without sufficient training documentation.
  • Inconsistent dataset and versioning practices.
  • Lack of data provenance and traceability
  • Limited internal compliance capacities
  • Unclear technical specifications for audits

Common pitfalls

  • Incomplete risk analysis leads to incorrect risk classification.
  • Missing transparency disclosures to users.
  • Documentation prepared only internally and not audit-ready.
  • Overestimating internal compliance capability without external review.
  • Ignoring national implementation guidance despite EU regulation.
  • Unclear responsibilities between product and legal teams.

Required skills

  • Legal and regulatory knowledge in data protection and AI law
  • Technical understanding of models and data flows
  • Risk management and audit competence

Quality attributes

  • Traceability and documentation of decisions
  • Robustness and security of models
  • Clear responsibilities and governance structures

Caveats

  • Scope is limited to EU law, though extraterritorial effects should be considered.
  • Technical standards are still under development.
  • Dependence on interpretation by national authorities.