Documenting EU AI Act Compliance: A Call for High-Risk
and GPAI Use Cases
Regulation (EU) 2024/1689 (the Artificial Intelligence Act)
classifies AI use by risk level and imposes documentation, auditing, and
process requirements on providers and deployers of AI systems. In particular,
providers of ‘high-risk’ AI systems must undergo a rigorous conformity assessment
and provide technical documentation demonstrating the AI system’s compliance with
the AI Act’s requirements before placing it on the European single market.
ETSI Background
With the publication of TR 104 119 “Documentation of
AI-enabled Systems”, ETSI has already established a foundational documentation
approach for demonstrating AI Act compliance, supporting transparency,
accountability, and quality assurance across the lifecycle of AI-enabled
systems.
Project Objective
ETSI TC MTS / WG AI has recently started a project to
further develop Technical Report TR 104 119. One objective of the
project is to validate the documentation approach by creating documentation for
selected use cases of high-risk AI systems and GPAI models.
Call to Action
In this project,
developers, providers, and deployers are invited to contribute their use cases
along with the corresponding requirements and, in return, receive EU AI
Act–compliant documentation. We are especially interested in use cases
from highly regulated domains, so that the project can consider
overlaps and synergies with documentation requirements arising from other
regulatory frameworks, such as cybersecurity, product safety, and data
protection regulations.
Participation offers the opportunity to:
· assess the applicability of ETSI documentation guidance to real-world systems,
· identify gaps or ambiguities in current documentation practices,
· and contribute practical feedback to the evolution of ETSI technical specifications.
Interested parties are invited to submit their responses via this questionnaire.