What Is a Conformity Assessment?

A conformity assessment is the procedure by which a provider of a high-risk AI system demonstrates that the system meets the requirements of the EU AI Act before placing it on the market or putting it into service. It is the AI Act's gatekeeping mechanism for high-risk systems: no high-risk AI system may legally be placed on the EU market or put into service without a completed conformity assessment. The assessment produces a technical documentation package and a formal EU declaration of conformity, both of which must be kept at the disposal of the national competent authorities for at least 10 years after the system is placed on the market or put into service.

Conformity assessments serve two audiences. For the market, they represent the provider's documented assurance that the system has been evaluated against the Act's requirements and found compliant. For regulators, they are the primary source of evidence in any market surveillance investigation. An assessment that is incomplete, inaccurate, or superficial is not merely a documentation deficiency: it is a direct compliance failure that can result in the system being prohibited from the market, recall orders, and significant financial penalties.

Self-Assessment vs Third-Party Assessment

The AI Act provides for two types of conformity assessment, and which applies depends on the nature of the high-risk AI system. For most Annex III high-risk AI systems, the provider may conduct a self-assessment: an internal procedure conducted against the Act's requirements using the provider's own technical staff and processes, without involvement of a notified body. The provider issues a declaration of conformity on the basis of this self-assessment.

Third-party conformity assessment by a notified body is mandatory in two situations: for the biometric systems listed in Annex III where the provider has not applied harmonised standards (or common specifications) in full, and for AI systems that are safety components of products already subject to third-party assessment under existing EU product safety legislation. Where third-party assessment is required, the system cannot be placed on the market until the notified body has issued the relevant certificate.

Even where self-assessment is permitted, it is not a light-touch exercise. The self-assessment process requires detailed technical documentation covering the system's design, architecture, training data, performance metrics, testing results, and risk management record, all assessed against the requirements specified in the Act and, progressively, in harmonised technical standards developed under it. Providers who lack the internal technical capability to conduct a rigorous self-assessment should engage external technical advisers to support the process.

What the Assessment Covers

The conformity assessment evaluates the high-risk AI system against the requirements of Chapter III, Section 2 of the AI Act. These requirements cover:

- Risk management: a continuous process identifying and mitigating risks throughout the system's lifecycle.
- Data and data governance: training, validation, and testing datasets meeting quality criteria for relevance, representativeness, and freedom from error.
- Technical documentation: comprehensive documentation of the system enabling the authorities to assess compliance.
- Record-keeping: logging capability sufficient to enable post-deployment traceability.
- Transparency to deployers: instructions for use documenting capabilities, limitations, and performance metrics.
- Human oversight: design features that enable human review and override of system outputs.
- Accuracy, robustness, and cybersecurity: performance meets specified metrics and the system is resilient to manipulation.
- Quality management: an organisational quality management system covering the full development lifecycle.
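Purely as an illustrative sketch (the requirement labels paraphrase Chapter III, Section 2, but the tracker structure and the gap check are my own framing, not anything prescribed by the Act), a compliance team might record which requirements have documented evidence like this:

```python
# Illustrative only: a minimal evidence tracker for a conformity assessment.
# The requirement names paraphrase Chapter III, Section 2 of the AI Act;
# the data structure itself is an assumption for illustration.

SECTION_2_REQUIREMENTS = [
    "risk management",
    "data and data governance",
    "technical documentation",
    "record-keeping",
    "transparency to deployers",
    "human oversight",
    "accuracy, robustness and cybersecurity",
    "quality management",
]

def outstanding(evidence: dict[str, bool]) -> list[str]:
    """Return the requirements for which no documented evidence is recorded."""
    return [r for r in SECTION_2_REQUIREMENTS if not evidence.get(r, False)]

evidence = {r: True for r in SECTION_2_REQUIREMENTS}
evidence["human oversight"] = False  # e.g. override mechanism not yet documented

print(outstanding(evidence))  # ['human oversight']
```

The point of such a checklist is that the assessment is only complete when every item maps to documented evidence; a single gap is a compliance gap, not a formality.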

For each of these requirements, the assessment must produce documented evidence that the system meets the standard. Where harmonised standards exist, compliance with those standards creates a presumption of conformity. In the absence of published harmonised standards (which is the situation for most requirements in 2024 and 2025, as the standards are still being developed), providers must interpret the Act's requirements directly, a more demanding exercise that benefits from legal and technical advisory input.

Registration and CE Marking

Following a successful conformity assessment, the provider must register the high-risk AI system in the EU AI database operated by the European Commission. Registration creates a public record of the system's existence, its provider, its intended purpose, and its compliance status. CE marking must be affixed to the system (or its accompanying documentation) indicating compliance with the AI Act and any other applicable EU legislation.

The registration obligation applies before the system is placed on the market or put into service. It is not a retrospective administrative step: a system that is deployed and then registered is a system that was non-compliant at the point of deployment.

Frequently Asked Questions

Can all high-risk AI systems be self-assessed?

No. Third-party assessment by a notified body is mandatory for the Annex III biometric systems where harmonised standards have not been applied in full, and for AI systems covered by existing EU product legislation that already requires third-party assessment. For the majority of Annex III high-risk AI systems, self-assessment is permitted, but it requires rigorous technical documentation and must be conducted against the full requirements of Chapter III, Section 2. Self-assessment does not mean cursory review: it means applying the same substantive standards as third-party assessment, with the provider itself bearing the burden of demonstrating compliance.

How long does a conformity assessment take?

Timeline depends on system complexity and the provider's documentation maturity. For a relatively straightforward Annex III system with good existing technical documentation, a self-assessment might be completed in 2 to 3 months. For a complex biometric or safety-critical system requiring third-party assessment, timelines of 6 to 12 months or more are realistic, particularly while notified body capacity is being developed. Providers who leave conformity assessment until close to the August 2026 deadline face the risk that they cannot complete the process in time.

Does the conformity assessment need to be repeated?

The assessment must be updated whenever the AI system is substantially modified, meaning a change that affects the system's compliance with the Act's requirements or its risk profile. The AI Act's definition of substantial modification is broad enough to capture significant changes to training data, algorithms, intended use, or performance characteristics. Providers should build assessment review triggers into their product development and change management processes.

What happens if a high-risk system is placed on the market without a conformity assessment?

Placing a high-risk AI system on the market without a completed conformity assessment, without registration in the EU AI database, or without CE marking is a direct violation of Article 16 of the AI Act. Market surveillance authorities can prohibit the system's placement on the market, require recall, and impose fines of up to EUR 15 million or 3% of global annual turnover. Where the system processed personal data, concurrent GDPR violations may also trigger action by the competent data protection authority.
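To make the penalty arithmetic concrete (this is my own sketch, not legal advice: the "whichever is higher" rule for this tier, and the lower cap for SMEs, reflect Article 99 of the Act, while the function name is invented for illustration):

```python
# Sketch of the maximum fine under the EUR 15m / 3% tier of the AI Act.
# For undertakings the cap is whichever of the two figures is HIGHER;
# for SMEs, the lower of the two applies instead.

def max_fine_eur(annual_turnover_eur: float, is_sme: bool = False) -> float:
    fixed_cap = 15_000_000
    turnover_cap = 0.03 * annual_turnover_eur
    return min(fixed_cap, turnover_cap) if is_sme else max(fixed_cap, turnover_cap)

print(max_fine_eur(2_000_000_000))  # 60000000.0 (3% of EUR 2bn exceeds EUR 15m)
```

For a large provider, the turnover-based figure quickly dominates: at EUR 2 billion in global annual turnover, the 3% limb yields a ceiling of EUR 60 million, four times the fixed cap.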

Bart Lieben
Attorney-at-Law