The European Union Artificial Intelligence Act establishes a cross-sectoral framework governing the development and use of AI systems, aiming to ensure safety, transparency and compliance with fundamental rights while fostering innovation. To support research, it includes two exemptions: one covering AI systems under development before they are placed on the market or put into service, and another applying to systems specifically developed and put into service solely for scientific research. Although designed to protect scientific inquiry, these exemptions rely on distinctions that are increasingly difficult to apply in contemporary AI research. Boundaries between laboratory development and real-world testing, and between scientific and commercial objectives, are often blurred, creating regulatory uncertainty and potential for misuse.
Development-Phase Exemption and Real-World Testing
The development-phase exemption excludes research, testing or development activities before an AI system is placed on the market or put into service. Such activities must still comply with applicable Union law. However, testing in real-world conditions is not covered by the exemption.
Putting into service refers to the first use of a system for its intended purpose. Placing on the market means first making the system available on the Union market. Testing in real-world conditions means temporary testing outside the laboratory, involving human participants and requiring prior approval by the relevant authorities, in order to gather data and verify conformity. The Act outlines procedures for such testing but does not define real-world conditions in general terms.
This lack of definition creates ambiguity. An AI system operating in silent mode within a hospital, collecting live data without displaying outputs, may still be regarded as undergoing testing for its intended purpose. In such a case, the activity could qualify as real-world testing and fall within the Act’s scope. By contrast, deployment limited to technical checks unrelated to clinical purpose may remain within the development phase.
Hypothetical circumvention strategies illustrate possible risks, including offshore deployment, simulated environments supplemented with live data, splitting models across systems or silent on-site data capture. Although the Act applies to systems affecting individuals in the Union regardless of provider location, the blurred boundary between laboratory research and real-world deployment underscores the need for clearer guidance and effective enforcement.
Scientific-Use Exemption and the Sole Purpose Requirement
The scientific-use exemption applies to AI systems and models, including their outputs, that are specifically developed and put into service for the sole purpose of scientific research and development. The system must be both developed and deployed exclusively for research. If either element is absent, the exemption does not apply.
The requirement of sole purpose raises interpretative challenges. Research goals may evolve, and establishing original intent can be difficult. A system developed for clinical screening but later confined to research creates uncertainty. Conversely, a commercially developed system used by a university solely for research does not qualify, as it was not originally developed exclusively for scientific research.
Examples illustrate these distinctions. An AI system developed and used only in a laboratory for research remains exempt. A diagnostic system implemented in routine patient care is not exempt. A commercial tool used only for research also remains subject to the Act. A system initially intended for practical use but later redirected exclusively to research presents an unclear case regarding whether sole purpose must apply throughout development.
The absence of a harmonised definition of scientific research in Union law compounds these difficulties. Modern research frequently involves collaborations among universities, public institutions and private enterprises, often supported by funding arrangements that encourage partnership. In such contexts, separating purely scientific inquiry from activities with potential commercial application becomes complex. This creates a risk of regulatory arbitrage, while overly rigid interpretation may hinder legitimate collaboration.
Regulatory Gaps and Implementation Challenges
The research exemptions depend on distinctions between research and commercialisation and between laboratory activity and real-world impact. In practice, research often operates along a continuum. Publicly funded projects may include knowledge-transfer obligations requiring engagement with industry partners. Non-profit and humanitarian AI initiatives that distribute tools for social benefit may also constitute real-world use.
Uncertainty surrounding real-world testing further complicates application. AI research commonly involves controlled deployment using live data. Without clearer guidance on what constitutes real-world conditions, interpretations may vary, creating either opportunities for avoidance or barriers to legitimate research.
Clearer definitions of scientific research and real-world conditions, along with practical implementation guidance, could support consistent application. Greater transparency in research involving private partnerships may also reduce the risk of misuse while preserving space for responsible innovation.
The EU AI Act’s research exemptions aim to foster scientific progress while maintaining safeguards. In practice, the development-phase and scientific-use exemptions rely on distinctions that are increasingly difficult to sustain. Ambiguities concerning real-world testing and sole purpose create uncertainty and potential loopholes. At the same time, restrictive interpretation may impede legitimate research. Clearer guidance and consistent oversight are essential to ensure that innovation is supported without undermining the Act’s protective objectives.
Source: npj digital medicine
References:
Meszaros, J., Huys, I. & Ioannidis, J. P. A. (2026). Challenges in applying the EU AI Act research exemptions to contemporary AI research. npj Digital Medicine, in press.