
Moving Ahead with the Machine Learning Act in the EU

The risks of AI, inherently linked to machine learning applications, are now acknowledged by European policymakers.

European Union Taking Steps to Advance Machine Learning Legislation

**European Union's New Regulations for Machine Learning Systems**

The European Artificial Intelligence Act (EU AI Act), initially adopted in March 2024, is set to transform the way machine learning (ML) systems are regulated within the EU. The Act, which is being implemented in phases, aims to ensure transparency, accountability, and risk management for ML-based AI systems.

As of July 2025, the EU AI Act defines AI broadly and on a risk-based footing, encompassing learning systems and explicitly covering ML systems and foundation models. This expansive definition captures much of modern software, particularly systems used in the eight "high-risk" domains, such as recruitment, education, and policing.

The first phase, which became applicable on February 2nd, 2025, bans AI systems that pose unacceptable risks and imposes AI literacy requirements on employees handling AI systems within the EU market. The next major phase, starting August 2nd, 2025, introduces specific obligations for providers of general-purpose AI models (GPAI), including ML-based models such as large language models (LLMs) like ChatGPT and BERT, and image-generation models like DALL-E.

These obligations include transparency mandates requiring clear disclosure of AI system capabilities, requirements to maintain and provide technical documentation, disclosure of copyrighted material used during training, and, for high-risk GPAI models, additional duties such as model evaluations, adversarial testing, and incident reporting.

The EU's strategy to regulate AI comprehensively reflects a growing understanding that it is machine learning, rather than artificial intelligence more broadly, that poses novel risks to consumers. As organizations gain more experience with ML, such "misfires" should become rarer.

However, limiting the AI Act's scope to ML still requires balancing safety with innovation. AI plays a significant role in powering EU industries and enhancing the daily lives of EU citizens. The EU cannot afford to be left behind as AI unicorns and the most promising AI startups are already turning elsewhere.

Penalizing the use of AI could hinder innovation and harm consumers. If the AI Act effectively penalizes AI use, the cost to the European ecosystem would be substantial, including deterred investment, costlier AI, and forgone applications.

Despite calls from industry leaders to pause the upcoming stricter rules for GPAI over concerns about regulatory complexity, the EU has signaled a firm commitment to continue with the current framework as planned. The Act's treatment of machine learning within its definition of AI remains unchanged, and implementation continues through phased rollouts and detailed guidelines.

At the same time, leading MEPs are pushing to redefine AI in the AI Act as a system that uses "learning, reasoning or modelling," effectively limiting the scope to machine learning. This move towards a narrower, ML-focused definition is intended to keep the EU at the forefront of AI innovation while maintaining safety and consumer protection.


  1. The European Artificial Intelligence Act (EU AI Act), adopted in March 2024, is being implemented in phases and explicitly brings machine learning (ML) systems within its scope.
  2. As of July 2025, the EU AI Act covers learning systems broadly, explicitly including ML systems and models like ChatGPT, BERT, and DALL-E in its regulations.
  3. The first phase of the EU AI Act, applicable from February 2nd, 2025, prohibits AI systems that pose unacceptable risks and mandates AI literacy for employees handling AI systems in the EU market.
  4. The Act's next phase, starting August 2nd, 2025, imposes specific obligations on providers of general-purpose AI models (GPAI), requiring transparency, technical documentation, disclosure of copyrighted training material, and additional duties for high-risk GPAI models.
  5. The EU aims to ensure a balance between safety and innovation in its AI regulations, recognizing the significant role of AI in powering EU industries and enhancing EU citizens' lives.
  6. Leading MEPs are advocating for a narrower, ML-focused definition of AI in the AI Act to keep the EU at the forefront of AI innovation while maintaining safety and consumer protection.
