22/12/23
Initially published on AGEFI
Digital governance has been high on the agenda of regulators in recent years, especially in Europe. The European Union's legislative framework, comprising the Artificial Intelligence Act (EU AI Act), the Data Governance Act (DGA), and the Digital Services Act (DSA), plays a critical role in shaping the future of digital technologies. Together, these acts form a comprehensive digital regulatory landscape, setting ambitious objectives and potentially affecting millions of stakeholders in the EU. This article explores how these regulations intersect to form an integrated approach to digital governance.
Following the conclusion of the trilogue negotiations at the beginning of December 2023, which involved extensive debates on foundation models (large-scale models trained on vast datasets, such as OpenAI’s GPT) and several other critical open points, the EU AI Act is poised to set a new standard in AI regulation. It is a landmark piece of legislation: the first major regulatory framework in the world specifically dedicated to governing the development, deployment, and use of AI. Its significance lies in setting comprehensive standards and obligations not only for developers but also for users of AI systems. The framework aims to ensure that AI systems are safe, ethical, and compliant with existing laws on fundamental rights and values. It applies to both providers and users of AI systems within the EU, as well as to those outside the EU whose AI systems are used within EU borders.
The Act categorises AI systems by risk level and includes distinct requirements for foundation models. There are four levels: unacceptable, high, limited, and minimal. AI systems posing unacceptable risk, such as those that harm, deceive, or judge people (for example, social scoring by governments), will be banned in the EU. High-risk AI systems, such as those used in healthcare or in human resources for CV analysis, must satisfy strict standards before they can be offered or used. Limited-risk AI systems must meet transparency obligations. Minimal- or no-risk AI systems are free to use, though providers are encouraged to follow voluntary codes and standards.
The Act mandates adherence to ethical standards and fundamental rights, with a strong emphasis on human oversight for high-risk AI to prevent discrimination and harmful outcomes. It enforces compliance through fines, akin to the GDPR, and establishes a European Artificial Intelligence Board, alongside national authorities, for uniform application across EU member states. Simultaneously, the Act strives to enhance innovation and competitiveness in the European AI industry, advocating for AI development and usage that aligns with EU values and standards.
The Data Governance Act (DGA), adopted in June 2022, is a key piece of the EU's data governance strategy. It aims to expand the use of data held by public authorities by establishing a legal framework that enables various entities, such as private companies, research institutions, and non-profit organisations, to access and share this data. To protect user privacy, this process includes applying secure processing techniques and anonymising personal data: secure processing ensures that data is handled safely and securely, while anonymisation removes personal identifiers so that the data cannot be traced back to an individual. The goal is to encourage the adoption of these privacy-preserving methods more broadly, not just in the context of public sector data.
The DGA defines the requirements and obligations for these entities, such as registration, transparency, and compliance. Additionally, the DGA introduces data cooperatives, member-controlled data-sharing service providers designed to democratise data access and foster collaborative and ethical data management practices. The DGA also promotes data altruism: individuals and organisations can voluntarily share their data for public interest purposes, such as scientific research, and the entities facilitating this can be formally recognised as data altruism organisations, thus contributing to societal advancement.
The DGA also restricts transfers of non-personal data to third countries. Data intermediaries (entities that facilitate the sharing or pooling of data between different parties) and recognised data altruism organisations must assess whether third countries offer appropriate protections for the non-personal data of users in the EU. In this way, the DGA aims to foster data-driven innovation and contribute to the EU’s digital sovereignty and its green and digital transition.
In Luxembourg, the newly established Luxembourg National Data Service (LNDS) supports the DGA’s objectives through a platform dedicated to the responsible reuse and transfer of public sector data. Enhancing data accessibility, LNDS addresses data challenges in a compliant and reliable manner, ensuring providers retain full control. Serving both public and private sectors, it champions data-driven innovation, aligning with the EU's vision for digital sovereignty and a sustainable digital transition.
The EU published the Digital Services Act (DSA) in October 2022, and it entered into force in November 2022, with the aim of creating a safer and more open internet for everyone. As a major legislative initiative by the EU, the DSA regulates digital services, with a particular focus on online platforms. It encompasses all digital services connecting consumers to goods, services, or content, including social media, online marketplaces, and similar services. The DSA responds to the challenges of digital transformation and the increasing influence of tech giants.
The Act imposes varied rules on different online services, contingent on their size, impact, and associated risks. For instance, very large online platforms, such as Amazon, face additional obligations due to their extensive reach, as they serve over 10% of the EU population. These platforms are required to disclose how their algorithms rank and recommend content, offering users greater control and choice in their settings. The DSA also institutes a clear procedure for the removal of illegal content online, such as hate speech, terrorist content, or child abuse material. It upholds users' rights to contest decisions made by online service providers and to seek judicial redress if necessary.
Moreover, the DSA establishes a cooperation mechanism among the national authorities of EU Member States, the Commission, and a new European Board for Digital Services, ensuring consistent enforcement of rules across the EU. Online service providers must designate a single contact point within the EU for communication and cooperation with the authorities.
The DGA, EU AI Act, and DSA are interconnected through a shared focus on "data". The AI Act ensures that AI systems, trained on large volumes of data and utilised for diverse purposes, are governed by high ethical standards within a legal framework to minimise societal risk. The DGA emphasises ethical data sharing for organisations, facilitating data reuse across the EU and promoting responsible innovation. The DSA regulates digital platforms, mandating transparent and accountable data usage with a focus on user safety. Collectively, these acts ensure that data, a crucial resource in the digital age, is used innovatively yet responsibly, striking a balance between technological advancement and societal safety and ethics. This integrated approach bolsters Europe's digital sovereignty, nurturing a safe and innovative digital economy.
With this changing landscape of AI and data regulation, it is important for organisations to adopt a strategic approach to integrating AI into their businesses, ensuring strict adherence to these frameworks.
Andreas Braun
Advisory Director, Data Science & AI Team Lead, PwC Luxembourg
Tel: +352 62133 23 66