Tag: EU Data Protection

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

7. July 2022

Over the last decades, AI has developed impressively in various fields. At the same time, with each step forward, the new machines and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is: how can the rise of AI be reconciled with the principle of data minimization laid down in Art. 5 para. 1 lit. c) GDPR? At first glance it seems contradictory that there may be a way: after all, the GDPR clearly states that the amount of personal data collected should be kept as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the wide scope conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures such as pseudonymization. This means that the data collected by the AI is stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
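To illustrate the idea, here is a minimal sketch of pseudonymization in Python. It is not taken from the cited study; the field names and the secret key are hypothetical. A direct identifier is replaced by a keyed-hash pseudonym, so the record can no longer be linked to an individual without the separately held key (the "additional information" in the GDPR's sense).

```python
import hmac
import hashlib

# Hypothetical secret key: the "additional information" that must be kept
# separately under strict access control for pseudonymization to hold.
SECRET_KEY = b"kept-separately-under-strict-access-control"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed-hash pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record with a direct identifier and a non-identifying attribute.
record = {"name": "Jane Doe", "email": "jane.doe@example.com", "age_band": "30-39"}

# Direct identifiers are dropped or replaced; non-identifying attributes
# remain usable for the AI system.
pseudonymized_record = {
    "subject_id": pseudonymize(record["email"]),
    "age_band": record["age_band"],
}
print(pseudonymized_record)
```

The same identifier always maps to the same pseudonym, so records about one person can still be linked for analysis, while re-identification requires access to the key.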

The main issue with the current legal framework of the European Union regarding personal data protection is that certain parts have been left vague, which also causes uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.

One of the main features of this Act is that it divides AI applications into three main risk categories:

  1. AI systems creating an unacceptable risk, which are prohibited (e.g. systems that violate fundamental rights).
  2. AI systems creating a high risk, which are subject to specific regulation.
  3. AI systems creating a low or minimal risk, with no further regulation.

Regarding high-risk AI systems, the AIA foresees post-market monitoring obligations. If the system in question violates any part of the AIA, the regulator can force its withdrawal from the market.

This approach has been welcomed by the Joint Opinion of the EDPB – EDPS, although the two bodies stated that the draft still needs to be more aligned with the GDPR.

Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the coming years as the proposal undergoes the EU’s legislative process.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Some amendments have already been formulated, and the draft is currently under review by the Parliament. Once the Act has passed this scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act’s implementation? It is foreseen that national supervisory authorities shall be established in each EU Member State. Furthermore, the AIA aims to establish a special European AI Board made up of representatives of both the Member States and the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.

Microsoft Cloud Services will store and process EU data within the EU

7. May 2021

On May 7th, 2021, Brad Smith, Microsoft’s President and Chief Legal Officer, announced in a blogpost that Microsoft will enable its EU commercial and public sector customers to store all their data in the EU. Microsoft calls this policy the “EU Data Boundary”, and it will apply across all of Microsoft’s core business cloud services, such as Azure, Microsoft 365 and Dynamics 365. Microsoft is the first big cloud provider to take such a step. The transition is intended to be completed by the end of 2022.

This move can be seen as a reaction to the Court of Justice of the European Union’s (CJEU) “Schrems II” ruling of July 2020 (please see our blogpost), in which the CJEU held that the “EU-US Privacy Shield” does not provide sufficient protection and therefore invalidated the framework. The “Privacy Shield” was a framework for regulating the transatlantic exchange of personal data for commercial purposes between the EU and the USA.

However, the CJEU has clarified that server location and standard contractual clauses alone are not sufficient to meet the requirements of the General Data Protection Regulation (GDPR). This is because under U.S. law such as the “CLOUD Act”, U.S. law enforcement agencies have the power to compel U.S.-based technology companies to hand over requested data stored on servers, regardless of whether the data is stored in the U.S. or on foreign soil. So even with Microsoft’s proposed changes, U.S. authorities would still be able to access EU citizens’ personal data stored in the EU.

Microsoft believes it has found a way around the U.S. intelligence agencies’ right of access: it could be technically circumvented if customers effectively protected their data in the cloud themselves. To do this, customers would have to encrypt the data with a cryptographic key that they, not Microsoft, manage, making it impossible for Microsoft to hand over the keys to the US intelligence agencies. Microsoft also states that it is going above and beyond with its “Defending your Data” measures (please see our blogpost) to protect its customers’ data.
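The principle of customer-held keys can be sketched as follows. This is a toy illustration, not Microsoft’s actual mechanism or production cryptography: the keystream construction below is for demonstration only, and a real deployment would use a vetted authenticated cipher (e.g. AES-GCM) through an audited library. The point is simply that only ciphertext ever leaves the customer’s control, so the provider has nothing useful to hand over.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + block counter (toy example)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; applying it twice with the same key decrypts."""
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

# The customer generates and keeps the key; the provider never sees it.
customer_key = secrets.token_bytes(32)

plaintext = b"personal data of an EU citizen"
ciphertext = xor_encrypt(customer_key, plaintext)   # only this is uploaded

# Decryption is the same XOR operation with the same customer-held key.
assert xor_encrypt(customer_key, ciphertext) == plaintext
```

Because decryption requires the customer’s key, a disclosure order served on the provider alone cannot yield the plaintext, which is exactly the effect described above.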

These measures by Microsoft are a step towards GDPR-compliant use of cloud applications, but whether they are sufficient to meet the high requirements of the GDPR may be doubted, given the far-reaching powers of the US intelligence agencies. The option for users to encrypt their data themselves and keep the keys should help to comply with EU data protection standards, but it must also be implemented in practice. Microsoft will have to educate its customers accordingly.

The GDPR-compliant transfer of personal data of EU citizens to the US remains uncertain territory, although further positive signals can be observed. For example, the new U.S. administration under President Joe Biden recently showed itself open to concluding a new comprehensive data protection agreement with the EU.

Trust in current mechanisms for international data transfers decreases

1. September 2016

According to a survey conducted recently by the International Association of Privacy Professionals (IAPP), trust in current legal mechanisms to carry out data transfers to third countries, such as Standard Contractual Clauses and the EU-U.S. Privacy Shield, has decreased.

The results of this survey reveal that 80 percent of companies rely on the Standard Contractual Clauses approved by the EU Commission to carry out international data transfers, especially to the U.S.A. However, there is currently uncertainty regarding the validity of the Standard Contractual Clauses, which may also be invalidated by the ECJ, as already occurred with the former Safe Harbor framework.

Regarding the EU-U.S. Privacy Shield, which has been operational since 1st August, the survey reveals that only 42 percent of U.S. companies plan to self-certify under this new framework, compared to 73 percent that self-certified under the Safe Harbor framework. The main reason for this may be the uncertainty regarding its validity. The Article 29 WP recently stated that the first annual review of the Privacy Shield will be decisive.

Finally, Binding Corporate Rules (BCR) are also used by companies to carry out intra-group data transfers. However, there are several reasons why not many companies implement them. One is the high cost of implementation; moreover, the implementation process can take over a year. In addition, BCR can only be used for international data transfers within the group, so other mechanisms must be used for data transfers outside the group.