Tag: European Data Protection Board

European Data Protection Board adopts a dispute resolution decision in the context of Instagram

17. August 2022

In early December 2021, the Irish Data Protection Commission (DPC), in its capacity as lead supervisory authority responsible for overseeing Instagram (Meta), sent a draft decision to the other European supervisory authorities in line with Art. 60(3) GDPR. In this draft decision, the DPC expressed its concerns about Instagram’s compliance with several GDPR provisions, notably Art. 5(1)(a) and (c), 6(1), 12(1), 13, 24, 25 and 35 GDPR.

The lead supervisory authority specifically raised the issue of the public disclosure of children’s personal data, such as e-mail addresses and phone numbers, resulting from their use of the Instagram business account feature.

The other supervisory authorities, however, did not fully agree with the draft decision and issued objections in accordance with Art. 60(4) GDPR. As no common ground could be found on some of the objections, the dispute resolution procedure laid down in Art. 65(1)(a) GDPR became applicable. Consequently, the lead supervisory authority, the DPC, was required to ask the European Data Protection Board (EDPB) to adopt a binding decision.

On July 29, 2022, the EDPB announced that it had adopted a dispute resolution decision following these objections. It is now up to the DPC to adopt its final decision, which must be based on the EDPB decision, and to communicate it to the controller. The DPC has one month to do so.

EDPB and EDPS criticise the Commission’s Proposal to combat child sexual abuse online

15. August 2022

In May 2022, the European Commission published its proposal on combating child sexual abuse material. The Commission justified the need for this proposal with the alleged insufficiency of voluntary detection carried out by companies. Recently, the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) have issued a joint statement criticizing the proposal on privacy grounds.

According to the proposal, hosting services and communication services would be obliged to identify, remove and report child sexual abuse material online. This, in turn, requires that encrypted messages can be screened; in other words, the actual text of messages would have to be read in order to detect grooming.

In their joint criticism, the EDPB and EDPS highlight that such an AI-based system will most likely result in errors and false positives.

EDPS Supervisor, Wojciech Wiewiórowski, said: “Measures allowing public authorities to have access to the content of communications, on a generalised basis, affect the essence of the right to private life. Even if the technology used is limited to the use of indicators, the negative impact of monitoring the text and audio communications of individuals on a generalised basis is so severe that it cannot be justified under the EU Charter of Fundamental Rights. The proposed measures related to the detection of solicitation of children in interpersonal communication services are extremely concerning.”

The Commission’s Proposal for the European Health Data Space raises data protection concerns

21. July 2022

On May 3, 2022, the European Commission (EC) published its proposal for the creation of the European Health Data Space (EHDS). The proposal, if adopted, would provide for the creation of an EU-wide infrastructure that allows health data sets to be linked for practitioners, researchers, and industry. In its communication, the EC points to the necessity of promoting “the free, cross-border flows of personal data” with the aim of creating an “internal market for personal health data and digital health products and services”.

Doctors in Germany, for example, would then be able to access the medical file of a Spanish patient currently undergoing medical treatment in Germany. In this context, it is worth noting that not all Member States maintain electronic patient records, with the consequence that the proposal would require certain Member States to take steps towards digitalization. With regard to researchers and industry, the underlying incentive of the proposal is to enable them to draw on available health data in order to create new solutions and push forward innovation.

Nevertheless, health data are sensitive data within the meaning of the GDPR, which means that access to such data is possible only in exceptional cases. This raises the question of whether and how the access to personal health data that the proposal intends to enable can be reconciled with the GDPR. Recently, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on this new legislative initiative, expressing several concerns about the proposal from a data protection perspective.

Take, for example, health data processed while accessing healthcare: here the legal ground of Art. 9(2)(h) GDPR, namely medical diagnosis or the provision of health care, would be applicable. Further processing for any other purpose, however, would then require the data subject’s consent.

In the words of EDPB Chair Andrea Jelinek: “The EU Health Data Space will involve the processing of large quantities of data which are of a highly sensitive nature. Therefore, it is of the utmost importance that the rights of the European Economic Area’s (EEA) individuals are by no means undermined by this Proposal. The description of the rights in the Proposal is not consistent with the GDPR and there is a substantial risk of legal uncertainty for individuals who may not be able to distinguish between the two types of rights. We strongly urge the Commission to clarify the interplay of the different rights between the Proposal and the GDPR.”

Diving into the details of the joint opinion, the EDPB and EDPS strongly recommend that the secondary use of personal data stemming from wellness applications, such as wellness and behavioral data, be made subject to the prior consent of the data subject, should these data, contrary to the EDPB and EDPS’ recommendation, not be excluded from the scope of the proposal altogether.

That would not only be in line with the GDPR, but would also make it possible to differentiate between health data generated by wellness applications, on the one hand, and health data generated by medical devices, on the other.

The fundamental difference between the two types of data lies in their different degrees of quality and in the fact that wellness applications also process, for instance, food practices, which allows conclusions to be drawn about data subjects’ daily activities, habits, and practices.

EDPB adopts new guidelines on certification as a tool for transfers

23. June 2022

On June 16, 2022, the European Data Protection Board (EDPB) announced on its website that it had adopted guidelines on certification as a tool for transfers of personal data (publication is yet to take place following linguistic checks). Once published, these guidelines will undergo public consultation until September 2022.

First of all, these guidelines can be placed within the broader context of international data transfers, as envisioned by Art. 46(2)(f) GDPR. The certification mechanism only comes into play when an adequacy decision is absent. As is probably well known, Art. 46(2) GDPR outlines several safeguards that may be resorted to when personal data is transferred to third countries.

One of these is the voluntary certification mechanism laid down in Arts. 42 and 43 GDPR, which allows accredited certification bodies or supervisory authorities to issue certifications, provided, of course, that controllers or processors have made binding and enforceable commitments. By way of certifications, seals, and marks, the EU legislators hoped to assist data subjects in quickly assessing “the level of data protection of relevant products and services” (Recital 100 GDPR).

In accordance with Art. 42(5) GDPR and Guidelines 1/2018 on certification, which the new guidelines are to complement, accredited certification bodies or supervisory authorities are competent to issue such certifications. It is important to note that these accredited certification bodies may well be private bodies, subject to certain requirements and prior approval by the Board or the supervisory authorities. The criteria on the basis of which certifications are issued are to be determined and approved by the Board or by the competent supervisory authorities (Art. 42(5) GDPR).

EDPB Deputy Chair Ventsislav Karadjov has called these yet-to-be-published guidelines “ground-breaking” and provided an outlook on their content. Among the most important aspects to be addressed are the accreditation requirements that certification bodies have to comply with, as well as the certification criteria attesting that appropriate safeguards for transfers are in place. It remains to be seen whether the guidelines will indeed provide more guidance on these aspects.

EDPB adopts new Guidelines on restrictions of data subject rights under Article 23 GDPR

25. October 2021

During its plenary session of October 2021, the European Data Protection Board (EDPB) adopted a final version of the Guidelines on restrictions of data subject rights under Art. 23 of the General Data Protection Regulation (GDPR) following public consultation.

The Guidelines “provide a thorough analysis of the criteria to apply restrictions, the assessments that need to be observed, how data subjects can exercise their rights after the restrictions are lifted, and the consequences of infringements of Art. 23 GDPR,” the EDPB stated in their press release.

Further, the Guidelines aim to analyze how the legislative measures setting out the restrictions need to meet the foreseeability requirement and examine the grounds for the restrictions listed by Art. 23(1) GDPR, as well as the obligations and rights which may be restricted.

The Guidelines also recall the conditions surrounding the use of such restrictions by the Member States in light of the Charter of Fundamental Rights of the European Union, and are intended to guide Member States wishing to implement restrictions under national law.

EDPB creates “Cookie Banner Taskforce”

5. October 2021

On September 27, 2021, the European Data Protection Board (EDPB) announced that it had established a “Cookie Banner” taskforce to coordinate the complaints, and the corresponding responses, filed with several EU data protection authorities (DPAs) by the non-governmental organization None of Your Business (NOYB) in relation to website cookie banners.

In May 2021, NOYB sent over 500 draft and formal complaints to companies based in the EU regarding the use of their cookie banners. The complaints focus on the absence of a “reject all” button on most of the websites, as well as on the deceptive design cookie banners use to get data subjects to consent to non-essential cookies. Another recurring complaint is the difficulty of refusing cookies, as opposed to the simple way of consenting to them.

The EDPB stated that “this taskforce was established in accordance with Art. 70 (1) (u) GDPR and aims to promote cooperation, information sharing and best practices between the DPAs”. The taskforce is meant to exchange views on legal analysis and possible infringements, provide support to activities at the national level and streamline communication.

EDPS and the EDPB call for a tightening of the EU draft legislation on the regulation of Artificial Intelligence (AI)

26. July 2021

In a joint statement, the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB) call for a general ban on the use of artificial intelligence for the automated recognition of human characteristics in publicly accessible spaces. This refers to surveillance technologies that recognise faces, human gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals. In addition to the AI-supported recognition of human characteristics in public spaces, the EDPS and EDPB also call for a ban on AI systems using biometrics to categorize individuals into clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights. With the exception of individual applications in the medical field, the EDPS and EDPB are also calling for a ban on AI for sentiment recognition.

In April, the EU Commission presented a first draft regulation on AI applications. The draft explicitly excluded the area of international law enforcement cooperation, an exclusion about which the EDPS and EDPB expressed “concern”. The draft is based on a categorisation of AI applications into different types of risk, which are to be regulated to different degrees depending on the level of risk to fundamental rights. In principle, the EDPS and EDPB support this approach, and the fact that the EU is addressing the issue at all. However, they call for this concept of fundamental rights risk to be aligned with the EU data protection framework.

Andrea Jelinek, EDPB Chair, and Wojciech Wiewiórowski, of the EDPS, are quoted:

Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.

The EDPS and EDPB explicitly support the draft’s provision that national data protection authorities become the competent supervisory authorities for the application of the new regulation, and explicitly welcome that the EDPS is intended to be the competent authority and the market surveillance authority for the supervision of the Union institutions, agencies and bodies. The predominant role the Commission gives itself in the “European Artificial Intelligence Board”, however, is questioned by the EU data protection authorities: “This contradicts the need for a European AI Board that is independent of political influence”. They call for the board to be given more autonomy to ensure its independence.

Worldwide, there is great resistance to the use of biometric surveillance systems in public spaces. A large global alliance of 175 civil society organisations, academics and activists is calling for a ban on biometric surveillance in public spaces, concerned that the potential for abuse of these technologies is too great and the consequences too severe. For example, the BBC reports that China is testing a camera system on Uighurs in Xinjiang that uses AI and facial recognition to detect emotional states. This system is supposed to serve as a kind of modern lie detector and to be used, for example, in criminal proceedings.

EDPB adopts final Recommendation 01/2020 on Supplementary Measures for Data Transfers to Third Countries

22. June 2021

On June 21st, 2021 during its 50th plenary session, the European Data Protection Board (EDPB) adopted a final version of its recommendations on the supplementary measures for data transfers.

In its judgment C-311/18 (Schrems II), the Court of Justice of the European Union (CJEU) decided that, while the Standard Contractual Clauses (SCCs) remain a valid data transfer mechanism, controllers or processors acting as exporters are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, whether the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools. Where the effectiveness of those safeguards is reduced due to the legal situation in the third country, exporters may need to implement additional measures to fill the gaps.

To help exporters with the complex task of assessing third countries and identifying appropriate supplementary measures where needed, the EDPB has adopted these recommendations. They set out the steps to follow and potential information sources, as well as non-exhaustive examples of supplementary measures, all meant to help exporters make the right decisions for data transfers to third countries.

The recommendations advise exporters to take the following steps in order to gain a good overview of their data transfers and of any supplementary measures that may be necessary:

1. Know the data transfers that take place in your organization – being aware of where data flows is essential to identify potentially necessary supplementary measures;

2. Verify the transfer tool that each transfer relies on and its validity as well as application to the transfer;

3. Assess if a law or a practice in the third country impinges on the effectiveness of the transfer tool;

4. Identify and adopt supplementary measures that are necessary to bring the level of protection of the data transferred up to the EU standard;

5. Take formal procedural steps that may be required by the adoption of your supplementary measure, depending on the transfer tool you are relying on;

6. Re-evaluate the level of protection of the data you transfer at appropriate intervals and monitor any potential changes that may affect the transfer.

The EDPB Chair, Andrea Jelinek, stated that “the effects of Schrems II cannot be underestimated”, and that the “EDPB will continue considering the effects of the Schrems II ruling and the comments received from stakeholders in its future guidance”.

The recommendations clearly highlight how important it is for exporters to understand and keep an eye on their data transfers to third countries. In Germany, the supervisory authorities have already started (in German) to send out questionnaires to controllers regarding their data transfers to third countries and the tools used to safeguard those transfers. Controllers in the EU should be very aware of the subject of data transfers in their companies, and prepare accordingly.

Belgian DPA approves first EU Data Protection Code of Conduct for Cloud Service Providers

21. June 2021

On May 20th, 2021, the Belgian Data Protection Authority (Belgian DPA) announced that it had approved the EU Data Protection Code of Conduct for Cloud Service Providers (EU Cloud CoC). The EU Cloud CoC is the first transnational EU code of conduct since the entry into force of the EU General Data Protection Regulation in May 2018.

The EU Cloud CoC represents a sufficient guarantee pursuant to Article 28 (1) and 28 (5) of the GDPR, as well as Recital 81 of the GDPR, making adherence to the code a valid way for cloud service providers to demonstrate sufficient guarantees when processing personal data on behalf of controllers.

In particular, the EU Cloud CoC aims to establish good data protection practices for cloud service providers, giving data subjects more security in terms of the handling of their personal data by cloud service providers. In addition, the Belgian DPA accredited SCOPE Europe as the monitoring body for the code of conduct, which will ensure that code members comply with the requirements set out by the code.

It further offers cloud service providers practical guidance and a set of specific binding requirements (such as requirements regarding the use of sub-processors, audits, compliance with data subject rights requests, and transparency), as well as objectives to help cloud service providers demonstrate compliance with Article 28 of the GDPR.

In the press release, the Chairman of the Belgian DPA stated that “the approval of the EU Cloud CoC was achieved through close collaboration within the European Data Protection Board and is an important step towards a harmonised interpretation and application of the GDPR in a crucial sector for the digital economy”.

EDPB adopts opinion on draft UK adequacy decisions

16. April 2021

In accordance with its obligation under Article 70 (1) (s) of the General Data Protection Regulation (GDPR), on April 13th, 2021, the European Data Protection Board (“EDPB”) adopted its opinions on the EU Commission’s (“EC”) draft UK adequacy decision (please see our blog post). “Opinion 14/2021” is based on the GDPR and assesses both general data protection aspects and public authority access to personal data transferred from the EEA for law enforcement and national security purposes, a topic the EC also discussed in detail in the draft adequacy decision. At the same time, the EDPB issued “Opinion 15/2021” on the transfer of personal data under the Law Enforcement Directive (LED).

The EDPB notes that there is a strong alignment between the EU and the UK data protection regimes, especially in the principles relating to the processing of personal data. It expressly praises the fact that the adequacy decision is to apply for a limited period, as the EDPB also sees the danger that the UK could change its data protection laws. Andrea Jelinek, EDPB Chair, is quoted:

“The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission’s decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

But the EDPB also highlights areas of concern that need to be further monitored by the EC:

1. The immigration exemption, which restricts the rights of those data subjects affected.

2. How the transfer of personal data from the EEA to the UK could undermine EU data protection rules, for example on basis of future UK adequacy decisions.

3. Access to personal data by public authorities is given a lot of space in the opinion. For example, the Opinion analyses in detail the Investigatory Powers Act 2016 and related case law. The EDPB welcomes the numerous oversight and redress mechanisms in the UK but identifies a number of issues that need “further clarification and/or oversight”, namely bulk searches, independent assessment and oversight of the use of automated processing tools, and the safeguards provided under UK law when it comes to disclosure abroad, particularly with regard to the application of national security exemptions.

In summary, this EDPB opinion does not put any obstacles in the way of an adequacy decision and recognises that there are many areas where the UK and EU regimes converge. Nevertheless, it highlights very clearly that there are deficiencies, particularly in the UK’s system for monitoring national security, which need to be reviewed and kept under observation.

As for the next steps, the draft UK adequacy decisions will now be assessed by representatives of the EU Member States under the “comitology procedure”. The Commission can then adopt the draft UK adequacy decisions. A bridging period, during which free data transfer to the UK is permitted even without an adequacy decision, ends in June 2021 (please see our blog post).
