Category: General Data Protection Regulation

CJEU ruling on One-Stop-Shop mechanism

25. June 2021

On June 15th, 2021, the Court of Justice of the European Union (CJEU) ruled that “under certain conditions, a national supervisory authority may exercise its power to bring any alleged infringement of the GDPR before a court of a member state, even though that authority is not the lead supervisory authority”. The ruling grants each supervisory authority the power to bring matters within its supervisory area before the courts. If a non-lead supervisory authority wishes to bring a cross-border case to court, it can do so under the conditions of the so-called urgency procedure of Article 66 GDPR.

The General Data Protection Regulation (GDPR) provides that the data protection authority of the country in which a company has its main establishment in the EU has primary jurisdiction for cross-border proceedings against that company (the so-called one-stop-shop principle). Facebook and a number of other international companies have their EU headquarters in Ireland. The Irish data protection authority has been criticised several times for dragging out numerous important cases against tech companies. The CJEU’s ruling is likely to lead to more enforcement proceedings by local data protection authorities.

In 2015 – before the GDPR came into force – the Belgian data protection authority filed a lawsuit in the Belgian courts against Facebook’s collection of personal data via hidden tracking tools, which tracked even users without Facebook accounts. After the GDPR came into force, Facebook argued that lawsuits concerning data protection violations could only be filed in Ireland. A court of appeal in Brussels then referred to the CJEU the question of whether proceedings against Facebook were admissible in Belgium. The CJEU has now confirmed that they are, and the Belgian court is free to make a final decision (please see our blog post).

The CJEU has now ruled that, in principle, the lead data protection authority is responsible for prosecuting alleged GDPR violations involving cross-border data processing, i.e. processing that takes place in more than one member state or has an impact on individuals in several member states. However, the Court also specified that the “one-stop-shop” principle of the GDPR obliges the lead authority to cooperate closely with the local supervisory authorities concerned. In addition, local data protection authorities may also have jurisdiction pursuant to Art. 56 (2) and Art. 66 GDPR; according to the CJEU, if the respective requirements of these provisions are met, a local supervisory authority may also initiate legal proceedings. The CJEU further clarified that actions by non-lead data protection authorities can still be upheld if they are based on the Data Protection Directive, the predecessor of the GDPR.

The EU consumer association BEUC called the ruling a positive development. BEUC Director General Monique Goyens said:

Most Big Tech companies are based in Ireland, and it should not be up to that country’s authority alone to protect 500 million consumers in the EU.

Facebook’s associate general counsel Jack Gilbert, meanwhile, said:

We are pleased that the CJEU has upheld the value and principles of the one-stop-shop mechanism, and highlighted its importance in ensuring the efficient and consistent application of GDPR across the EU.

New SCCs published by the EU Commission for international data transfers

10. June 2021

On June 4th, 2021, the EU Commission adopted new standard contractual clauses (SCCs) for international data transfers. The SCCs are model contracts that can constitute an appropriate safeguard under Art. 46 of the General Data Protection Regulation (GDPR) for the transfer of personal data to third countries. Third countries are those outside the EU/European Economic Area (EEA), e.g. the USA.

The new clauses were long awaited, as the previous standard contractual clauses are more than 10 years old and thus could take into account neither the GDPR’s requirements for third-country transfers nor the significant Schrems II ruling of July 16th, 2020. Third-country transfers had therefore become problematic and have recently been targeted by investigations by supervisory authorities, inter alia in Germany.

The most notable change in the newly presented SCCs is their structure. The different types of data transfers are no longer spread over two separate SCC models but are combined in one document, divided into four different “modules”. This allows for a flexible contract design: the appropriate module is selected according to the relationship of the parties. The following modules are included in the new SCCs:

Module 1: Transfer of personal data between two controllers
Module 2: Transfer of personal data from the controller to the processor
Module 3: Transfer of personal data between two processors
Module 4: Transfer of personal data from the processor to the controller

The new provisions also include an obligation to carry out a data transfer impact assessment, i.e. the obligation to satisfy oneself that the contractual partner in the third country is in a position to fulfil its obligations under the SCCs. Also newly included are the duties to defend against government requests that contradict the requirements of the standard protection clauses and to inform the competent supervisory authorities about such requests. The data transfer impact assessment must be documented and submitted to the supervisory authorities upon request.

The published documents are the final versions. The official publication of the SCCs in the Official Journal of the European Union took place on June 7th, 2021. From then on, existing contracts with partners in third countries, for example with Microsoft or Amazon, must be transitioned to the new SCCs within a period of 18 months, i.e. by December 27th, 2022.

However, even if the new SCCs are used, a case-by-case assessment of the level of data protection remains unavoidable, because the new clauses alone will generally not be sufficient to meet the requirements set by the CJEU in the above-mentioned Schrems II ruling. Such a case-by-case assessment must cover both the text of the contract and the actual level of data protection in the third country, the latter for example by means of a questionnaire sent to the processor in the third country.

Accordingly, it is not enough to simply sign the new SCCs; the controller must take further steps to enable secure data transfers to third countries.

EPRS publishes report on post-Brexit EU-UK Data Transfer Mechanisms

20. April 2021

On April 9th, 2021, the European Parliamentary Research Service (EPRS) published a report on data transfers in the private sector between the EU and the UK following Brexit.

The report reviews and assesses trade dealings, adequacy challenges and transfer instruments under the General Data Protection Regulation (GDPR). It is intended to inform regulatory and business decisions; in the press release, the European Parliament stated that “a clear understanding of the state of play and future prospects for EU-UK transfers of personal data is indispensable”.

The report provides an in-depth analysis of an adequacy decision for the UK as a viable long-term solution for data flows between the UK and the EU, and also considers possible mechanisms for data transfers in the potential absence of an adequacy decision, such as Standard Contractual Clauses, Binding Corporate Rules, codes of conduct, and certification mechanisms.

In this analysis, the EPRS also sheds light on adequacy concerns such as UK surveillance laws and practices, shortcomings in the implementation of the GDPR, weak enforcement of data protection laws, and wavering commitment to EU data protection standards.

As part of its conclusion, the EPRS stated that the European Data Protection Board’s (“EDPB”) opinion on the draft decision, which has just been published (please see our blog post here), will likely scrutinise the Commission’s approach and provide recommendations on next steps.

EDPB adopts opinion on draft UK adequacy decisions

16. April 2021

In accordance with its obligation under Article 70 (1) (s) of the General Data Protection Regulation (GDPR), on April 13th, 2021, the European Data Protection Board (“EDPB”) adopted its opinions on the EU Commission’s (“EC”) draft UK adequacy decisions (please see our blog post). “Opinion 14/2021” is based on the GDPR and assesses both general data protection aspects and the provisions on public authority access to personal data transferred from the EEA for law enforcement and national security purposes contained in the draft adequacy decision, a topic the EC also discussed in detail. At the same time, the EDPB also issued “Opinion 15/2021” on the transfer of personal data under the Law Enforcement Directive (LED).

The EDPB notes that there is a strong alignment between the EU and the UK data protection regimes, especially in the principles relating to the processing of personal data. It expressly praises the fact that the adequacy decision is to apply for a limited period, as the EDPB also sees the danger that the UK could change its data protection laws. Andrea Jelinek, EDPB Chair, is quoted:

“The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission’s decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

But the EDPB also highlights areas of concern that need to be further monitored by the EC:

1. The immigration exemption, which restricts the rights of those data subjects affected.

2. How the transfer of personal data from the EEA to the UK could undermine EU data protection rules, for example on the basis of future UK adequacy decisions.

3. Access to personal data by public authorities, which receives considerable attention in the opinion. For example, the opinion analyses in detail the Investigatory Powers Act 2016 and related case law. The EDPB welcomes the numerous oversight and redress mechanisms in the UK but identifies a number of issues that need “further clarification and/or oversight”, namely bulk searches, independent assessment and oversight of the use of automated processing tools, and the safeguards provided under UK law when it comes to disclosure abroad, particularly with regard to the application of national security exemptions.

In summary, this EDPB opinion does not put any obstacles in the way of an adequacy decision and recognises that there are many areas where the UK and EU regimes converge. Nevertheless, it highlights very clearly that there are deficiencies, particularly in the UK’s system for monitoring national security, which need to be reviewed and kept under observation.

As for the next steps, the draft UK adequacy decisions will now be assessed by representatives of the EU member states under the “comitology procedure”. The Commission can then adopt the draft UK adequacy decisions. A bridging period, during which free data transfer to the UK is permitted even without an adequacy decision, ends in June 2021 (please see our blog post).

Facebook data leak affects more than 500 million users

7. April 2021

Confidential data of 533 million Facebook users has surfaced in a forum for cybercriminals. A Facebook spokesperson told Business Insider that the data came from a leak in 2019.

The leaked data includes Facebook usernames and full names, dates of birth, phone numbers, locations and biographical information, and in some cases the email addresses of the affected users. Business Insider verified the leaked data through random sampling. Even though some of the data may be outdated, the leak poses risks if, for example, email addresses or phone numbers are used for hacking. The leak was made public by the IT security firm Hudson Rock, whose employees noticed that the data sets were being offered for money by a bot in a hacking forum. The data set was later offered publicly for free and thus made accessible to everyone.

The US magazine Wired points out that Facebook is doing more to confuse than to help clarify. First, Facebook referred to an earlier security vulnerability from 2019, which we already reported on; this vulnerability was patched in August 2019. Later, a blog post by a Facebook product manager confirmed that it was a major security breach; however, the data had not been accessed through hacking but rather through the exploitation of a legitimate Facebook feature. In addition, the affected data was so old that the GDPR and U.S. privacy laws did not apply, he said. In the summer of 2019, Facebook reached an agreement with the U.S. Federal Trade Commission (FTC) to pay a $5 billion fine for all data breaches before June 12th, 2019. According to Wired, the current database is not congruent with the one at issue at the time, as the most recent Facebook ID in it dates from late May 2019.

Users can check whether they are affected by the data leak via the website HaveIBeenPwned.
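For readers who would rather automate that check, the sketch below queries the Have I Been Pwned v3 API for breaches affecting an email address. This is an illustrative example only: the endpoint and header names reflect the public API documentation at the time of writing, the breached-account endpoint requires a paid API key, and the key shown is a hypothetical placeholder.

```python
# Minimal sketch: query the Have I Been Pwned v3 API for breaches affecting an email address.
# Assumes you have obtained an API key from haveibeenpwned.com (required for this endpoint).
import json
import urllib.error
import urllib.parse
import urllib.request

HIBP_API_KEY = "your-api-key-here"  # hypothetical placeholder


def check_breaches(email: str):
    url = ("https://haveibeenpwned.com/api/v3/breachedaccount/"
           + urllib.parse.quote(email))
    request = urllib.request.Request(url, headers={
        "hibp-api-key": HIBP_API_KEY,
        "user-agent": "breach-check-example",  # the API rejects requests without a user agent
    })
    try:
        with urllib.request.urlopen(request) as response:
            # The default response is a list of breach names, e.g. [{"Name": "Facebook"}, ...]
            return json.loads(response.read())
    except urllib.error.HTTPError as error:
        if error.code == 404:
            return []  # 404 means the address was not found in any known breach
        raise


if __name__ == "__main__":
    breaches = check_breaches("someone@example.com")
    print("Affected breaches:", [b["Name"] for b in breaches])
```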

EU and South Korea complete adequacy talks

6. April 2021

On March 30th, 2021, EU Justice Commissioner Didier Reynders and the Chairperson of the Personal Information Protection Commission of the Republic of Korea, Yoon Jong In, announced the successful conclusion of adequacy talks between the EU and the Republic of Korea (“South Korea”). These adequacy discussions began in 2017, and there was a high level of convergence between the EU and the Republic of Korea on data protection issues from the outset, which has since been further enhanced by additional safeguards to strengthen the level of protection in South Korea. Recently, amendments to South Korea’s Personal Information Protection Act (“PIPA”) took effect, and the investigative and enforcement powers of South Korea’s data protection authority, the Personal Information Protection Commission (“PIPC”), were strengthened.

Such an adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, meaning a level of protection for personal data that is essentially equivalent to the level of protection within the EU. Once a non-EU country has been found to provide an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. South Korea will be the 13th country to which personal data may be transferred on the basis of an adequacy decision. The adequacy decision, covering both commercial providers and the public sector, will enable free and secure data flows between the EU and the Republic of Korea and will complement the EU-Republic of Korea Free Trade Agreement.

Before the free flow of data can occur, the EU Commission must complete the procedure for adopting its adequacy decision. In this procedure, the European Data Protection Board will issue an opinion and a committee composed of representatives of the EU member states must agree. The EU Commission may then adopt the adequacy decision.

EDPB releases new guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVAs) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, such as Siri on Apple devices or Google Assistant on Android devices, but can also be found in separate terminal devices, such as Alexa on the Amazon Echo. With smart homes trending, VVAs are finding their way into many homes.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and corporate businesses are increasingly using VVAs and questions on data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. Therefore, the EDPB published guidance on Virtual Voice Assistants earlier this month.

In its guidance, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid out by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests, for example, that controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands.

Moreover, the EDPB states that, in its view, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidance also gives further advice on complying with general data protection principles and is open for public consultation until April 23rd, 2021.

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data are considered a special category of personal data under Art. 9 of the GDPR and are therefore given special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in software used by test centres, through which more than 136,000 COVID-19 test results of more than 80,000 data subjects had apparently been left unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” by the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. In total, more than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach, including public facilities in Munich, Berlin and Mannheim as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view the test results unlawfully, one only needed to create an account for a COVID-19 test. The URL for the test result contained the number of the test; by simply incrementing or decrementing this number, the “test certificates” of other people became freely accessible. In addition to the test result, a test certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned.
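The underlying flaw is a classic insecure direct object reference: the identifier in the URL is a simple counter and carries no proof that the requester is authorised to see that result. The sketch below is not the vendor’s actual code; it is a minimal, hypothetical illustration of why sequential identifiers are enumerable and how random, unguessable tokens combined with a server-side authorisation check close that particular gap.

```python
# Hypothetical illustration of the flaw and a common mitigation; not Medicus AI's actual code.
import secrets

# Flawed pattern: sequential test numbers make every result URL guessable.
def result_url_sequential(test_number: int) -> str:
    return f"https://testcentre.example/results/{test_number}"

# Anyone who knows their own test number can enumerate neighbouring results:
guessable_urls = [result_url_sequential(n) for n in range(41_000, 41_005)]

# Mitigation 1: issue a long random token per result instead of a counter.
def new_result_token() -> str:
    return secrets.token_urlsafe(32)  # ~256 bits of entropy, practically unguessable

# Mitigation 2: even with random tokens, the server must still verify that the
# requesting account is authorised to view the result it asks for.
def may_view(result_owner_id: str, requesting_user_id: str) -> bool:
    return result_owner_id == requesting_user_id

if __name__ == "__main__":
    print(guessable_urls)
    print("Example unguessable token:", new_result_token())
```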

It remains unresolved whether the vulnerabilities were exploited prior to their discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. However, IT experts and privacy-focused NGOs commented that Medicus AI had acted irresponsibly and grossly negligently with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.

European Commission publishes draft UK adequacy decisions

25. February 2021

On February 19th, 2021, the European Commission (EC) published drafts of two adequacy decisions for the transfer of personal data to the United Kingdom (UK), one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive. If approved, the decisions would confer adequacy status on the UK and ensure that personal data from the EU can continue to flow freely to the UK. In the EC’s announcement launching the process to adopt the newly drafted adequacy decisions, Didier Reynders, Commissioner for Justice, is quoted:

We have thoroughly checked the privacy system that applies in the UK after it has left the EU. Now European Data Protection Authorities will thoroughly examine the draft texts. EU citizens’ fundamental right to data protection must never be compromised when personal data travel across the Channel. The adequacy decisions, once adopted, would ensure just that.

Under the GDPR, this adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, meaning a level of protection for personal data that is essentially equivalent to the level of protection within the EU. Once a non-EU country has been found to provide an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. In the UK, the processing of personal data is governed by the “UK GDPR” and the Data Protection Act 2018, both of which are based on the EU GDPR. The UK is, and has committed to remain, part of the European Convention on Human Rights and “Convention 108” of the Council of Europe. “Convention 108” is a binding treaty under international law that protects individuals from abuses in the electronic processing of personal data and, in particular, provides for restrictions on cross-border data flows where data is to be transferred to states without comparable protection.

The draft GDPR adequacy decision addresses several areas of concern. One is the power of the intelligence services in the UK. In this respect, the draft focuses on the legal bases, restrictions and safeguards for the collection of information for national security purposes, and details the oversight structure over the intelligence services as well as the remedies available to those affected. Another aspect discussed is the limitation of data subjects’ rights in the context of UK immigration law. The EC concludes that interference with individuals’ fundamental rights is limited to what is strictly necessary to achieve a legitimate purpose and that there is effective legal protection against such interference. As the UK GDPR is based on the GDPR, UK privacy laws should currently provide an adequate level of protection for data subjects; the main risks for EU data subjects therefore lie not in the current state of these laws but in possible future changes to them. For this reason, the EU Commission has built a fixed period of validity into the draft adequacy decision. If adopted, the decision would be valid for a period of four years, and the adequacy finding could be extended for a further four years if the level of protection in the UK remains adequate. This extension would not be automatic, however, but subject to a thorough review. The draft marks the first time that the EU has imposed a time limit on an adequacy decision; other adequacy decisions are subject to monitoring and regular review but are not time-limited by default.

The UK government welcomed the EC’s draft in a statement, while also calling on the EU to “swiftly complete” the process for adopting and formalizing the adequacy decisions, as the “bridging mechanism” will only remain in force until June 30th, 2021. Under the EU-UK Trade and Cooperation Agreement, the EU and the UK agreed on a transition period of up to six months from January 1st, 2021, during which the UK is treated as an adequate jurisdiction (please see our blog post). The draft adequacy decisions address the flow of data from the EU to the UK. The flow of data from the UK to the EU is governed by UK legislation that has applied since January 1st, 2021; the UK has decided that the EU ensures an adequate level of protection and that data can therefore flow freely from the UK to the EU.

Next, the non-binding opinion of the European Data Protection Board is sought (Art. 70 GDPR). After hearing the opinion of the European Data Protection Board, the representatives of the member states must then confirm the draft in the so-called comitology procedure. This procedure is used when the EC is given the power to implement legal acts that lay down conditions for the uniform application of a law. A series of procedures ensure that EU countries have a say in the implementing act. After the comitology procedure, the EC is free to adopt the drafts.

University fined for failure to notify a data breach

4. February 2021

The President of the Personal Data Protection Office in Poland (UODO) imposed a fine of PLN 25,000 (approx. EUR 5,600) on the Medical University of Silesia. The university had suffered a data breach of which it should have notified the supervisory authority and the data subjects in accordance with Articles 33 and 34 GDPR, but failed to do so.

First indications of the data breach reached UODO in early June 2020. The breach was related to exams held at the end of May 2020 by videoconference on an e-learning platform, which were also recorded. Before the exam, students were identified by their IDs or student cards, so a large amount of their personal data was documented on the recordings. After the exam was completed, the recordings were made available on the platform. However, not only the examinees had access to the platform, but also a wider group of people, about which the students had not been informed. In addition, using a direct link, any external person could access the recordings and therefore the data of the examinees. Many students, fearing that the video would be deleted to cover up the incident, secured the file or took photographs of the computer screens to preserve evidence. Eventually, the chancellor (as the decision-making unit) took the position that an incident in which 200 people viewed the IDs of some 100-150 other people could not be considered a personal data breach.

The controller, which was requested by UODO to clarify the situation, did not dispute the data breach. Ordinarily, the virtual room of the platform is only available to the exam group, and only those people have access to the recordings; the violation occurred because one of the employees did not close access to the virtual room after the exam. Nevertheless, the controller stated that no notification was required, as in its opinion the risk to the rights or freedoms of the data subjects was low. Moreover, after the incident the system was modified to prevent students from downloading the exam files. The controller also indicated that it had identified the individuals who had downloaded the files and informed them about their criminal liability for disseminating the data.

Despite several letters from UODO, the university still failed to report the data breach and notify the data subjects. Therefore, administrative proceedings were initiated. UODO found that the controller had failed to comply with its obligations to notify both the supervisory authority and the affected data subjects, and had improperly assessed the risk involved.

When imposing the fine, the President of UODO took into account the duration of the infringement (several months), the intentional action of the controller and its unsatisfactory cooperation with the supervisory authority. The fine serves not only a repressive but also a preventive function, as it shows that the obligations arising in connection with data breaches cannot be ignored, all the more so because an inappropriate approach to the obligations imposed by the GDPR may lead to negative consequences for those affected by the breaches.
