Category: General Data Protection Regulation

New EU SCC must be used as of now

29. September 2021

In June 2021, the European Commission published the long-awaited new Standard Contractual Clauses (SCC) for transfers of personal data to so-called third countries under the General Data Protection Regulation (GDPR) (please see our blog post). These new SCC modules replace the three 10-year-old SCC sets that were adopted under the EU Data Protection Directive 95/46/EC and thus could meet neither the requirements of the GDPR for data transfers to third countries nor those of the significant Schrems II ruling of July 16th, 2020 (please see our blog post). Transfers of data to third countries have been problematic, and a focus of supervisory authorities, for some time.

As of Monday, September 27th, 2021, the new SCCs must be used for contracts entered into and for processing activities beginning after September 26th, 2021, if the contract or processing activity involves the transfer of personal data to so-called inadequate third countries. These are countries outside the European Economic Area (EEA) that are not deemed to have an adequate level of data protection by an adequacy decision of the European Commission.

Contracts signed before September 27th, 2021, based on the old SCCs will still be considered adequate until December 27th, 2022. For these contracts, the old SCCs already signed can be maintained in the meantime, as long as the processing of personal data that is the subject of the contract in question does not change. By December 27th, 2022, the SCCs used for these contracts must be updated to the new SCCs or replaced by other data transfer mechanisms in accordance with the GDPR. As of that date, all SCCs used as safeguards for data transfers to inadequate third countries must be the new SCCs.

noyb filed complaints against the cookie paywalls of seven major news websites in Austria and Germany

25. August 2021

Privacy activist Max Schrems’ data protection organization noyb (an acronym for “none of your business”) announced on August 13th, 2021, that it had filed complaints against the cookie paywalls of seven major German and Austrian news websites. In the statement, noyb questions whether consent can be given “voluntarily” if you have to pay to keep your data.

An increasing number of websites ask their users to either agree to their data being passed on to hundreds of tracking companies (which generates a few cents of revenue for the website) or take out a subscription (for up to € 80 per year). Can consent be considered “freely given” if the alternative is to pay 10, 20 or 100 times the market price of your data to keep it to yourself?

With these paywalls, the user must decide whether to agree to the use of his or her own data for advertising purposes or to enter into a paid subscription with the respective publisher. However, personal data may only be processed if there is a legal basis for doing so. Such a legal basis may arise, for example, from Article 6 (1) (a) of the GDPR, if the data subject has given his or her consent to this processing. Such consent must be “freely given”. According to Recital 42, sentence 5, “consent is not regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.” noyb is of the opinion that the paywall solution lacks the necessary voluntariness for consent and thus also lacks a legal basis under Art. 6 (1) (a) GDPR.

Art. 7 (4) GDPR demands, “when assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.”

In contrast, in a decision of November 30th, 2018, the Austrian data protection authority did not see a violation of the GDPR in a paywall system: since the data subject receives a recognizable benefit in return, it held that consent was given voluntarily after all.

Accordingly, users’ personal data could be considered a “means of payment” with which they pay for a subscription instead of a monetary consideration. Consent to data processing would thus be necessary for the performance of the contract, as it represents the quid pro quo provided by the data subject, in other words, the purchase price. How the responsible data protection authorities will ultimately decide remains to be seen.

These complaints by noyb represent the organization’s second major campaign this month. On August 10th, it had already filed 422 formal complaints with 10 European regulators over inadequate cookie banners.

CJEU ruling on One-Stop-Shop mechanism

25. June 2021

On June 15th, 2021, the Court of Justice of the European Union (CJEU) ruled that “under certain conditions, a national supervisory authority may exercise its power to bring any alleged infringement of the GDPR before a court of a member state, even though that authority is not the lead supervisory authority”. The ruling grants each supervisory authority the power to bring matters within its supervisory area before the courts. If a non-lead supervisory authority wishes to bring cross-border cases to court, it can do so under the so-called urgency procedure of Article 66 of the GDPR.

The General Data Protection Regulation (GDPR) provides that the data protection authority of the country in which a company has its principal place of business in the EU has primary jurisdiction for cross-border proceedings against such companies (the so-called one-stop-shop principle). Facebook and a number of other international companies have their EU headquarters in Ireland. The Irish data protection authority has been criticised several times for dragging out numerous important cases against tech companies. The CJEU’s ruling is likely to lead to more enforcement proceedings by local data protection authorities.

In 2015 – before the GDPR came into force – the Belgian data protection authority filed a lawsuit in the Belgian courts against Facebook’s collection of personal data via hidden tracking tools, which tracked even users without Facebook accounts. After the GDPR came into force, Facebook argued that lawsuits over data protection violations could only be filed in Ireland. A court of appeal in Brussels then referred to the CJEU the question of whether proceedings against Facebook were admissible in Belgium. The CJEU has now confirmed that they are, and the Belgian court is free to make a final decision (please see our blog post).

The CJEU has now ruled that, in principle, the lead data protection authority is responsible for prosecuting alleged GDPR violations involving cross-border data processing, i.e. processing that takes place in more than one member state or has an impact on individuals in several member states. However, the Court also specified that the “one-stop-shop” principle of the GDPR obliges the lead authority to cooperate closely with the respective local supervisory authorities concerned. In addition, local data protection authorities may have jurisdiction pursuant to Art. 56 (2) and Art. 66 GDPR; according to the CJEU, if the respective requirements of these provisions are met, a local supervisory authority may also initiate legal proceedings. The CJEU further clarified that actions by non-lead data protection authorities can still be upheld if they are based on the Data Protection Directive, the predecessor of the GDPR.

The EU consumer association BEUC called the ruling a positive development. BEUC Director General Monique Goyens said:

Most Big Tech companies are based in Ireland, and it should not be up to that country’s authority alone to protect 500 million consumers in the EU.

While Facebook’s associate general counsel Jack Gilbert said:

We are pleased that the CJEU has upheld the value and principles of the one-stop-shop mechanism, and highlighted its importance in ensuring the efficient and consistent application of GDPR across the EU.

New SCCs published by the EU Commission for international data transfers

10. June 2021

On June 4th, 2021, the EU Commission adopted new standard contractual clauses (SCC) for international data transfers. The SCCs are model contracts that can constitute an appropriate safeguard under Art. 46 of the General Data Protection Regulation (GDPR) for the transfer of personal data to third countries. Third countries are those outside the EU/European Economic Area (EEA), e.g. the USA.

The new clauses were long awaited, as the previous standard contractual clauses are more than 10 years old and thus could take into account neither the GDPR’s requirements regarding third-country transfers nor the significant Schrems II ruling of July 16th, 2020. Third-country transfers had therefore become problematic and had, not only recently, been targeted by investigations by supervisory authorities, inter alia in Germany.

What is new about the SCCs now presented is above all their structure. The different types of data transfers are no longer spread over two different SCC models but are combined in one document, divided into four different “modules”. This should allow for flexible contract design: the appropriate module is selected according to the relationship of the parties. The following modules are included in the new SCCs:

Module 1: Transfer of personal data between two controllers
Module 2: Transfer of personal data from the controller to the processor
Module 3: Transfer of personal data between two processors
Module 4: Transfer of personal data from the processor to the controller

The new provisions also include an obligation to carry out a data transfer impact assessment, i.e. the obligation to satisfy oneself that the contractual partner in the third country is in a position to fulfil its obligations under the new SCCs. Also newly included are the duties to defend against government requests that contradict the requirements of the standard protection clauses and to inform the competent supervisory authorities about such requests. The data transfer impact assessment must be documented and submitted to the supervisory authorities upon request.

The published documents are final. The official publication of the SCCs in the Official Journal of the European Union took place on June 7th, 2021. From then on, within a period of 18 months ending December 27th, 2022, existing contracts with partners from third countries, in particular Microsoft or Amazon, must be migrated to the new SCCs.

However, even if the new SCCs are used, a case-by-case assessment of the level of data protection remains unavoidable, because the new clauses alone will generally not be sufficient to meet the requirements of the CJEU in the above-mentioned ruling. In such an assessment, both the text of the contract and the actual level of data protection must be examined. The latter can be done, for example, by means of a questionnaire sent to the processor in the third country.

Accordingly, it is not enough simply to sign the new SCCs; the controller must take further action to enable secure data transfers to third countries.

EPRS publishes report on post-Brexit EU-UK Data Transfer Mechanisms

20. April 2021

On April 9th, 2021, the European Parliamentary Research Service (EPRS) published a report on data transfers in the private sector between the EU and the U.K. following Brexit.

The report reviews and assesses trade dealings, adequacy challenges and transfer instruments under the General Data Protection Regulation (GDPR). It is intended to inform regulatory and business decisions; in the press release, the European Parliament stated that “a clear understanding of the state of play and future prospects for EU-UK transfers of personal data is indispensable”.

The report provides an in-depth analysis of an adequacy decision for the U.K. as a viable long-term solution for data flows between the U.K. and the EU, also considering possible data transfer mechanisms in the potential absence of an adequacy decision, such as Standard Contractual Clauses, Binding Corporate Rules, codes of conduct, and certification mechanisms.

In this analysis the EPRS also sheds light on adequacy concerns such as U.K. surveillance laws and practices, shortcomings of the implementation of the GDPR, weak enforcement of data protection laws, and wavering commitment to EU data protection standards.

As part of its conclusion, the EPRS stated that the European Data Protection Board’s (‘EDPB’) opinion on the draft decision, which has just been published (please see our blog post here), will likely scrutinise the Commission’s approach and provide recommendations on next steps.

EDPB adopts opinion on draft UK adequacy decisions

16. April 2021

In accordance with its obligation under Article 70 (1) (s) of the General Data Protection Regulation (GDPR), the European Data Protection Board (“EDPB”) adopted, on April 13th, 2021, its opinions on the EU Commission’s (“EC”) draft UK adequacy decision (please see our blog post). “Opinion 14/2021” is based on the GDPR and assesses both general data protection aspects and the public authority access, for law enforcement and national security purposes, to personal data transferred from the EEA, a topic the EC also discussed in detail in the draft adequacy decision. At the same time, the EDPB issued “Opinion 15/2021” on the transfer of personal data under the Law Enforcement Directive (LED).

The EDPB notes that there is a strong alignment between the EU and the UK data protection regimes, especially in the principles relating to the processing of personal data. It expressly praises the fact that the adequacy decision is to apply for a limited period, as the EDPB also sees the danger that the UK could change its data protection laws. Andrea Jelinek, EDPB Chair, is quoted as saying:

“The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission’s decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

But the EDPB also highlights areas of concern that need to be further monitored by the EC:

1. The immigration exemption, which restricts the rights of those data subjects affected.

2. How the transfer of personal data from the EEA to the UK could undermine EU data protection rules, for example on the basis of future UK adequacy decisions.

3. Access to personal data by public authorities is given a lot of space in the opinion. For example, the Opinion analyses in detail the Investigatory Powers Act 2016 and related case law. The EDPB welcomes the numerous oversight and redress mechanisms in the UK but identifies a number of issues that need “further clarification and/or oversight”, namely bulk searches, independent assessment and oversight of the use of automated processing tools, and the safeguards provided under UK law when it comes to disclosure abroad, particularly with regard to the application of national security exemptions.

In summary, this EDPB opinion does not put any obstacles in the way of an adequacy decision and recognises that there are many areas where the UK and EU regimes converge. Nevertheless, it highlights very clearly that there are deficiencies, particularly in the UK’s system for monitoring national security, which need to be reviewed and kept under observation.

As for the next steps, the draft UK adequacy decisions will now be assessed by representatives of the EU member states under the “comitology procedure”. The Commission can then adopt the draft UK adequacy decisions. The bridging period during which free data transfer to the UK is permitted even without an adequacy decision ends in June 2021 (please see our blog post).

Facebook data leak affects more than 500 million users

7. April 2021

Confidential data of 533 million Facebook users has surfaced in a forum for cybercriminals. A Facebook spokesperson told Business Insider that the data came from a leak in 2019.

The leaked data includes Facebook usernames and full names, dates of birth, phone numbers, locations and biographical information and, in some cases, the email addresses of the affected users. Business Insider has verified the leaked data through random sampling. Even though some of the data may be outdated, the leak poses risks if, for example, the email addresses or phone numbers are used for hacking. The leak was made public by the IT security firm Hudson Rock, whose employees noticed that the data sets were being offered by a bot in a hacking forum for money. The data set was subsequently offered publicly for free and thus made accessible to everyone.

The US magazine Wired points out that Facebook is doing more to confuse than to clarify. First, Facebook referred to an earlier security vulnerability from 2019, which we already reported on. That vulnerability was patched in August of last year. Later, a blog post by a Facebook product manager confirmed that it was a major security breach. However, the data had not been accessed through hacking but rather through the exploitation of a legitimate Facebook feature. In addition, he said, the affected data was so old that the GDPR and U.S. privacy laws did not apply. In the summer of 2019, Facebook reached an agreement with the U.S. Federal Trade Commission (FTC) to pay a $5 billion fine for all data breaches before June 12th, 2019. According to Wired, however, the current database is not congruent with the one at issue then, as the most recent Facebook ID in it dates from late May 2019.

Users can check whether they are affected by the data leak via the website HaveIBeenPwned.
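For those who want to automate such a check, HaveIBeenPwned also exposes a REST API. The sketch below is a minimal example, not an official client: it assumes the service’s v3 `breachedaccount` endpoint and a subscription API key sent in the `hibp-api-key` header, both of which the service requires for account lookups.

```python
import json
import urllib.error
import urllib.parse
import urllib.request

API_BASE = "https://haveibeenpwned.com/api/v3"


def build_breach_request(account: str, api_key: str) -> urllib.request.Request:
    """Build the HTTP request for HIBP's breached-account lookup."""
    url = f"{API_BASE}/breachedaccount/{urllib.parse.quote(account)}"
    return urllib.request.Request(url, headers={
        "hibp-api-key": api_key,               # personal API key (paid subscription)
        "user-agent": "breach-check-example",  # HIBP rejects requests without a UA
    })


def account_breaches(account: str, api_key: str) -> list:
    """Return the names of the breaches an account appears in ([] if none)."""
    try:
        with urllib.request.urlopen(build_breach_request(account, api_key)) as resp:
            return [b["Name"] for b in json.load(resp)]
    except urllib.error.HTTPError as err:
        if err.code == 404:  # HIBP signals "not found in any breach" with 404
            return []
        raise
```

Calling `account_breaches("you@example.com", key)` returns the list of breach names the address appears in; an empty list means the address is not in any known breach.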

EU and South Korea complete adequacy talks

6. April 2021

On March 30th, 2021, EU Justice Commissioner Didier Reynders and the Chairperson of the Personal Information Protection Commission of the Republic of Korea, Yoon Jong In, announced the successful conclusion of adequacy talks between the EU and the Republic of Korea (“South Korea”). These adequacy discussions began in 2017, and there was already a high level of convergence between the EU and the Republic of Korea on data protection issues, which has been further enhanced by additional safeguards to strengthen the level of protection in South Korea. Recently, amendments to South Korea’s Personal Information Protection Act (“PIPA”) took effect and the investigative and enforcement powers of South Korea’s data protection authority, the Personal Information Protection Commission (“PIPC”), were strengthened.

This adequacy decision is based on Art. 45 GDPR. Article 45 (3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, meaning a level of protection for personal data substantially equivalent to that within the EU. Once it has been determined that a non-EU country provides an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. South Korea will be the 13th country to which personal data may be transferred on the basis of an adequacy decision. An adequacy decision covering both commercial providers and the public sector will enable free and secure data flows between the EU and the Republic of Korea and will complement the EU-Republic of Korea Free Trade Agreement.

Before the free flow of data can occur, the EU Commission must initiate the procedure for adopting its adequacy finding. In this procedure, the European Data Protection Board will issue an opinion and a committee composed of representatives of the EU member states must agree. The EU Commission may then adopt the adequacy decision.

EDPB released a new Guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVA) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, like Siri on Apple or Google Assistant on Android mobile devices, but can also be found in separate terminal devices, like Alexa on the Amazon Echo. With Smart Homes trending, VVAs are finding their way into many homes.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and corporate businesses increasingly use VVAs and questions of data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. The EDPB therefore published guidance on Virtual Voice Assistants earlier this month.

In its guidance, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid out by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests that, for example, controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands.

Moreover, the EDPB states that, in its opinion, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidance also gives further advice on complying with general data protection principles and is open for public consultation until April 23rd, 2021.

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data is considered a special category of personal data under Art. 9 of the GDPR and is therefore given special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in software for test centres through which more than 136,000 COVID-19 test results of more than 80,000 data subjects had apparently been accessible unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” of the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. In fact, more than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach, including public facilities in Munich, Berlin and Mannheim as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view others’ test results unlawfully, one only needed to create an account for a COVID-19 test. The URL for the test result contained the number of the test; by simply counting this number up or down, the “test certificates” of other people became freely accessible. In addition to the test result, a test certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned.
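The flaw described here is a classic insecure direct object reference: sequential identifiers make every neighbouring record guessable. The sketch below illustrates the problem and the standard fix with a hypothetical URL scheme (not Medicus AI’s actual one).

```python
import secrets

BASE = "https://example-testcentre.invalid/results"  # hypothetical result endpoint


def result_url_sequential(test_number: int) -> str:
    """The flawed scheme: the URL embeds a sequential test number."""
    return f"{BASE}/{test_number}"


def neighbouring_urls(own_test_number: int, radius: int = 2) -> list:
    """Counting a sequential ID up or down yields other people's records."""
    return [result_url_sequential(n)
            for n in range(own_test_number - radius, own_test_number + radius + 1)
            if n != own_test_number]


def result_url_random() -> str:
    """The standard fix: reference records by a long random token, so a valid
    URL cannot be derived from one's own."""
    token = secrets.token_urlsafe(32)  # ~256 bits of entropy, unguessable
    return f"{BASE}/{token}"
```

With sequential numbering, a user who received result number 1000 can trivially derive the URLs of results 998, 999, 1001 and 1002; with random tokens, guessing any other valid URL is computationally infeasible.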

It remains unresolved whether the vulnerabilities were exploited prior to the discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. However, IT experts and privacy-focused NGOs commented that Medicus AI had been irresponsible and grossly negligent with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.
