
EDPB publishes Guidelines on Data Breach Examples for Controllers

28. January 2021

On January 18th, 2021, the European Data Protection Board (EDPB) published their draft Guidelines 01/2021 on Examples regarding Data Breach Notification.

These Guidelines are intended to provide further support to Controllers, complementing the initial Guidelines on Personal Data Breach Notification under the GDPR adopted by the Article 29 Working Party in February 2018. The new Guidelines draw on the different types of situations that the Supervisory Authorities have come across in the two and a half years since the GDPR became applicable.

The EDPB’s intention is to assist Controllers in deciding how to handle data breaches, in particular by identifying the factors they must consider when conducting risk assessments to determine whether a breach must be reported to the relevant Supervisory Authority and whether the affected Data Subjects must be notified.

The draft Guidelines present examples of common data breach scenarios, including:

• ransomware attacks, where a malicious code encrypts the personal data and the attacker subsequently asks the controller for a ransom in exchange for the decryption code
• data exfiltration attacks, which exploit vulnerabilities in online services offered by the controller and typically aim at copying, exfiltrating and abusing personal data for malicious purposes
• human errors resulting in data breaches that are fairly common and can be both intentional and unintentional
• lost or stolen devices and paper documents
• “mispostal” scenarios, which arise from human error without malicious intent
• social engineering, such as identity theft and email exfiltration

The draft Guidelines further emphasize key elements of data breach management and response that organizations should consider, namely:

• proactively identifying system vulnerabilities in order to prevent data breaches from happening in the first place
• assessing whether a breach is likely to result in a risk to the rights and freedoms of the Data Subject, the timing of this assessment and the importance of Controllers not delaying a notification because of unclear circumstances
• implementing plans, procedures and guidelines indicating how to handle data breaches that have clear reporting lines and persons responsible for the recovery process
• organizing regular trainings for employees to raise awareness on data breach management, and the latest developments in the area
• documenting breaches in each and every case, irrespective of the risk they pose

The Guidelines will be open for public consultation until March 2nd, 2021, during which the EDPB will gather feedback on the draft.

CJEU Advocate General’s opinion on GDPR’s One-Stop-Shop mechanism

26. January 2021

On January 13, 2021, the Advocate General (“AG”) of the Court of Justice of the European Union (“CJEU”) published an opinion in the case of Facebook Ireland Limited, Facebook Inc., Facebook Belgium BVBA v the Belgian Data Protection Authority “Gegevensbeschermingsautoriteit” (“Belgian DPA”), addressing the General Data Protection Regulation’s (“GDPR”) One-Stop-Shop mechanism.

In 2015, the Belgian DPA initiated several legal proceedings against Facebook Group members in local courts. The allegation was that Facebook placed cookies on devices of Belgian users without their consent, thereby collecting data in an excessive manner. Facebook argued that with the GDPR becoming applicable in 2018, the Belgian DPA lost its competence to continue the legal proceedings, as Facebook’s lead supervisory authority under the GDPR is the Irish Data Protection Commission. The Belgian Court of Appeal referred several questions to the CJEU, including whether the GDPR’s One-Stop-Shop regime prevents national DPAs from initiating proceedings in their national courts when they are not the lead DPA.

The AG responded that, in his opinion, the lead DPA has general jurisdiction over cross-border data processing, while a national DPA may exceptionally bring proceedings before its own national courts. The national DPA’s right is subject to the One-Stop-Shop regime and the cooperation and consistency mechanism of the GDPR. Thus, while each national DPA has the competence to initiate proceedings against possible infringements affecting its territory, the significant regulatory role of the lead DPA limits this competence with respect to cross-border data processing.

One of the concerns expressed by the Belgian DPA was the risk of insufficient enforcement if only lead DPAs may act against organizations that do not comply with the GDPR. In this regard, the AG emphasized that Art. 61 GDPR specifically provides for appropriate mechanisms to address such concerns. National DPAs may ask the lead DPA for assistance in investigations, and if such assistance is not provided, the national DPA concerned may take action itself.

In certain circumstances, the AG sees the possibility for national DPAs not acting as lead DPA to initiate proceedings before their national courts, if:

  • the DPA is acting outside of the material scope of the GDPR; e.g., because the processing does not involve personal data;
  • cross-border data processing is carried out by public authorities, in the public interest, or to comply with legal obligations;
  • the processor is not established in the EU;
  • there is an urgent need to act to protect the rights and freedoms of data subjects (Art. 66 GDPR);
  • the lead DPA has decided not to process a case.

With regards to data subjects, the AG notes that data subjects can bring action against any controller or processor before the court of their Member State and may file a complaint with their Member State’s DPA, regardless of which Member State’s DPA is the lead DPA.

The AG’s opinion is not legally binding on the CJEU, although the CJEU will take it into account. A final judgment of the CJEU is expected in the coming months. Thereafter, the Belgian Court of Appeal will have to decide its case in accordance with the CJEU’s judgment. The CJEU’s decision will most likely have a lasting impact on the division of roles between lead DPAs and other national DPAs, as well as on the ability of national DPAs to take enforcement actions into their own hands.

WhatsApp’s privacy policy update halted

22. January 2021

The first indications that WhatsApp would change its terms of service and privacy policy emerged at the beginning of December 2020. Earlier this year, users received the update notice when launching the app on their devices. It stated that the new terms concern additional information on how WhatsApp processes user data and how businesses can use Facebook-hosted services to store and manage their WhatsApp chats. Users had to accept the terms by February 8th, 2021, to continue using the chat service; otherwise, they were advised to delete their account, as it would no longer be possible to use WhatsApp without accepting the changes. The notice caused considerable confusion and criticism, because it mistakenly led many users to believe that the agreement allows WhatsApp to share all collected user data with its parent company Facebook, which has faced repeated privacy controversies in the past.

Users’ fears in this regard are not entirely unfounded. In fact, outside the EU, WhatsApp user data has already been flowing to Facebook since 2016 – for advertising purposes, among other things. For the EU and the United Kingdom, however, different rules apply and no such data transfer takes place.

The negative coverage and user reactions caused WhatsApp to hastily note that the changes explicitly do not affect EU users. Niamh Sweeney, director of policy at WhatsApp, said via Twitter that it remained the case that WhatsApp did not share European user data with Facebook for the purpose of using this data to improve Facebook’s products or ads.

However, since the topic continued to stir emotions, WhatsApp felt compelled to provide clarification in a tweet and a FAQ. The statements make clear once again that the changes relate to optional business features and provide further transparency about how the company collects and uses data. End-to-end encryption, under which chat content is visible only to the participating users, will not be changed. Moreover, the new update does not expand WhatsApp’s ability to share data with Facebook.

Nevertheless, despite all efforts, WhatsApp has not managed to explain the changes in an understandable way. It has even had to accept considerable user churn in recent days, and interest in messenger alternatives has increased enormously. Eventually, the public backlash led to an official announcement that the controversial update will be delayed until May 15th, 2021. In light of the misinformation and concern, users are to be given more time to review the policy on their own and understand WhatsApp’s privacy and security principles.

EU-UK Trade Deal in light of Data Protection

4. January 2021

Almost fit to be called a Christmas miracle, the European Union (EU) and the United Kingdom (UK) came to an agreement on December 24th, 2020. The Trade Agreement, in full the “EU-UK Trade and Cooperation Agreement”, sets out new rules applying from the date of the UK’s exit from the EU, January 1st, 2021.

President of the European Commission, Ursula von der Leyen, claimed it was a deal worth fighting for, “because we now have a fair and balanced agreement with the UK, which will protect our European interests, ensure fair competition, and provide much needed predictability for our fishing communities. Finally, we can leave Brexit behind us and look to the future. Europe is now moving on.”

With regard to data protection, however, the new Trade Deal has not given much certainty about what is to come next.

Both sides are aware that an adequacy decision by the EU Commission is very important with regard to data protection and cross-border data flows. Accordingly, the EU has agreed to allow a period of four months, extendable by a further two months, during which data can be transferred between EU Member States and the UK without additional safeguards. This period is intended to give the Commission enough time to reach an adequacy decision, meaning data transfers can continue as before until possibly mid-2021. However, this arrangement is only valid if the UK does not change its data protection laws in the meantime.

With regard to direct marketing, the situation has not changed either: individuals must give active consent unless there was a prior contractual relationship and the advertising relates to products similar to those of the prior contract. Furthermore, the advertising must be clearly recognisable as such, and every advertising mail must offer the possibility of revoking consent.

However, much else has yet to be clarified. Questions such as the competence of the UK Data Protection Authority, the Information Commissioner’s Office (ICO), as well as the fate of its ongoing investigations, have not yet been answered. As of now, companies with their original EU Headquarters in the UK will have to designate a new Lead Supervisory Authority (Art. 56 GDPR) for their business in the EU.

The coming months will show whether questions of high relevance to businesses’ day-to-day practice can be answered reassuringly.

European Commission proposes draft “Digital Services Act” and “Digital Markets Act”

21. December 2020

On December 15th, the European Commission published drafts of the “Digital Services Act” (“DSA”) and the “Digital Markets Act” (“DMA”), which are intended to rein in large online platforms and stimulate competition.

The DSA is intended to rework the 20-year-old e-Commerce Directive and introduce a paradigm shift in accountability. Under the DSA, platforms would have to prove that they acted in a timely manner in removing or blocking access to illegal content, or that they had no actual knowledge of such content. Violators would face fines of up to 6% of annual revenue. Authorities could order providers to take action against specific illegal content, after which they must provide immediate feedback on what action was taken and when. Providing false, incomplete or misleading information as part of the reporting requirement, or failing to conduct an on-site inspection, could result in fines of up to 1% of annual revenue. The scope of said illegal content is to include, for example, criminal hate comments, discriminatory content, depictions of child sexual abuse, non-consensual sharing of private images, unauthorized use of copyrighted works, and terrorist content. Hosting providers would be required to establish efficient notice-and-action mechanisms that allow individuals to report posts they deem illegal and have them acted upon. Platforms would not only be required to remove illegal content, but also to explain to users why the content was blocked and give them the opportunity to complain.

Any advertising on ad-supported platforms would be required to be clearly identifiable as advertising and clearly state who sponsored it. Exceptions are to apply to smaller journalistic portals and bloggers, while even stricter rules would apply to large platforms. For example, platforms with more than 45 million active users in the EU could be forced to grant comprehensive access to stored data, provided that trade secrets are not affected, and to set up archives that make it possible to identify disinformation and illegal advertising.

Social network operators would have to conduct annual risk assessments and review how they deal with systemic threats, such as the spread of illegal content. They would also be required to provide clear, easy-to-understand and detailed reports at least once a year on the content moderation they have carried out during that period.

Newly appointed “Digital Services Coordinators” in each EU Member State are supposed to enforce the regulation, while a new European committee is to ensure that the DSA is applied uniformly across the EU. At the Digital Services Coordinators’ demand, platforms would have to provide researchers with key data so that they can investigate the platforms’ relevant activities.

The DMA includes a list of competition requirements for large platforms, so-called “gatekeepers”, that have a monopoly-like status. The regulations aim to strengthen smaller competitors and prevent the large gatekeepers from using their dominance to impose practices perceived as unfair. Gatekeepers would neither be allowed to exclusively pre-install their own applications, nor to force other operating system developers or hardware manufacturers to pre-install programs exclusively from the gatekeeper’s company. In addition, preventing users from uninstalling included applications would be prohibited, as would other common measures of self-preference. For example, gatekeepers would no longer be allowed to use data generated by their services for their own commercial activities without also making the information available to other commercial users, and a provider wanting to merge data generated by different portals would have to obtain explicit consent from users to do so.

The publication of the DSA and the DMA is the next step in the European Commission’s 2020 European strategy for data, following the proposal of the Data Governance Act in November. Like the Data Governance Act, the DSA and DMA aim to push back the dominance of tech giants, particularly those from the U.S. and China, while promoting competition.

EDPS considers Privacy Shield replacement unlikely for a while

18. December 2020

The data transfer agreements between the EU and the USA, namely Safe Harbor and its successor Privacy Shield, have suffered a hard fate for years. Both were declared invalid by the Court of Justice of the European Union (CJEU) in proceedings initiated by Austrian lawyer and privacy activist Max Schrems against Facebook. In both cases, the court came to the conclusion that the agreements did not meet the requirements for guaranteeing equivalent data protection standards and thus violated Europeans’ fundamental rights, due to the transfer of data to US law enforcement agencies enabled by US surveillance laws.

The judgement marking the end of the EU-US Privacy Shield (“Schrems II”) has a huge impact on EU companies doing business with the USA, which are now expected to rely on Standard Contractual Clauses (SCCs). However, the CJEU tightened the requirements for their use: going forward, companies have to determine whether there is an adequate level of data protection in the third country. In particular cases, additional measures may therefore need to be taken to ensure a level of protection essentially equivalent to that in the EU.

Despite this, companies were hoping for a new transatlantic data transfer pact. However, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski expressed doubts that an agreement would be reached in the near future:

I don’t expect a new solution instead of Privacy Shield in the space of weeks, and probably not even months, and so we have to be ready that the system without a Privacy Shield like solution will last for a while.

He justified his skepticism with reference to the incoming Biden administration, which may have other priorities than possible changes to American national security laws. An agreement on a new data transfer mechanism would, after all, depend on aligning US national security laws with EU fundamental rights.

With that in mind, the EU does not remain inactive and is trying to devise different ways to maintain its data transfers with the rest of the world. In this regard, the EDPS welcomed the European Commission’s proposed revisions to the SCCs, which take into consideration the provisions laid down in the CJEU’s “Schrems II” judgement.

The proposed Standard Contractual Clauses look very promising and they are already introducing many thoughts given by the data protection authorities.

Swedish court confirms Google’s violations of the GDPR

16. December 2020

The Administrative Court of Stockholm announced on November 23rd, 2020, that it had rejected Google LLC’s appeal against the decision of the Swedish Data Protection Authority (Datainspektionen) determining Google’s violations of the GDPR. Google as a search engine operator had not fulfilled its obligations regarding the right to be forgotten (RTBF). However, the court reduced the fine from a total of SEK 75 million (approx. € 7,344,000) to SEK 52 million (approx. € 5,091,000).

The background to the case was the Swedish DPA’s 2017 audit of Google’s handling of delisting requests, i.e. requests for the removal of certain results from a search engine. The DPA concluded the inspection by ordering Google to delist certain individuals’ names due to inaccurate, irrelevant and superfluous information. In 2018, the DPA initiated a follow-up audit because of indications that Google had not fully complied with the previously issued order, which resulted in an administrative fine of SEK 75 million in March 2020.

The DPA drew attention to the fact that the GDPR increases the obligations of data controllers and data processors and strengthens the rights of individuals, which include the right to have search results delisted. However, Google had not fully complied with its obligations, as it had not properly removed two of the search result listings that the DPA had ordered it to delete: in one case, Google had interpreted too narrowly which web addresses to remove; in the other, it had failed to remove the listing without undue delay.

Moreover, the DPA criticized Google’s procedure for managing delisting requests and found that it undermines data subjects’ rights. Following the removal of a search result listing, Google notifies the website to which the link is directed. The delisting request form, directed at the data subject raising the request, states that information on the removed web addresses can be provided to the webmaster. This information has to be seen as misleading, since it gives the data subject the impression that its consent to the notification is required in order to process the request. Such a practice might therefore result in individuals refraining from exercising their right to request delisting, in violation of Art. 5 (1) lit. a) GDPR. What is more, in the opinion of the DPA the delisting notifications to the webmasters are covered neither by legal obligations pursuant to Art. 6 (1) lit. c), 17 (2) GDPR, nor by legitimate interests pursuant to Art. 6 (1) lit. f) GDPR. Furthermore, Google’s routine of regularly sending information to webmasters constitutes processing of personal data that is incompatible with the purpose for which the data was originally collected, infringing Art. 5 (1) lit. b), 6 (4) GDPR.

Google appealed the DPA’s decision. However, the Swedish Administrative Court of Stockholm reaffirmed the DPA’s opinion and confirmed Google’s violations of the GDPR.

The court stated that the process for handling delisting requests must make it easier for individuals to exercise their rights; any process that restricts those rights may violate Art. 15 through 22 GDPR. The court also specified why the personal data had been processed beyond its original purpose: since the notifications are only sent after Google has removed a search result, the purpose of the processing has already expired by the time the notification is sent. The notification therefore cannot be considered effective in achieving the purpose specified by Google.

Google must now delist the specific search results and cease informing webmasters of requests. It must also adapt its data subject rights procedure within eight weeks after the court’s judgment gains legal force.

Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published “Recommendations for a way forward on the topic of encryption” on December 1st, in which it stresses the importance of striking a careful balance between protecting fundamental rights and ensuring law enforcement’s investigative powers.

The EU Council sees a dilemma between the need for strong encryption to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime on the other. It further notes:

“We acknowledge this dilemma and are determined to find ways that will not compromise either one, upholding the principle of security through encryption and security despite encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is supposed to be the joint implementation of standardized high-quality training programs for law enforcement officers that are tailored to the skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety“, is proclaimed as a further intention.

Next, the technology industry, civil society and the academic world are acknowledged as important partners with whom the EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named as the key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and what measures exactly the EU Council intends to implement. The list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal; instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more a statement of intent than implementable recommendations on encryption.

EU offers new alliance with the USA on data protection

4. December 2020

The European Commission and the High Representative of the Union for Foreign Affairs and Security Policy outlined a new EU-US agenda for global change, which was published on December 2nd, 2020. It constitutes a proposal for a new, forward-looking transatlantic cooperation covering a variety of matters, including data protection.

The draft plan states the following guiding principles:

  • Advancing global common goods, providing a solid base for stronger multilateral action and institutions that all like-minded partners will be able to join.
  • Pursuing common interests and leveraging collective strength to deliver results on strategic priorities.
  • Looking for solutions that respect common values of fairness, openness and competition – including where there are bilateral differences.

As stated in the draft plan, it is a “once-in-a-generation” opportunity to forge a new global alliance. It includes an appeal for the EU and US to bury the hatchet on persistent sources of transatlantic tension and join forces to shape the digital regulatory environment. The proposal aims to create a shared approach to enforcing data protection law and combatting cybersecurity threats, which could also include possible restrictive measures against attributed attackers from third countries. Moreover, a transatlantic agreement on Artificial Intelligence forms part of the recommendation, with the purpose of setting a blueprint for regional and global standards. The EU also wants to openly discuss diverging views on data governance and facilitate free data flow with trust on the basis of high safeguards. Furthermore, the proposal includes the creation of a specific dialogue with the US on the responsibility of online platforms and Big Tech, as well as the development of a common approach to protecting critical technologies.

The draft plan is expected to be submitted for endorsement by the European Council at a meeting on December 10-11th, 2020. It suggests an EU-US Summit in the first half of 2021 as the moment to launch the new transatlantic agenda.

The Controversy around the Council of the European Union’s Declaration on End-to-End Encryption

27. November 2020

In the course of November 2020, the Council of the European Union issued several draft versions of a joint declaration with the working title “Security through encryption and security despite encryption”. The drafts were initially intended only for internal purposes, but were leaked and first published by the Austrian broadcasting network “Österreichischer Rundfunk” (“ORF”) in an article by journalist Erich Möchel. Since then, the matter has sparked widespread public interest and media attention.

The controversy around the declaration arose when the ORF commentator Möchel presented further information from unknown sources according to which “competent authorities” shall be given “exceptional access” to end-to-end encrypted communications. This would mean that communications service providers like WhatsApp, Signal etc. would be obliged to allow a backdoor and create a general key to encrypted communications, which they would deposit with public authorities. Comparing the version of the declaration of 6 November 2020 with the previous version of 21 October 2020, he highlighted that the previous version states that additional practical powers shall be given to “law enforcement and judicial authorities”, whereas in the more recent version the powers shall be given to “competent authorities in the area of security and criminal justice”. He adds that the new, broader wording would also include European intelligence agencies and allow them to undermine end-to-end encryption. Furthermore, he indicated that plans to restrict end-to-end encryption in Western countries are not new, but were originally proposed by the “Five Eyes” intelligence alliance of the United States, Canada, the United Kingdom, Australia and New Zealand.

As a result of the ORF article, the supposed plans to restrict or ban end-to-end encryption have been widely criticised by politicians, journalists and NGOs, who state that any backdoor to end-to-end encryption would render secure encryption impossible.

However, while it can be verified that the “Five Eyes” propose the creation of general keys to access end-to-end encrypted communications, similar plans for the EU cannot be clearly deduced from the EU Council’s declaration at hand. The declaration itself recognises end-to-end encryption as highly beneficial to protect governments, critical infrastructures, civil society, citizens and industry by ensuring privacy, confidentiality and data integrity of communications and personal data. Moreover, it mentions that EU data protection authorities have identified it as an important tool in light of the Schrems II decision of the CJEU. At the same time, the Council’s declaration illustrates that end-to-end encryption poses large challenges for criminal investigations when gathering evidence in cases of cyber crime, making it at times “practically impossible”. Lastly, the Council calls for an open, unbiased and active discussion with the tech industry, research and academia in order to achieve a better balance between “security through encryption and security despite encryption”.

Möchel’s sources for EU plans to ban end-to-end encryption through general keys remain unknown and unverifiable. Despite general concerns for overarching surveillance powers of governments, the public can only approach the controversy around the EU Council’s declaration with due objectivity and remain observant on whether or how the EU will regulate end-to-end encryption and find the right balance between the privacy rights of European citizens and the public security and criminal justice interests of governments.
