Category: GDPR

EDPB issues guidance on data transfers following Schrems II

17. November 2020

Following the recent judgment C-311/18 (Schrems II) of the Court of Justice of the European Union (CJEU), the European Data Protection Board (EDPB) published its “Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data” on November 11th. These measures are to be considered when assessing transfers of personal data to countries outside of the European Economic Area (EEA), so-called third countries. The recommendations are subject to public consultation until the end of November. Complementing them, the EDPB published “Recommendations on the European Essential Guarantees for surveillance measures”. Taken together, both documents serve as guidelines for assessing whether measures are sufficient to meet the standards of the General Data Protection Regulation (GDPR), even if data is transferred to a country lacking protection comparable to that of the GDPR.

The EDPB sets out a six-step plan to follow when checking whether a data transfer to a third country meets the standards set forth by the GDPR.

The first step is to map all transfers of personal data undertaken, especially transfers to third countries. The transferred data must be adequate, relevant and limited to what is necessary in relation to the purpose. A major factor to consider is the storage of data in clouds. Furthermore, onward transfers made by processors should be included. In a second step, the transfer tool used needs to be verified and matched to those listed in Chapter V of the GDPR. The third step is assessing whether anything in the law or practice of the third country can impinge on the effectiveness of the safeguards of the transfer tool. The aforementioned Recommendations on the European Essential Guarantees are intended to help evaluate a third country's laws regarding access to data by public authorities for the purpose of surveillance.

If the conclusion from the previous steps is that the third country's legislation impinges on the effectiveness of the Article 46 GDPR tool, the fourth step is to identify and adopt supplementary measures necessary to bring the level of protection of the data transfer up to the EU standard, or at least its equivalent. Recommendations for such measures are listed in Annex 2 of the EDPB's Schrems II Recommendations. They may be of a contractual, technical, or organizational nature. In Annex 2 the EDPB describes and evaluates seven technical use cases. Five were deemed to be scenarios for which effective measures could be found. These are:

1. Data storage in a third country that does not require access to the data in the clear.
2. Transfer of pseudonymized data.
3. Encrypted data merely transiting third countries.
4. Transfer of data to recipients specially protected by law.
5. Split or multi-party processing.

Perhaps even more relevant are the two scenarios for which the EDPB found no effective measures and which it therefore deemed non-compliant with GDPR standards:

6. Transfer of data in the clear (to cloud services or other processors)
7. Remote access (from third countries) to data in the clear, for business purposes, such as, for example, Human Resources.

These two scenarios are frequently used in practice. Still, the EDPB recommends not to carry out such transfers for the time being.
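To make the distinction between the compliant and non-compliant scenarios more concrete, the following is a minimal, hypothetical sketch of scenario 2 (transfer of pseudonymized data); it is not taken from the EDPB recommendations. The idea is that the EU exporter replaces direct identifiers with keyed pseudonyms before transfer, while the secret key remains exclusively with the exporter in the EEA, so the third-country importer cannot re-identify data subjects on its own.

```python
import hmac
import hashlib

# Hypothetical key held only by the EU data exporter; it must never be
# transferred alongside the data, otherwise pseudonymization is reversible
# by the importer.
SECRET_KEY = b"kept-only-by-the-eu-exporter"

def pseudonymize(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier via HMAC-SHA256.

    The same identifier always maps to the same pseudonym, so records can
    still be linked abroad, but the identifier itself cannot be recovered
    without the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record before transfer (names and fields are illustrative):
record = {"email": "jane.doe@example.com", "purchase_total": 42.50}

# What actually leaves the EEA: the identifier is replaced by its pseudonym.
transfer_record = {
    "user_pseudonym": pseudonymize(record["email"]),
    "purchase_total": record["purchase_total"],
}
```

Whether keyed hashing alone suffices depends on the additional conditions the EDPB attaches to this scenario, e.g. that the remaining attributes must not allow re-identification either.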
Examples of contractual measures are the obligation to implement necessary technical measures, measures regarding transparency of (requested) access by government authorities, and measures to be taken against such requests. Organizational measures include internal policies and clear responsibilities regarding government interventions. Complementing this, the European Commission published a draft of standard contractual clauses for transferring personal data to non-EU countries.

The last two steps are undertaking the formal procedural steps to adopt the supplementary measures required and re-evaluating the previous steps at appropriate intervals.

Even though these recommendations are not (yet) binding, companies should take a close look at them and check whether their data transfers comply with the new situation.

Poland: Addresses of judges, politicians and pro-life activists published on Twitter

12. November 2020

In recent days, social networks in Poland have teemed with posts containing the private addresses and telephone numbers of judges of the Constitutional Tribunal, politicians, and activists who openly support the abortion ruling. In response to the publication of this information on Twitter, the President of the Personal Data Protection Office (UODO) took immediate steps to protect the personal data and privacy of these persons.

The background to this was the judgment of the Constitutional Tribunal repealing the provisions allowing abortion in cases of, for example, serious genetic defects or severe impairment of the fetus. This provoked resistance from part of Polish society and led to widespread street protests. Unfortunately, the agitation turned into invective, destruction of property, public disorder and personal attacks. As a result, the personal data of people supporting the prohibition of abortion have been shared thousands of times on social media as well. For this reason, numerous protesters appeared at the published addresses, covered the walls of surrounding buildings with vulgar inscriptions, and the addressees began to receive packages, e.g. with a set of hangers.

On October 29th, 2020 the President of the UODO responded to the case:

Publishing private addresses and contact details of pro-life activists, politicians and judges by users of the Twitter social network is an action leading to the disclosure of a wide sphere of privacy, and thus posing threats to health and life, such as possible acts of violence and aggression directed against these people and their family members.

The announcement stated that the President of the UODO requested immediate action by the Irish supervisory authority, which is responsible for the processing of personal data by Twitter. Pointing out the enormous scale of the threats, he indicated the need to verify the response time to reported irregularities and the possibility of introducing automated solutions to prevent the rapid spread of such content by other users of the platform. He also notified the law enforcement authorities that Twitter users had committed a crime consisting of the processing of personal data without a legal basis. Lawfulness had been guaranteed neither by consent according to Art. 6 (1) lit. a GDPR nor by legitimate interests pursuant to Art. 6 (1) lit. f GDPR, nor by any other legal basis. Thus, the processing has to be seen as illegitimate, as also stated by the President of the UODO. The law enforcement authorities will be obliged to examine and document the scope of personal data disclosed in a way that violates the principles of personal data protection and to determine the group of entities responsible for the unlawful data processing. The President of the UODO also applied to the Minister of Justice – Public Prosecutor General to place this case under special supervision due to the escalation of conflict and aggression, which poses a high risk to the vital interests of both the people whose data is published on social media and their family members.

In conclusion, the President of the UODO added:

The intensification of actions of all competent authorities in this matter is necessary due to the unprecedented nature of the violations and the alarming announcements of disclosing the data of more people, as well as the deepening wave of aggression.

The CCPA is not enough: Californians will vote on the CPRA

28. October 2020

On 3 November 2020, the day of the US Presidential Election, Californian citizens will also be able to vote on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot. The CPRA is meant to expand the privacy rights of Californian consumers under the California Consumer Privacy Act of 2018 (“CCPA”), which only came into effect on 1 January 2020.

The NGO “Californians for Consumer Privacy”, led by privacy activist Alastair Mactaggart, initiated the upcoming state ballot on the CPRA. Mactaggart's NGO had already qualified for a state ballot on the adoption of the CCPA by collecting over 629,000 signatures of California citizens in 2018. However, the NGO dropped that proposal after California state legislators persuaded the initiators that they would pass the CCPA through the legislative process. But because several significant amendments to the original proposal were passed during the legislative process, the NGO created the new CPRA initiative in 2020. This time, the group submitted more than 900,000 signatures. The CPRA is supposed to expand on the provisions of the CCPA. If the CPRA is approved by California voters on November 3rd, it could not easily be amended; changes would require further direct voter action. Most provisions of the CPRA would become effective on 1 January 2023 and would only apply to information collected from 1 January 2022 onwards.

Some of the key provisions of the newly proposed CPRA seem to draw inspiration from the European General Data Protection Regulation (“GDPR”). They include the establishment of an enforcement agency (the “California Privacy Protection Agency”), explicit protection of consumers' “Sensitive Personal Information”, and the right to rectify inaccurate personal information. The CPRA would furthermore require businesses to comply with information obligations comparable to those required by Art. 12-14 GDPR.

As the day of the state ballot is fast approaching, recent polls suggest that the CPRA will likely pass and complement the already existing CCPA, forming the US’ strictest privacy rules to date.

EDPB addresses Privacy by Design and Default in 40th Plenary Session

26. October 2020

Following public consultation, the European Data Protection Board (EDPB) adopted the final version of its Guidelines on Data Protection by Design & Default during its 40th plenary session on October 20th, 2020.

The Guidelines’ focal point is the obligation of Data Protection by Design and by Default as set forth in Article 25 GDPR. At its core is the effective implementation of the data protection principles and of data subjects’ rights and freedoms by design and by default: controllers are obliged to implement appropriate technical and organisational measures, as well as the necessary safeguards, designed to put the data protection principles into practice and to protect the rights and freedoms of data subjects when processing their personal data.

The Guidelines further contain guidance on how to effectively implement the data protection principles in Article 5 GDPR, listing key design and default points, as well as giving examples through practical cases. They also provide recommendations for controllers on how to achieve Privacy by Design and Default.

However, this is not the only decision made by the EDPB during the plenary. The EDPB decided to set up a Coordinated Enforcement Framework (CEF), which provides a structure for coordinating recurring annual activities by EDPB Supervisory Authorities. The objective is to coordinate joint activities, which may range from joint awareness raising and information gathering to enforcement sweeps and joint investigations.

The EDPB hopes that this will raise awareness and give data subjects more confidence to exercise their rights under the GDPR.

Decision to fine the Norwegian Public Roads Administration

23. October 2020

The Norwegian Data Protection Authority (Datatilsynet) has fined the Norwegian Public Roads Administration (Statens vegvesen) EUR 37,400 (NOK 400,000) for improprieties related to the use of the monitoring system installed on toll roads in Norway. The violations concerned the processing of personal data for purposes incompatible with those originally stated and the failure to erase video recordings within 7 days of their registration.

The penalized entity is the controller of a system processing personal data obtained from toll roads in Norway. This system records personal data that in particular enables the identification of vehicles (and hence their owners) passing through public toll stations. The primary purpose of processing these personal data was to ensure safety on public roads and to optimize the operation of the tunnel and drawbridges in the county of Østfold. The Norwegian Public Roads Administration, however, used the recordings in particular to document the improper fulfilment of concluded contracts by certain parties. According to the Norwegian Data Protection Authority, such a procedure is unlawful and not compliant with the originally stated purposes.

The Norwegian Public Roads Administration was also accused of infringements related to the timely deletion of personal data. In accordance with Norwegian regulations, recordings from monitoring (and thus personal data) may be stored until the reason for their storage ceases, but no longer than 7 days from recording the material. In the course of the proceedings it turned out that the monitoring system had no function for deleting personal data at all. Therefore, the Norwegian Public Roads Administration was not able to fulfil its obligations under Art. 17 GDPR. The lack of this functionality additionally indicates that the controller, when implementing the monitoring system, also disregarded the requirements specified in Art. 25 GDPR.

Taking these circumstances into account, the Norwegian Data Protection Authority found a violation of the aforementioned GDPR provisions.

Appeal against record fine for GDPR violation in Poland dismissed

22. October 2020

On 10th September 2019 the Polish Data Protection Commissioner imposed a record fine of more than PLN 2.8 million (the equivalent of €660,000) on the company Morele.net for failing to implement appropriate technical and organisational measures and for the lack of verifiability of prior consents to data processing. The Krakow-based company runs various online shops and stores customer data in a central database. According to the Personal Data Protection Office (UODO), 2.2 million customers were affected.

The starting point was, in particular, two incidents at the end of 2018, when unauthorised persons gained access to the company's customer database and the personal data it contained. The company notified the UODO of the data breach; the authority accused it in particular of violating the confidentiality principle (Articles 5 (1) lit. f, 24 (1), 25 (1), 32 (1) lit. b, d, (2) GDPR) by failing to use sufficient technical and organisational measures to safeguard the data of its customers, such as two-factor authentication. As claimed by the UODO, the selection of an authentication mechanism should always be preceded by an adequate risk analysis with a corresponding determination of protection requirements. The company did not adequately comply with this. However, it should have been sufficiently aware of the phishing risks, as the Computer Emergency Response Team (CERT Polska) had already pointed them out.

In addition, the UODO accused the company of violating the lawfulness, fairness, transparency and accountability principles (Articles 5 (1) lit. a, (2), 6 (1), 7 (1) GDPR) by being unable to prove that (where necessary) the personal data from installment applications had been processed on the basis of the data subjects' consent. Furthermore, after a risk analysis, the company deleted the corresponding data from its database in December 2018, but according to the UODO the deletion was not sufficiently documented.

Many aspects played a decisive role in assessing the fine: above all, the extent of the violation (2.2 million customers) and the fact that the company processes personal data professionally in the course of its business activities and therefore has to apply a higher level of security. However, mitigating circumstances were also taken into account, such as good cooperation with the supervisory authority, no previous ascertainable violations of the GDPR, and no identifiable financial advantages for the company.

On 3rd September 2020, the Provincial Administrative Court (WSA) in Warsaw issued a judgment on Morele.net’s appeal against the decision. The WSA dismissed the appeal and considered that the decision on the fine imposed on the company was justified. Furthermore, the WSA stated that the UODO had correctly assessed the facts in the case concerned and considered that the fine imposed was high but within the limits of the law and justified by circumstances. It is expected that the company will lodge a complaint with the Supreme Administrative Court of Poland.

H&M receives record-breaking 35 Million Euro GDPR Fine in Germany

21. October 2020

At the beginning of October, the Hamburg Data Protection Commissioner (“HmbBfDI”) imposed a record-breaking 35,258,707.95 Euro GDPR fine on the German branch of the Swedish clothing retailer H&M. It is the highest fine a German Data Protection Authority has ever issued for a GDPR violation.

Since 2014, the management of the H&M service centre in Nuremberg had extensively monitored the private lives of their employees in various ways. Following employees’ holidays and sick leaves, team leaders would conduct so-called “Welcome Back Talks” in which they recorded employees’ holiday experiences, symptoms of illness and medical diagnoses. Some H&M supervisors compiled a broad database on their employees’ private lives, recording details on family issues and religious beliefs from one-on-one talks and even corridor conversations. The records had a high level of detail, were updated over time, and in some cases were shared with up to 50 other managers throughout the company. The supervisors also used this Personal Data to create profiles of their employees and to base future employment decisions and measures on this information. The clandestine data collection only became known as a result of a configuration error in 2019, when the notes were accessible company-wide for a few hours.

After the discovery, H&M executives presented the HmbBfDI with a comprehensive concept for improving Data Protection at the Nuremberg site. This included newly appointing a Data Protection coordinator, monthly Data Protection status updates, more strongly communicated whistleblower protection, and a consistent process for granting data subject rights. Furthermore, H&M apologised to its employees and paid the affected people considerable compensation.

With its secret monitoring system at the service centre in Nuremberg, H&M severely violated the GDPR principles of lawfulness, fairness, and transparency of processing pursuant to Art. 5 (1) lit. a and Art. 6 GDPR, because it did not have a legal basis for collecting this Personal Data from its employees. The HmbBfDI commented on the magnitude of the fine in his statement, saying that “the size of the fine imposed is appropriate and suitable to deter companies from violating the privacy of their employees”.

First judicial application of Schrems II in France

20. October 2020

France’s highest administrative court (Conseil d’État) issued a summary judgment that rejected a request for the suspension of France’s centralized health data platform – Health Data Hub (HDH) – on October 13th, 2020. The Conseil d’État further recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub supposed to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure. On April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The worry was that the hosting of data by a company which is subject to U.S. laws entails data protection risks due to the potential surveillance done under U.S. national surveillance laws, as has been presented and highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments on the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could be subject, in theory, to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be totally discounted that U.S. public authorities could request access from Microsoft and its Irish affiliate to some of the data held in the HDH. However, the summary proceedings judge did not read the CJEU’s Schrems II ruling as also requiring an examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates acting as data processors. EU law does not prohibit subcontracting the processing of personal data in the EU to U.S. companies. In addition, the Conseil d’État considered the alleged violation of the GDPR in this case purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data enabled by the HDH. The Conseil d’État concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

EU looking to increase Enforcement Powers over Tech Giants

24. September 2020

In an interview with The Financial Times on Sunday, EU-Commissioner Thierry Breton stated that the European Union is considering plans to increase its enforcement powers regarding tech giants.

This empowerment is supposed to include punitive measures such as forcing tech firms to break up and sell their EU operations if their market dominance becomes too great. The EU is further considering the power to bar tech companies from the EU single market entirely. Breton stated that these measures would of course only be used in extreme circumstances, but did not elaborate on what would qualify as extreme.

“There is a feeling from end-users of these platforms that they are too big to care,” Thierry Breton told The Financial Times. In the interview, he compared tech giants’ market power to the big banks before the financial crisis. “We need better supervision for these big platforms, as we had again in the banking system,” he stated.

In addition, the European Union is considering a rating system in which companies would be given scores in different categories, such as tax compliance and taking action against illegal content. However, Breton said that the intent is not to make companies liable for their users’ content.

Breton further said that the first drafts of the new law will be ready by the end of the year.

Once the final draft is in place, it will require approval both by the European Parliament as well as the European Council, before it can be enacted.

Privacy Activist Schrems unleashes 101 Complaints

21. September 2020

Lawyer and privacy activist Maximilian Schrems has become known for his legal actions leading to the invalidation of “Safe Harbor” in 2015 and of the “EU-U.S. Privacy Shield” this year (we reported). Following the landmark court decision on the “EU-U.S. Privacy Shield”, Schrems recently announced on the website of his NGO “noyb” (none of your business) that he has filed 101 complaints against 101 European companies in 30 different EU and EEA countries with the responsible Data Protection Authorities. Schrems thereby exercised the right, pursuant to Art. 77 GDPR, of every data subject to lodge a complaint with a supervisory authority if he or she considers that the processing of personal data relating to him or her infringes the Regulation.

The complaints concern the companies’ continued use of Google Analytics and Facebook Connect, which transfer personal data about each website visitor (at least IP address and cookie data) to Google and Facebook, which reside in the United States and fall under U.S. surveillance laws such as FISA 702. Schrems also published a list of the 101 companies, which include Sky Deutschland, the University of Luxembourg and the Cyprus Football Association. With his symbolic action against 101 companies, Schrems wanted to point to the widespread inactivity among many companies that still do not take the data protection rights of individuals seriously despite the recent ruling by the Court of Justice of the European Union.

In response, the European Data Protection Board (“EDPB”) has set up a “task force” to handle complaints against European companies using Google Analytics and Facebook services. The task force shall analyse the matter and ensure close cooperation among the members of the Board, which consists of all European supervisory authorities as well as the European Data Protection Supervisor.
