
China issues new draft Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft of a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime and appears to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection regulations will consist of China’s Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation addresses issues raised by new technologies and applications in around 70 articles. The fines it provides for non-compliance are substantial and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, the obligations of data processors, the authority in charge of personal information and legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalents under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China and to report the relevant information of that organization or representative to the Chinese regulators.

In comparison to the GDPR, the draft PIPL extends the term “sensitive data” to also include nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may damage personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information. Such transfers shall be possible if the transfer is certified by recognized institutions or if the data processor executes a cross-border transfer agreement with the recipient located outside of China, ensuring that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator, or the volume of data it processes exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.

It should further be kept in mind that the draft PIPL enlarges the range of penalties beyond those provided in the Cybersecurity Law, which will put much higher liability pressure on controllers operating in China.

The public comment period on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet certain when the draft legislation will come into full effect.

California Voters approve new Privacy Legislation CPRA

20. November 2020

On November 3rd, 2020, Californian citizens were able to vote on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot (we reported). As polls leading up to the vote had already suggested, California voters approved the new privacy legislation, also known as “Prop 24”, with 56.2% voting yes to 43.8% voting no. Most provisions of the CPRA will enter into force on 1 January 2021 and will become applicable to businesses on 1 January 2023. By and large, it will only apply to information collected from 1 January 2022 onwards.

The CPRA will complement and expand privacy rights of California citizens considerably. Among others, the amendments will include:

  • Broadening the term “sale” of personal information to “sale or share” of personal information,
  • Adding new requirements to qualify as a “service provider” and defining the term “contractor” anew,
  • Defining the term “consent”,
  • Introducing the category of “Sensitive Information”, including a consumer’s Right to limit the use of “Sensitive Information”,
  • Introducing the concept of “Profiling” and granting consumers the Right to Opt-out of the use of their personal information for Automated Decision-Making,
  • Granting consumers the Right to correct inaccurate information,
  • Granting consumers the Right to Data Portability, and
  • Establishing the California Privacy Protection Agency (CalPPA) with a broad scope of responsibilities and enforcement powers.

Ensuring compliance with the CPRA will require proper preparation. Affected businesses will have to review existing processes or implement new processes in order to guarantee the newly added consumer rights, meet the contractual requirements with service providers/contractors, and show compliance with the new legislation as a whole.

In an interview after the passage of the CPRA, Alastair Mactaggart, the initiator of both the CCPA and the CPRA, commented that

Privacy legislation is here to stay.

He hopes that California Privacy legislation will be a model for other states or even the U.S. Congress to follow, in order to offer consumers in other parts of the country the same Privacy rights that Californians now have.

Canadian Government proposes new federal privacy law

18. November 2020

On November 17th, Navdeep Bains, the Canadian Minister of Innovation, Science and Economic Development, introduced Bill C-11, which is intended to modernize and reshape the Canadian privacy framework and to bring it in line with EU and U.S. legislation. Its short title is Digital Charter Implementation Act, 2020 (DCIA). A fact sheet accompanying the DCIA states:

“… If passed, the DCIA would significantly increase protections to Canadians’ personal information by giving Canadians more control and greater transparency when companies handle their personal information. The DCIA would also provide significant new consequences for non-compliance with the law, including steep fines for violations. …”

Part one of the DCIA is the Consumer Privacy Protection Act (CPPA), which is intended to establish a new privacy law for the Canadian private sector. New consent rules are to be adopted, data portability is introduced as a requirement, and data subjects’ access to their personal data is enhanced, as are their rights to have personal data erased. Data subjects further have the right to request that businesses explain how a prediction, recommendation or decision made by an automated decision-making system was reached. Furthermore, they have the right to know how personal data is being used, as well as the right to review and challenge the amount of personal data being collected by a company or government. On demand, a privacy management program must be provided to the Canadian Office of the Privacy Commissioner (OPC). For non-compliance, companies face possible fines of up to 5% of global revenue or C$25 million, whichever is higher. According to Bains, these are the highest such fines among the G7 nations. Businesses can ask the OPC to approve their codes of practice and certification systems, and may disclose de-identified data to public entities for socially beneficial purposes.

Bill C-11 further contains the “Personal Information and Data Protection Tribunal Act”, which is supposed to make the enforcement of privacy rights faster and more efficient. For that purpose, more resources are committed to the OPC. The OPC can now issue “orders”, which have the same effect as Federal Court orders. Further, the OPC may force companies to comply or order them to stop collecting and using personal data. The newly formed Data Protection Tribunal can impose penalties and hear appeals regarding orders issued by the OPC.

Lastly, a private right of action is also included in the bill. This allows individuals to sue companies within two years after the commissioner issues a finding of privacy violation that is upheld by the Tribunal.

European Commission issues draft on Standard Contractual Clauses

A day after the European Data Protection Board (EDPB) issued its recommendations on supplementary measures, the European Commission issued a draft on new Standard Contractual Clauses (SCCs) for data transfers to non-EU countries (third countries) on November 12th. The draft is open for feedback until December 10th, 2020, and includes a 12-month transition period during which companies are to implement the new SCCs. These SCCs are supposed to assist controllers and processors in transferring personal data from an EU country to a third country by implementing measures that guarantee GDPR standards and by taking into account the Court of Justice of the European Union’s (CJEU) “Schrems II” ruling.

The Annex includes modular clauses suitable for four different data transfer scenarios: (1) controller-to-controller transfers; (2) controller-to-processor transfers; (3) processor-to-processor transfers; (4) processor-to-controller transfers. The latter two scenarios are newly covered by these SCCs. Since the clauses in the Annex are modular, they can be mixed and matched into a contract fitting the situation at hand. Furthermore, more than two parties can adhere to the SCCs, and the modular approach even allows additional parties to accede later on.

The potential for government access to personal data is distinctly addressed, since this was a main issue following the “Schrems II” ruling. Potential concerns are met by clauses that address how the data importer must react when the laws of the third country impinge on its ability to comply with the contract, especially the SCCs, and how it must react in case of government interference. Said measures include notifying the data exporter and the data subject of any government interference, such as legally binding requests for access to personal data, and, if possible, sharing further information on these requests on a regular basis, documenting them and challenging them legally. Termination clauses have been added for cases in which the data importer can no longer comply, e.g. because of changes in the third country’s law.

Further clauses address matters such as data security, transparency, accuracy and onward transfers of personal data, all of which were covered in the older SCCs but are now being updated.

EDPB issues guidance on data transfers following Schrems II

17. November 2020

Following the recent judgment C-311/18 (Schrems II) of the Court of Justice of the European Union (CJEU), the European Data Protection Board (EDPB) published “Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data” on November 11th. These measures are to be considered when assessing transfers of personal data to countries outside of the European Economic Area (EEA), so-called third countries. The recommendations are subject to public consultation until the end of November. Complementing them, the EDPB published “Recommendations on the European Essential Guarantees for surveillance measures”. Taken together, the two sets of recommendations serve as guidelines for assessing whether measures are sufficient to meet the standards of the General Data Protection Regulation (GDPR), even if data is transferred to a country lacking protection comparable to that of the GDPR.

The EDPB sets out a six-step plan to follow when checking whether a data transfer to a third country meets the standards set forth by the GDPR.

The first step is to map all transfers of personal data undertaken, especially transfers into a third country. The transferred data must be adequate, relevant and limited to what is necessary in relation to the purpose. A major factor to consider is the storage of data in clouds. Furthermore, onward transfers made by processors should be included. In a second step, the transfer tool used needs to be verified and matched against those listed in Chapter V of the GDPR. The third step is assessing whether anything in the law or practice of the third country can impinge on the effectiveness of the safeguards of the transfer tool. The aforementioned Recommendations on the European Essential Guarantees are supposed to help evaluate a third country’s laws regarding access to data by public authorities for surveillance purposes.

If the conclusion from the previous steps is that the third country’s legislation impinges on the effectiveness of the Article 46 GDPR transfer tool, the fourth step is identifying and adopting the supplementary measures necessary to bring the level of protection of the data transfer up to the EU standard, or at least an essentially equivalent level. Recommendations for such measures are listed in Annex 2 of the EDPB Schrems II Recommendations. They may be of a contractual, technical or organizational nature. In Annex 2, the EDPB describes and evaluates seven technical use cases. Five were deemed to be scenarios for which effective measures could be found. These are:

1. Data storage in a third country that does not require access to the data in the clear.
2. Transfer of pseudonymized data (a minimal illustration follows below).
3. Encrypted data merely transiting third countries.
4. Transfer of data to recipients specifically protected by law.
5. Split or multi-party processing.
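
To give a sense of what such a technical measure might look like in practice, here is a minimal, illustrative sketch of pseudonymization prior to a cross-border transfer (use case 2 above). The field names, the secret key and the pseudonymise helper are hypothetical, not taken from the EDPB recommendations; real pseudonymization under those recommendations additionally requires that the information needed to re-identify individuals remains solely with the data exporter in the EEA.

```python
# Illustrative sketch only (not the EDPB's own example): replacing direct
# identifiers with a keyed hash before the data set leaves the EEA.
import hashlib
import hmac

# Assumption: this key is generated and kept by the data exporter inside the EEA.
SECRET_KEY = b"keep-this-key-with-the-exporter-in-the-EEA"

def pseudonymise(record: dict, direct_identifiers: tuple = ("name", "email")) -> dict:
    """Return a copy of the record in which direct identifiers are replaced
    by keyed hashes, so the importer cannot attribute the data to a person
    without the key held by the exporter."""
    out = dict(record)
    for field in direct_identifiers:
        if field in out:
            out[field] = hmac.new(
                SECRET_KEY, str(out[field]).encode("utf-8"), hashlib.sha256
            ).hexdigest()
    return out

# Only the pseudonymized record would be transferred to the third country.
record = {"name": "Jane Doe", "email": "jane@example.com", "purchase_history": "..."}
print(pseudonymise(record))
```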

Perhaps even more relevant are the two scenarios for which the EDPB found no effective measures and which it therefore deemed not compliant with GDPR standards:

6. Transfer of data in the clear (to cloud services or other processors)
7. Remote access (from third countries) to data in the clear, for business purposes, such as, for example, Human Resources.

These two scenarios are frequently used in practice. Still, the EDPB recommends not carrying out such transfers for the time being.

Examples of contractual measures are the obligation to implement the necessary technical measures, measures regarding transparency about (requested) access by government authorities and measures to be taken against such requests; examples of organizational measures are internal policies and clearly assigned responsibilities for dealing with government interventions. Accompanying this, the European Commission published a draft on standard contractual clauses for transferring personal data to non-EU countries.

The last two steps are taking the formal procedural steps required to adopt the supplementary measures and re-evaluating the previous steps at appropriate intervals.

Even though these recommendations are not (yet) binding, companies should take a closer look at them and check whether their data transfers comply with the new situation.

Brazil Update: Senate approves President-appointed ANPD Board of Directors

11. November 2020

Since 18 September 2020, the main provisions of the Brazilian Data Protection Law “LGPD” have been in effect. At the same time, Brazilian businesses have been facing legal uncertainty because Brazil’s national Data Protection Authority (“ANPD”) is still not fully functional (we reported). The ANPD shall provide businesses with vital guidance, inter alia, by assessing foreign countries’ level of data protection for international data transfers, Art. 34 LGPD.

On 15 October 2020, the President of Brazil appointed the five members of the ANPD Board of Directors. Following the formal approval process for presidential appointees in Brazil (“Sabatina”), the Infrastructure and Services Commission of Brazil’s Senate approved the President’s appointees on 19 October 2020.

Finally, on 20 October 2020, the Senate’s plenary approved the five appointees. This marks another major step towards the ANPD becoming fully operational. The terms of the Board of Directors will be staggered:

  • Serving a six-year term: Waldemar Ortunho, current president of Telebras, a state-owned telecommunications company
  • Serving a five-year term: Arthur Pereira Sabbat, currently the Director of the Institutional Security Office (GSI) for the Government’s cybersecurity
  • Serving a four-year term: Joacil Basilio Rael, currently advisor at Telebras
  • Serving a three-year term: Nairane Farias Rabelo, currently Partner at a law firm specialized in Tax Law and Data Protection Law
  • Serving a two-year term: Miriam Wimmer, currently a Director of Telecommunications Services at the Brazilian Ministry of Science, Technology, Innovation and Communications

However, Annex II to Presidential Decree 10.474 establishing the ANPD sets forth that many more currently vacant positions within the ANPD will have to be filled before it can become fully functional. Until then, Brazilian businesses must continue to wait for guidance from the ANPD.

Patients blackmailed after data breach at Finnish private psychotherapy center

9. November 2020

An unknown party breached Vastaamo, a Finnish private psychotherapy center, and accessed its electronic patient record system, gathering thousands of confidential patient records. According to a message left on a Finnish web forum, up to 40 000 confidential records of psychotherapy patients were accessed. These include not only confidential information regarding therapy sessions but also personal information such as social security numbers, which in Finland can be used to take out loans or establish companies. Vastaamo notified the Finnish authorities on September 29th, while it notified the affected individuals by e-mail and letter only after October 21st.

Though the attack prompted an emergency meeting of the Finnish Cabinet, neither the Finnish authorities nor Vastaamo have so far released information regarding the nature of the breach.

The initial breach likely occurred in November 2018, and a second attack is believed to have taken place before March 2019. In September 2020, the hackers contacted Vastaamo, demanding a payment of 40 Bitcoin (approximately € 450 000). Vastaamo refused to pay and instead contacted the police and other Finnish authorities. On the instruction of the Finnish National Police, Vastaamo published information regarding the data breach only after some of the data had been posted on the Tor network on October 21st. Furthermore, the Board dismissed former CEO Ville Tapio, claiming he had concealed the breach.

Further, in late October, the hackers sent messages to patients and employees of Vastaamo, threatening to post their patient files on the internet and demanding payments in Bitcoin. The national police advised victims not to pay and instead asked them to save extortion e-mails or other evidence and file a police report. By October 30th, Finland’s national police had received up to 15 000 reports of offenses relating to this data breach.

The National Supervisory Authority for Welfare and Health started an investigation of Vastaamo, while the Social Insurance Institution of Finland stopped referrals to Vastaamo.

Ever since the beginning of the Covid-19 pandemic, the healthcare and public health sectors have been attacked more frequently, especially with ransomware. The US Cybersecurity and Infrastructure Security Agency (CISA), the FBI and the US Department of Health and Human Services have issued a joint advisory regarding the matter. In addition, according to IBM’s annual Cost of a Data Breach Report, the healthcare sector has the highest average breach cost, at USD 7.13 million per breach.

The CCPA is not enough: Californians will vote on the CPRA

28. October 2020

On 3 November 2020, the day of the US Presidential Election, Californian citizens will also be able to vote on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot. The CPRA shall expand Californian consumers’ privacy rights granted by the California Consumer Privacy Act of 2018 (“CCPA”), which only came into effect on 1 January 2020.

The NGO “Californians for Consumer Privacy”, led by privacy activist Alastair Mactaggart, initiated the upcoming state ballot on the CPRA. Mactaggart’s NGO had already qualified for a state ballot on the adoption of the CCPA by collecting over 629,000 signatures of California citizens in 2018. However, the NGO dropped that proposal in 2018 after California state legislators persuaded the initiators that they would pass the CCPA through the legislative process. But because several significant amendments to the original proposal were passed during the legislative process, the NGO created the new CPRA initiative in 2020. This time, the group submitted more than 900,000 signatures. The CPRA is supposed to expand on the provisions of the CCPA. If the CPRA is approved by California voters on November 3rd, it could not easily be amended; changes would require further direct voter action. Most provisions of the CPRA would become effective on 1 January 2023 and would only apply to information collected from 1 January 2022.

Some of the key provisions of the newly proposed CPRA seem to draw inspiration from the European General Data Protection Regulation (“GDPR”). They include the establishment of an enforcement agency (the “California Privacy Protection Agency”), explicit protection of consumers’ “Sensitive Personal Information” and the right to rectify inaccurate personal information. The CPRA would furthermore require businesses to comply with information obligations comparable to those set out in Art. 12-14 GDPR.

As the day of the state ballot is fast approaching, recent polls suggest that the CPRA will likely pass and complement the already existing CCPA, forming the US’ strictest privacy rules to date.

H&M receives record-breaking 35 Million Euro GDPR Fine in Germany

21. October 2020

At the beginning of October, the Hamburg Data Protection Commissioner (“HmbBfDI”) imposed a record-breaking GDPR fine of 35,258,707.95 Euro on the German branch of the Swedish clothing retail giant H&M. It is the highest fine for a GDPR violation that a German Data Protection Authority has ever issued.

Since 2014, the management of the H&M service centre in Nuremberg had extensively monitored the private lives of their employees in various ways. Following holidays and sick leaves of employees, team leaders would conduct so-called “Welcome Back Talks” in which they recorded employees’ holiday experiences, symptoms of illnesses and medical diagnoses. Some H&M supervisors built a broad database on their employees’ private lives, recording details on family issues and religious beliefs from one-on-one talks and even corridor conversations. The records were highly detailed, were updated over time and in some cases were shared with up to 50 other managers throughout the company. The H&M supervisors also used this Personal Data to create profiles of their employees and to base future employment decisions and measures on this information. The clandestine data collection only became known as a result of a configuration error in 2019, when the notes were accessible company-wide for a few hours.

After the discovery, the H&M executives presented the HmbBfDI with a comprehensive concept for improving Data Protection at their Nuremberg sub-branch. This included the appointment of a new Data Protection coordinator, monthly Data Protection status updates, more strongly communicated whistleblower protection and a consistent process for granting data subject rights. Furthermore, H&M apologised to their employees and paid the affected people considerable compensation.

With their secret monitoring system at the service centre in Nuremberg, H&M severely violated the GDPR principles of lawfulness, fairness, and transparency of processing pursuant to Art. 5 no. 1 lit. a) and Art. 6 GDPR because they did not have a legal basis for collecting these Personal Data from their employees. The HmbBfDI commented in his statement on the magnitude of the fine saying that “the size of the fine imposed is appropriate and suitable to deter companies from violating the privacy of their employees”.

First judicial application of Schrems II in France

20. October 2020

France’s highest administrative court (Conseil d’État) issued a summary judgment that rejected a request for the suspension of France’s centralized health data platform – Health Data Hub (HDH) – on October 13th, 2020. The Conseil d’État further recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub intended to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure. On April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The concern was that hosting the data with a company subject to U.S. law entails data protection risks due to potential surveillance under U.S. national surveillance laws, as highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments in the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could, in theory, be subject to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.

Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation in which an operator transfers personal data to the U.S. on its own initiative. However, according to the CNIL, the reasoning of the CJEU’s decision also requires examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or a request from U.S. intelligence services, a situation not clearly addressed in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be entirely ruled out that U.S. public authorities could request Microsoft and its Irish affiliate to provide access to some of the data held in the HDH. However, the summary proceedings judge did not read the CJEU’s Schrems II ruling as also requiring an examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates acting as data processors; EU law does not prohibit subcontracting U.S. companies to process personal data in the EU. In addition, the Conseil d’État considered the alleged violation of the GDPR in this case to be purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The Conseil d’État therefore concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.
