Tag: Data Processing

Thailand’s Personal Data Protection Act enters into force

29. June 2022

On June 1, 2022, Thailand’s Personal Data Protection Act (PDPA) entered into force, three years after its enactment in May 2019 and following repeated delays. Due to the COVID-19 pandemic, the Thai government had issued royal decrees extending the compliance deadline to June 1, 2022.

The PDPA is largely modeled on the EU General Data Protection Regulation (GDPR). In particular, it likewise requires data controllers and processors to have a valid legal basis for processing personal data (i.e., data that can identify living natural persons directly or indirectly). If such personal data is sensitive personal data (e.g. health data, biometric data, race, religion, sexual preference and criminal record), data controllers and processors must ensure that data subjects give explicit consent to any collection, use or disclosure of such data. Exemptions are granted for public interest, contractual obligations, vital interests and compliance with the law.

The PDPA also grants data subjects specific rights that closely mirror those under the GDPR: the right to be informed, to access, rectify and update data, to restrict and object to processing, and the rights to data erasure and portability.

One major difference from the GDPR is that, while breaches of PDPA obligations are subject to fines, certain data breaches involving sensitive personal data and unlawful disclosure also carry criminal penalties, including imprisonment of up to one year.

Just like the GDPR, the PDPA applies both to entities in Thailand and to entities abroad that process personal data in connection with the provision of products and/or services within Thai borders.

Just as with the GDPR, it will be important to observe how the PDPA evolves as it becomes more firmly embedded in Thai companies’ compliance practices.

Irish DPC fines Meta 17 Million Euros over 2018 data breaches

16. March 2022

On March 15th, 2022, the Irish Data Protection Commission (DPC) imposed a fine of 17 million euros on Meta Platforms over a series of twelve data breaches that occurred between June and December 2018.

The DPC inquiry that led to this decision examined the extent to which Meta Platforms complied with the requirements of Art. 5(1)(f), Art. 5(2), Art. 24(1) and Art. 32(1) GDPR in relation to the processing of personal data relevant to the twelve breach notifications.

As a result of this inquiry, the DPC found that Meta Platforms infringed Art. 5(2) and Art. 24(1) GDPR. In particular, the DPC assessed that Meta Platforms failed to have in place appropriate technical and organisational measures that would enable it to readily demonstrate the security measures it implemented in practice to protect the data of its European users in the context of those twelve data breaches.

The processing under examination constituted “cross-border” processing, and as such the DPC’s decision was subject to the co-decision-making process outlined in Art. 60 GDPR, meaning that all of the other European supervisory authorities were engaged in this decision as co-decision-makers. While two of the European supervisory authorities raised objections to the DPC’s draft decision, consensus was achieved through further engagement between the DPC and the supervisory authorities concerned.

“Accordingly, the DPC’s decision represents the collective views of both the DPC and its counterpart supervisory authorities throughout the EU,” the DPC stated in their press release.

A Meta spokesperson has commented on the decision, stating, “This fine is about record keeping practices from 2018 that we have since updated, not a failure to protect people’s information. We take our obligations under the GDPR seriously and will carefully consider this decision as our processes continue to evolve.”

ICO releases Guidance on Video Surveillance

7. March 2022

At the end of February 2022, the UK Information Commissioner’s Office (ICO) published guidance for organizations that capture CCTV footage, advising them on how to operate video surveillance systems that view or record individuals.

The recommendations focus on best practices for data processing activities related to “emerging capabilities that can assist human decision making, such as the use of Facial Recognition Technology and machine learning algorithms.” As per the Guidance, surveillance systems specifically include traditional CCTV, Automatic Number Plate Recognition, Body Worn Video, drones, Facial Recognition Technology, dashcams and smart doorbell cameras.

In its Guidance, the ICO offers checklists that controllers can use to monitor their use of video surveillance and keep track of their compliance with the applicable law. It further touches on the principles of data protection and how they specifically apply to video surveillance. In addition, it helps companies with the documentation of a Data Protection Impact Assessment (DPIA).

The Guidance gives in-depth advice on video surveillance in the workplace, as well as on whether video feeds should also record audio.

Overall, the Guidance aims to sensitize controllers to the various issues they face when using video surveillance, and gives them in-depth help on how to comply with data protection regulations in the UK.

Belgian DPA approves first EU Data Protection Code of Conduct for Cloud Service Providers

21. June 2021

On May 20th, 2021, the Belgian Data Protection Authority (Belgian DPA) announced that it had approved the EU Data Protection Code of Conduct for Cloud Service Providers (EU Cloud CoC). The EU Cloud CoC is the first transnational EU code of conduct since the EU General Data Protection Regulation became applicable in May 2018.

The EU Cloud CoC represents a sufficient guarantee pursuant to Articles 28(1) and 28(5) GDPR, as well as Recital 81 GDPR, which makes adherence to the code by cloud service providers a valid way to secure potential data transfers.

In particular, the EU Cloud CoC aims to establish good data protection practices for cloud service providers, giving data subjects more security in the handling of their personal data. In addition, the Belgian DPA accredited SCOPE Europe as the monitoring body for the code of conduct, which will ensure that code members comply with the requirements set out by the code.

It further provides cloud service providers with practical guidance and a set of specific binding requirements (such as requirements regarding the use of sub-processors, audits, compliance with data subject rights requests and transparency), as well as objectives to help cloud service providers demonstrate compliance with Article 28 GDPR.

In the press release, the Chairman of the Belgian DPA stated that “the approval of the EU Cloud CoC was achieved through close collaboration within the European Data Protection Board and is an important step towards a harmonised interpretation and application of the GDPR in a crucial sector for the digital economy”.

Data Protection and Clinical Trials – Part 1

10. February 2021

In the two and a half years since the General Data Protection Regulation (GDPR) came into effect, many organizations have become used to the new rules and standards it established. However, many questions remain unanswered in certain industries, one of them being life sciences, and more specifically clinical trials.

The GDPR and the guidance of the European Data Protection Board (EDPB) leave considerable room for interpretation, as they cannot fully specify the reach of, and a definitive approach to, data protection in every industry.

This short series aims to give an overview of the handling of clinical trials from a data protection point of view, as well as answers to important questions that come up in the industry’s day-to-day business.

In general, clinical trials constitute a processing activity within the meaning of Art. 4 (2) GDPR; therefore, the basic data protection obligations apply to clinical trials, such as:

  • Following the basic GDPR principles laid out in Art. 5 GDPR, namely lawfulness, fairness and transparency, purpose limitation, data minimisation, data accuracy, storage limitation, integrity, confidentiality and accountability
  • Fulfilling the information obligations of the controller according to Art. 13, 14 GDPR
  • Honouring data subjects’ rights according to Art. 15 to Art. 21 GDPR
  • Keeping a record of processing activities according to Art. 30 para. 1, 2 GDPR
  • Implementing security measures in compliance with Art. 32 GDPR
  • Notifying the supervisory authority as well as the data subjects of data breaches according to Art. 33, 34 GDPR
  • Carrying out a Data Protection Impact Assessment prior to the start of the clinical trial, according to Art. 35 GDPR

However, the first and most important question regarding the processing of personal data for clinical trials is:

Which legal basis is applicable to the processing?

The EDPB addressed this issue in its Opinion on the interplay between the Clinical Trials Regulation and the GDPR, where it first differentiated between the processing of personal data under a clinical trial protocol as the primary purpose of the processing and, on the other hand, clinical trials as a secondary purpose alongside, for example, patient care.

According to the EDPB’s opinion, the applicable legal basis is to be determined by the controller on a case-by-case basis. However, the EDPB does give its own general assessment of the legal bases applicable to the different scenarios it has identified:

  • Primary use of the processed personal data for clinical trials
    a. Processing activities related to reliability and safety
    -> Legal obligations of the controller, Art. 6 para. 1 (c) GDPR in conjunction with Art. 9 para. 2 (i) GDPR
    b. Processing activities purely related to research activities
    -> Task carried out in the public interest, Art. 6 para. 1 (e) GDPR in conjunction with Art. 9 para. 2 (i) or (j) GDPR
    -> Legitimate interest of the controller, Art. 6 para. 1 (f) GDPR in conjunction with Art. 9 para. 2 (j) GDPR
    -> In specific circumstances, explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR
  • Secondary use of the clinical trial data outside the clinical trial protocol for scientific purposes
    -> Explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR

While this guidance on assessing the legal basis for the processing is helpful, the EDPB does not address further open issues regarding clinical trials in its opinion. Nonetheless, there are further subjects that cause confusion.

Some of these subjects will be covered in the next part of this series, where we will take a closer look at clinical trial sponsorship from outside the EEA, as well as the questions revolving around controllership roles in clinical trials.

China issues new Draft for Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime, which seems to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection framework will consist of China’s Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation contains provisions addressing issues raised by new technologies and applications, all in around 70 articles. The fines set out in the draft for non-compliance are quite high and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, obligations of data processors, the authority in charge of personal information as well as legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalent terms under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China, and to report relevant information about that domestic organization or representative to Chinese regulators.

In comparison to the GDPR, the draft PIPL extends the term “sensitive data” to also include nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may cause damage to personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information, which shall be possible if the transfer is certified by recognized institutions, or if the data processor executes a cross-border transfer agreement with the recipient located outside of China, ensuring that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator, or the volume of data processed exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.

It should further be kept in mind that the draft PIPL enlarges the range of penalties beyond those provided in the Cybersecurity Law, which will put much higher liability pressure on controllers operating in China.

The period established to receive open comments on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet certain when the draft legislation will come into full effect.

Appeal against record fine for GDPR violation in Poland dismissed

22. October 2020

On 10th September 2019, the Polish Data Protection Commissioner imposed a record fine of more than PLN 2.8 million (the equivalent of around EUR 660,000) on the company Morele.net for failing to implement appropriate technical and organisational measures and for being unable to verify the prior consents to data processing. The Krakow-based company runs various online shops and stores customer data in a central database. According to the Personal Data Protection Office (UODO), 2.2 million customers were affected.

The starting point was, in particular, two incidents at the end of 2018, in which unauthorised persons gained access to the company’s customer database and the personal data it contained. The company notified the UODO of the data breach, and the authority accused it in particular of violating the confidentiality principle (Articles 5 (1) lit. f, 24 (1), 25 (1), 32 (1) lit. b, d, (2) GDPR) by failing to use sufficient technical and organisational measures, such as two-factor authentication, to safeguard its customers’ data. As the UODO argued, the selection of an authentication mechanism should always be preceded by an adequate risk analysis and a corresponding determination of protection requirements, which the company did not adequately carry out. It should, however, have been sufficiently aware of the phishing risks, as the Computer Emergency Response Team (CERT Polska) had already pointed them out.
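
To illustrate the kind of measure the UODO found lacking: the following is a minimal sketch of time-based one-time passwords (TOTP, RFC 6238), the mechanism behind most authenticator-app second factors, using only the Python standard library. The secret, time step and digit count are illustrative choices, not anything prescribed in the decision.

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32, step=30, digits=6):
        """Compute an RFC 6238 time-based one-time password."""
        key = base64.b32decode(secret_b32)
        counter = int(time.time()) // step            # current 30-second window
        msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    def verify_second_factor(secret_b32, submitted):
        """Constant-time comparison of the code submitted at login."""
        return hmac.compare_digest(totp(secret_b32), submitted)

    secret = base64.b32encode(b"illustrative-key").decode()  # hypothetical shared secret
    print(totp(secret))  # six digits, changes every 30 seconds

A real deployment would additionally accept adjacent time windows to tolerate clock drift and would rate-limit verification attempts.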

In addition, the UODO accused the company of violating the lawfulness, fairness, transparency and accountability principles (Articles 5 (1) lit. a, (2), 6 (1), 7 (1) GDPR) by being unable to prove that (where necessary) the personal data from instalment applications had been processed on the basis of the data subjects’ consent. Furthermore, after a risk analysis, the company deleted the corresponding data from the database in December 2018, but according to the UODO, the deletion was not sufficiently documented.

Many aspects played a decisive role in assessing the fine, above all the extent of the violation (2.2 million customers) and the fact that the company processes personal data professionally in the course of its business activities and therefore has to apply a higher level of security. However, mitigating circumstances were also taken into account, such as the good cooperation with the supervisory authority, the absence of previous ascertainable violations of the GDPR and the lack of identifiable financial advantages for the company.

On 3rd September 2020, the Provincial Administrative Court (WSA) in Warsaw issued a judgment on Morele.net’s appeal against the decision. The WSA dismissed the appeal and held that the fine imposed on the company was justified. Furthermore, the WSA stated that the UODO had correctly assessed the facts of the case and that the fine, while high, was within the limits of the law and justified by the circumstances. It is expected that the company will lodge a complaint with the Supreme Administrative Court of Poland.

First judicial application of Schrems II in France

20. October 2020

On October 13th, 2020, France’s highest administrative court (Conseil d’État) issued a summary judgment rejecting a request to suspend France’s centralized health data platform, the Health Data Hub (HDH). The Conseil d’État nevertheless recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub intended to consolidate all health data of people receiving medical care in France, in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure, and on April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The applicants’ concern was that hosting the data with a company subject to U.S. law entails data protection risks, given the potential surveillance under U.S. national security laws highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments in the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could, in theory, be subject to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be totally discounted that U.S. public authorities could request access from Microsoft and its Irish affiliate to some of the data held in the HDH. However, the summary proceedings judge did not read the CJEU’s Schrems II ruling as also requiring an examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates acting as data processors; EU law does not prohibit subcontracting the processing of personal data in the EU to U.S. companies. In addition, the Conseil d’État considered the alleged GDPR violation in this case to be purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH, and is then further encrypted by Microsoft.
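
For readers unfamiliar with the two safeguards the judge referred to, the following is a minimal, purely illustrative Python sketch of pseudonymization followed by encryption, using the third-party cryptography package. The keys, field names and record layout are assumptions made for illustration, not a description of the HDH’s actual pipeline.

    import hashlib
    import hmac
    import json

    from cryptography.fernet import Fernet  # pip install cryptography

    PSEUDONYM_KEY = b"illustrative-secret"   # hypothetical; held separately from the data
    ENCRYPTION_KEY = Fernet.generate_key()   # hypothetical; managed by the hosting provider

    def pseudonymize(patient_id):
        """Replace a direct identifier with a deterministic keyed hash;
        re-identification requires the key and the original identifier."""
        return hmac.new(PSEUDONYM_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

    def encrypt_record(record):
        """Encrypt the pseudonymized record before it is stored."""
        return Fernet(ENCRYPTION_KEY).encrypt(json.dumps(record).encode())

    record = {"patient": pseudonymize("FR-1234567"), "diagnosis": "J06.9"}
    token = encrypt_record(record)                 # ciphertext as stored by the host
    print(Fernet(ENCRYPTION_KEY).decrypt(token))   # readable only with the key

Note that pseudonymized data remains personal data under the GDPR, which is why the CNIL and the court still examined the access risk despite these measures.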

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The Conseil d’État concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

Facebook collects location data despite deactivation

19. December 2019

Facebook has admitted, in response to an inquiry from several US senators, that it continues to collect location data even if the user has previously deactivated this feature.

Even when the feature is deactivated, location data is still collected, for example through IP address mapping or user activity. This includes, for example, users tagging their own location at a certain restaurant or venue, but also the case of being linked by friends to a photo that contains a location tag.
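
As a rough illustration of what “IP address mapping” means here: an IP address can be matched against a table of network prefixes to yield an approximate area. The sketch below uses Python’s standard ipaddress module; the prefix-to-area table is entirely hypothetical, since real services rely on commercial geolocation databases.

    import ipaddress

    # Hypothetical mapping of network prefixes to postcode-level areas.
    PREFIX_TO_AREA = {
        ipaddress.ip_network("203.0.113.0/24"): "SW1A",    # documentation range, for illustration
        ipaddress.ip_network("198.51.100.0/24"): "10115",
    }

    def coarse_location(ip):
        """Return a postcode-level area for an IP address, or None if unknown."""
        addr = ipaddress.ip_address(ip)
        for network, area in PREFIX_TO_AREA.items():
            if addr in network:
                return area        # an approximate area, not an exact position
        return None

    print(coarse_location("203.0.113.7"))  # -> "SW1A"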

In the letter, which Senator Josh Hawley published on Twitter, Facebook states that it has only the best intentions in collecting the data. According to the statement, this is the only way, for example, to place personalized ads or to inform users when someone logs in to their account from a completely different location than usual.

While Facebook states that the location data derived, e.g., from the IP address does not indicate an exact location but only, for example, the postcode, the bottom line is that there is no way for users to opt out of the collection of location data.


China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. China thus enacted the first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions provide a higher standard of consent than the Cybersecurity Law of China. To obtain the consent of a guardian, a network operator has to provide the possibility of refusal and expressly inform the guardian of the following:

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain the consent of the person responsible for the protection of children’s personal data or of an authorised administrator.

If children’s personal data is processed by a third-party processor, the network operator is obliged to carry out a security assessment of the data processor commissioned to process the children’s personal data. It also has to conclude an entrustment agreement with the data processor. The data processor is obliged to support the network operator in fulfilling a guardian’s request to delete a child’s data after termination of the service. Subletting or subcontracting by the data processor is prohibited.

If personal data of children is transferred to a third party, the network operator shall carry out a security assessment of the commissioned person or commission a third party to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children’s personal data under certain circumstances. In any case, they have the right to demand the correction of children’s personal data collected, stored, used or disclosed by a network operator. In addition, legal guardians have the right to withdraw their consent in its entirety.

In the event of actual or potential data breaches, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the violation has or may have serious consequences, the network operator must immediately report it to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is impracticable to notify each data subject individually, the network operator shall take appropriate and effective measures to make the notification public. However, the rules do not contain a precise definition of “serious consequences”.

In the event that the data breach is caused or observed by a data processor, the data processor is obliged to inform the network operator in good time.
