Tag: Data Processing

Belgian DPA approves first EU Data Protection Code of Conduct for Cloud Service Providers

21. June 2021

On May 20th, 2021, the Belgian Data Protection Authority (Belgian DPA) announced that it had approved the EU Data Protection Code of Conduct for Cloud Service Providers (EU Cloud CoC). The EU Cloud CoC is the first transnational EU code of conduct since the entry into force of the EU General Data Protection Regulation in May 2018.

The EU Cloud CoC represents a sufficient guarantee pursuant to Article 28 (1) and 28 (5) of the GDPR, as well as Recital 81 of the GDPR, making adherence to the code a valid way for cloud service providers to secure potential data transfers.

In particular, the EU Cloud CoC aims to establish good data protection practices for cloud service providers, giving data subjects more security in terms of the handling of their personal data by cloud service providers. In addition, the Belgian DPA accredited SCOPE Europe as the monitoring body for the code of conduct, which will ensure that code members comply with the requirements set out by the code.

It further provides cloud service providers with practical guidance and a set of specific binding requirements (such as requirements regarding the use of sub-processors, audits, compliance with data subject rights requests, transparency, etc.), as well as objectives to help cloud service providers demonstrate compliance with Article 28 of the GDPR.

In the press release, the Chairman of the Belgian DPA stated that “the approval of the EU Cloud CoC was achieved through close collaboration within the European Data Protection Board and is an important step towards a harmonised interpretation and application of the GDPR in a crucial sector for the digital economy”.

Data Protection and Clinical Trials – Part 1

10. February 2021

In the two and a half years since the General Data Protection Regulation (GDPR) came into effect, many organizations have gotten used to the new laws and standards it has established. However, there are still many unanswered questions in certain industries, one of them being life sciences, and more specifically clinical trials.

The GDPR and the guidance of the European Data Protection Board (EDPB) leave room for speculation, as they cannot fully specify the reach of, and the definitive approach to, data protection in many industries.

This short series aims to give an overview of the handling of clinical trials from a data protection point of view, as well as answers to important questions that come up in day-to-day business in the industry.

In general, clinical trials constitute a processing activity according to Art. 4 (2) GDPR; therefore, the basic data protection obligations apply to clinical trials, such as:

  • Following the basic GDPR principles laid out in Art. 5 GDPR, namely lawfulness, fairness and transparency, purpose limitation, data minimisation, data accuracy, storage limitation, integrity, confidentiality and accountability
  • Information obligations of the controller according to Art. 13, 14 GDPR
  • Data subjects’ rights according to Art. 15 to Art. 21 GDPR
  • Obligation to have a record of processing activities according to Art. 30 para. 1, 2 GDPR
  • Security Measures need to be in place, in compliance with Art. 32 GDPR
  • Data Breach Notifications to the supervisory authority as well as the data subjects according to Art. 33, 34 GDPR
  • A Data Protection Impact Assessment has to be carried out prior to the start of the clinical trial, according to Art. 35 GDPR

However, the first and most important question regarding the processing of personal data for clinical trials is:

Which legal basis is applicable to the processing?

The EDPB addressed this issue in its Opinion on the Interplay between Clinical Trials and the GDPR and, in a first instance, differentiated between the processing of personal data within the clinical trial protocol as the primary purpose of the processing on the one hand, and clinical trials as a secondary purpose next to, for example, patient care on the other.

According to the EDPB’s opinion, the applicable legal basis is to be determined by the controller on a case-by-case basis. However, the EDPB does give its own general assessment of the legal bases applicable to the different scenarios that, in its eyes, have crystallised:

  • Primary use of the processed personal data for clinical trials
    a. Processing activities related to reliability and safety
    -> Legal obligations of the controller, Art. 6 para. 1 (c) GDPR in conjunction with Art. 9 para. 2 (i) GDPR
    b. Processing activities purely related to research activities
    -> Task carried out in the public interest, Art. 6 para. 1 (e) GDPR in conjunction with Art. 9 para. 2 (i) or (j) GDPR
    -> Legitimate interest of the controller, Art. 6 para. 1 (f) GDPR in conjunction with Art. 9 para. 2 (j) GDPR
    -> In specific circumstances, explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR
  • Secondary use of the clinical trial data outside the clinical trial protocol for scientific purposes
    -> Explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR

While the guidance in assessing the legal basis for the processing is helpful, the EDPB does not address any further open issues regarding clinical trials in its opinion. Nonetheless, there are further subjects that cause confusion.

Some of these subjects will be addressed in the next part of this series, where we will take a closer look at clinical trial sponsorship from outside the EEA as well as the questions revolving around controllership roles in clinical trials.

China issued new Draft for Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime which seems to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection regulations will consist of China’s Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The new draft legislation contains provisions relating to issues presented by new technology and applications, all of this in around 70 articles. The fines set out in the draft for non-compliance are quite high and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, obligations of data processors, the authority in charge of personal information as well as legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalent terms under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China, and to report relevant information about their domestic organization or representative to Chinese regulators.

In comparison to the GDPR, the draft PIPL extends the term “sensitive data” to also include nationality, financial accounts, as well as personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may cause damage to personal reputation or seriously endanger personal and property safety, which opens the potential for further interpretation.

The draft legislation also regulates cross-border transfers of personal information, which shall be possible if it is certified by recognized institutions, or the data processor executes a cross-border transfer agreement with the recipient located outside of China, to ensure that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator or the volume of data processed by the data processor exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.

It should further be kept in mind that the draft PIPL enlarges the range of penalties beyond those provided in the Cybersecurity Law, which will put much higher liability pressure on controllers operating in China.

Currently, the period established to receive open comments on the draft legislation has ended, but the next steps have not yet been reported, and it is not yet certain when the draft legislation will come into full effect.

Appeal against record fine for GDPR violation in Poland dismissed

22. October 2020

On 10th September 2019, the Polish Data Protection Commissioner imposed a record fine of more than PLN 2,8 million (the equivalent of € 660.000) on the company Morele.net for failing to implement appropriate technical and organisational measures as well as for the lack of verifiability of prior consents to data processing. The Krakow-based company runs various online shops and stores customer data in a central database. According to the Personal Data Protection Office (UODO), 2,2 million customers were affected.

The starting point was, in particular, two incidents at the end of 2018, when unauthorised persons gained access to the company’s customer database and the personal data contained in it. The company notified the data breach to the UODO, which accused it in particular of violating the confidentiality principle (Articles 5 (1) lit. f, 24 (1), 25 (1), 32 (1) lit. b, d, (2) GDPR) by failing to use sufficient technical and organisational measures to safeguard the data of its customers, such as two-factor authentication. As claimed by the UODO, the selection of the authentication mechanism should always be preceded by an adequate risk analysis with a corresponding determination of protection requirements. The company did not adequately comply with this, although it should have been sufficiently aware of the phishing risks, as the Computer Emergency Response Team (CERT Polska) had already pointed them out.

In addition, the UODO accused the company of violation of the lawfulness, fairness, transparency and accountability principles (Articles 5 (1) lit. a, (2), 6 (1), 7 (1) GDPR) by not being able to prove that (where necessary) the personal data from installment applications had been processed on the basis of consents of data subjects. Furthermore, after a risk analysis, the company deleted the corresponding data from the database in December 2018, but according to the UODO, the deletion was not sufficiently documented.

When assessing the fine, several aspects played a decisive role, most of all the extent of the violation (2,2 million customers) and the fact that the company processes personal data professionally in the course of its business activities and therefore has to apply a higher level of security. However, mitigating circumstances were also taken into account, such as the good cooperation with the supervisory authority, no previously ascertainable violations of the GDPR and no identifiable financial advantages for the company.

On 3rd September 2020, the Provincial Administrative Court (WSA) in Warsaw issued a judgment on Morele.net’s appeal against the decision. The WSA dismissed the appeal and considered that the decision on the fine imposed on the company was justified. Furthermore, the WSA stated that the UODO had correctly assessed the facts in the case concerned and considered that the fine imposed was high but within the limits of the law and justified by circumstances. It is expected that the company will lodge a complaint with the Supreme Administrative Court of Poland.

First judicial application of Schrems II in France

20. October 2020

France’s highest administrative court (Conseil d’État) issued a summary judgment that rejected a request for the suspension of France’s centralized health data platform – Health Data Hub (HDH) – on October 13th, 2020. The Conseil d’État further recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub intended to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure. On April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The worry was that the hosting of data by a company subject to U.S. law entails data protection risks due to potential surveillance under U.S. national surveillance laws, as highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments on the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could be subject, in theory, to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be totally discounted that U.S. public authorities could request Microsoft and its Irish affiliate to hand over some of the data held in the HDH. However, the summary proceedings judge did not consider the CJEU’s ruling in the Schrems II case to also require examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates as data processors. EU law does not prohibit subcontracting the processing of personal data in the EU to U.S. companies. In addition, the Conseil d’État considered the violation of the GDPR in this case to be purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH, and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The conclusion reached by the Conseil d’État was that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

Facebook collects location data despite deactivation

19. December 2019

Facebook has admitted, in response to a request from several US senators, that it continuously collects location data, even if the user has previously deactivated this feature.

Even when this feature is deactivated, location data is collected, for example, through IP address mapping or user activity. The latter includes, for example, users tagging themselves at a certain restaurant or a particular location, but also being linked by friends to a photo that contains a location tag.

In the letter that Senator Josh Hawley published on Twitter, Facebook states that it has only the best intentions in collecting the data. According to the statement, this is the only way, for example, to place personalized ads or to inform a user when someone logs in to their account from a completely different location than usual.

While Facebook states that the location data – based, for example, on the IP address – does not indicate an exact location but only, for instance, the postcode, this also means that there is no way for users to opt out of the collection of location data.


China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. China thus enacted the first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions provide a higher standard of consent than the Cybersecurity Law of China. To obtain the consent of a guardian, a network operator has to provide the possibility of refusal and expressly inform the guardian of the following:

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain consent of the person responsible for the protection of children’s personal data or an authorised administrator.

If children’s personal data are processed by a third-party processor, the network operator is obliged to carry out a security assessment of the data processor commissioned to process the children’s personal data. It also has to conclude an entrustment agreement with the data processor. The data processor is obliged to support the network operator in fulfilling a guardian’s request to delete the data of a child after termination of the service. Subletting or subcontracting by the data processor is prohibited.

If personal data of children is transferred to a third party, the network operator shall carry out a security assessment of the commissioned person or commission a third party to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children’s personal data under certain circumstances. In any case, they have the right to demand the correction of personal data of children if they are collected, stored, used or disclosed by a network operator. In addition, the legal guardians have the right to withdraw their consent in its entirety.

In the event of actual or potential data breaches, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the violation has or may have serious consequences, the network operator must immediately report the violation to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is challenging to send the notification to each data subject, the network operator shall take appropriate and effective measures to make the notification public. However, the rules do not contain a precise definition of “serious consequences”.

In the event that the data breach is caused or observed by a data processor, the data processor is obliged to inform the network operator in good time.

High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was used at certain events and in public places where the commission of crimes was considered likely. Up to 50 faces per second can be detected. The detected faces are then compared, by means of biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
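To make the described workflow more concrete, the following is a minimal, purely illustrative Python sketch of such a match-then-delete loop. It is not the actual AFR Locate software; all names (detect/embed steps, WatchlistEntry), the similarity measure and the threshold are assumptions made for illustration.

```python
# Illustrative sketch of a watchlist match-then-delete loop (hypothetical,
# not the real AFR Locate implementation).
from dataclasses import dataclass

import numpy as np


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: np.ndarray  # biometric template from a police database


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_frame(face_embeddings: list[np.ndarray],
                  watchlist: list[WatchlistEntry],
                  threshold: float = 0.8) -> list[str]:
    """Compare each detected face against the watchlist.

    Returns the IDs of matched persons; embeddings that do not match
    anyone simply go out of scope and are not stored, mirroring the
    immediate automatic deletion described above.
    """
    matches = []
    if not watchlist:
        return matches
    for emb in face_embeddings:
        best = max(watchlist, key=lambda e: cosine_similarity(emb, e.embedding))
        if cosine_similarity(emb, best.embedding) >= threshold:
            matches.append(best.person_id)
        # no match: nothing is retained for this face
    return matches
```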

An individual initiated judicial review proceedings after he had not been identified as a wanted person but was likely to have been captured by AFR Locate. He considered this to be unlawful, in particular due to a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles. In particular, that approach would violate Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also pointed out that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court stated that the use of AFR affected a person’s rights under Article 8 of the ECHR and that this type of biometric data has a private character in itself. Despite the fact that the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is temporarily stored.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system is proportionate and that the technology can be used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was only used for a limited period and for a specific purpose, and its use was announced before deployment (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considers that the images of individuals captured constitute personal data, even if they do not correspond to the lists of persons sought, because the technology has singled them out and distinguished them from others. Nevertheless, the Court held that there was no violation of data protection principles, for the same reasons on which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of legality and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing is absolutely necessary was fulfilled, as was the requirement that the processing is necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that a suitable policy document is available to regulate the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it refused to give a judgment as to whether the document was adequate and stated that it would leave that judgment to the Information Commissioner’s Office (“ICO”), as the ICO would publish more detailed guidelines.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of DPA 2018.

The ICO stated that it would take into account the High Court ruling when finalising its recommendations and guidelines for the use of live face recognition systems.

Google strives to reconcile advertising and privacy

27. August 2019

While other browser developers are critical of tracking, Google wants to introduce new standards to continue enabling personalized advertising. With the implementation of the “Privacy Sandbox” and the introduction of a new identity management system, the developer of the Chrome browser wants to bring browsers to a uniform level in the processing of user data and protect the privacy of users more effectively.

The suggestions are the first steps of the privacy initiative announced by Google in May. Google has published five ideas. For example, browsers are to manage a “Privacy Budget” that gives websites limited access to user data so that users can be sorted into an advertising target group without being personally identified. Google also plans to set up central identity service providers that offer limited access to user data via an application programming interface (API) and inform users about the information they have passed on.
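As a rough illustration of the “Privacy Budget” idea, the toy sketch below grants each website origin a fixed budget of “identifying bits” and refuses requests that would exceed it. The class, method names and bit values are invented for illustration only and do not reflect Chrome’s actual proposal or API.

```python
# Toy illustration of a per-origin privacy budget (hypothetical API and values).
class PrivacyBudget:
    def __init__(self, max_bits: float = 10.0):
        self.max_bits = max_bits
        self.spent: dict[str, float] = {}  # identifying bits already revealed per origin

    def request(self, origin: str, surface: str, cost_bits: float) -> bool:
        """Grant access to a data surface only if the origin stays within its budget."""
        used = self.spent.get(origin, 0.0)
        if used + cost_bits > self.max_bits:
            return False  # budget exhausted: site only gets generic, non-identifying data
        self.spent[origin] = used + cost_bits
        return True


budget = PrivacyBudget()
print(budget.request("https://ads.example", "user-agent", 2.0))   # True, within budget
print(budget.request("https://ads.example", "screen-size", 9.0))  # False, would exceed budget
```

The design intent sketched here is that a site can still learn enough to place a user in a broad advertising cohort, but never enough cumulative detail to single the user out.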

Measures like Apple’s Intelligent Tracking Prevention are not in Google’s interest, as Google generates much of its revenue from personalized advertising. In a blog post, Google also said that blocking cookies promotes non-transparent techniques such as fingerprinting. Moreover, without the ability to display personalized advertising, the future of publishers, whose costs are covered by advertising, would be jeopardized. Recent studies have shown that the financing of publishers decreases by an average of 52% if advertising loses relevance due to the removal of cookies.

Based on these ideas, the discussion among developers about the future of web browsers and how to deal with users’ privacy should now begin. Google’s long-term goal is a standardization process to which all major browser developers should adhere. So far, Google has had only limited success with similar initiatives.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority Agencia Española de Protección de Datos (AEPD) has imposed a fine of 250.000 EUR on the organisers of the two Spanish professional football leagues for data protection infringements.

The organisers, the Liga Nacional de Fútbol Profesional (LFP), operate an app called “La Liga”, which aims to uncover unlicensed screenings of games broadcast on pay-TV. For this purpose, the app recorded samples of the ambient sound during game times to detect any live game transmissions and combined this with location data, as privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 paragraph 1 GDPR. Users must approve the use explicitly, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD is of the opinion that La Liga has to warn users of each instance of data processing via the microphone again. In the resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember what they agreed to each time they used the La Liga application and what they did not agree to.

Furthermore, AEPD is of the opinion that La Liga has violated Art. 7 paragraph 3 GDPR, according to which the user has the possibility to revoke his consent to the use of his personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary efforts to understand how the technology works. It explains that the technology used is designed to produce only one particular acoustic fingerprint. This fingerprint contains only 0.75% of the information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that is not reversible to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of June 30.
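As a rough illustration of the kind of irreversible acoustic fingerprinting La Liga describes, the sketch below keeps only a handful of spectral peak indices per audio window and then hashes them, so the original sound cannot be reconstructed from the result. The window size, peak counts and hashing scheme are assumptions made for illustration and are not La Liga’s actual implementation.

```python
# Illustrative sketch of an irreversible acoustic fingerprint (hypothetical
# parameters, not La Liga's real algorithm).
import hashlib

import numpy as np


def acoustic_fingerprint(samples: np.ndarray, window: int = 4096,
                         peaks_per_window: int = 4) -> str:
    """Reduce an audio snippet to a short, non-reversible alphanumeric code."""
    peak_indices = []
    for start in range(0, len(samples) - window, window):
        spectrum = np.abs(np.fft.rfft(samples[start:start + window]))
        # keep only the indices of the strongest frequency bins; everything
        # else (the vast majority of the signal) is discarded
        top = np.argsort(spectrum)[-peaks_per_window:]
        peak_indices.extend(int(i) for i in sorted(top))
    # hashing the peak pattern yields a code that cannot be turned back
    # into speech or conversations
    return hashlib.sha256(str(peak_indices).encode("utf-8")).hexdigest()


# Example: fingerprint one second of synthetic audio at 16 kHz
audio = np.random.default_rng(0).standard_normal(16_000)
print(acoustic_fingerprint(audio))
```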
