Category: GDPR

EU and South Korea complete adequacy talks

6. April 2021

On March 30th, 2021, EU Justice Commissioner Didier Reynders and the Chairperson of the Personal Information Protection Commission of the Republic of Korea, Yoon Jong In, announced the successful conclusion of adequacy talks between the EU and the Republic of Korea (“South Korea”). These adequacy discussions began in 2017, and there was a high level of convergence between the EU and the Republic of Korea on data protection issues from the outset, which has since been enhanced by additional safeguards that further strengthen the level of protection in South Korea. Recently, amendments to South Korea’s Personal Information Protection Act (“PIPA”) took effect, strengthening the investigative and enforcement powers of South Korea’s data protection authority, the Personal Information Protection Commission (“PIPC”).

This adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, i.e. a level of protection for personal data that is essentially equivalent to the level of protection within the EU. Once it has been determined that a non-EU country provides an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. South Korea will be the 13th country to which personal data may be transferred on the basis of an adequacy decision. An adequacy decision covering both commercial providers and the public sector will enable free and secure data flows between the EU and the Republic of Korea and will complement the EU-Republic of Korea Free Trade Agreement.

Before data can flow freely, however, the EU Commission must first complete the procedure for adopting its adequacy finding. In this procedure, the European Data Protection Board will issue an opinion, and a committee composed of representatives of the EU member states must give its approval. The EU Commission may then adopt the adequacy decision.

EDPB released new guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVAs) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, like Siri on Apple devices or Google Assistant on Android devices, but can also be found in separate terminal devices, like Alexa on the Amazon Echo. With smart homes trending, VVAs are finding their way into many households.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and businesses are increasingly using VVAs and questions of data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. It therefore published guidelines on Virtual Voice Assistants earlier this month.

In its guidelines, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid out by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests, for example, that controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands.
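
To make the second suggestion concrete, here is a minimal, purely hypothetical sketch of how a voice command could be routed to a data subject request workflow. The phrases, function names and request types are illustrative assumptions, not taken from the EDPB guidelines:

    # Hypothetical sketch: mapping recognized voice commands to GDPR
    # data subject request workflows. All phrases and names are assumed.
    DSR_INTENTS = {
        "delete my data": "erasure request (Art. 17 GDPR)",
        "what data do you have about me": "access request (Art. 15 GDPR)",
        "stop using my recordings": "objection (Art. 21 GDPR)",
    }

    def handle_utterance(utterance: str) -> str:
        """Return the data subject request triggered by a voice command."""
        text = utterance.lower()
        for phrase, request_type in DSR_INTENTS.items():
            if phrase in text:
                # A real VVA would open a ticket here and confirm the
                # request to the user via a voice-based notification.
                return request_type
        return "no data subject request recognized"

    print(handle_utterance("Assistant, please delete my data."))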

Moreover, the EDPB states that, in its opinion, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidelines also give further advice on complying with general data protection principles and are open for public consultation until 23 April 2021.

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data are considered a special category of personal data under Art. 9 of the GDPR and are therefore given special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in software used by test centres, through which more than 136,000 COVID-19 test results concerning more than 80,000 data subjects were apparently accessible unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” by the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. In total, more than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach. These include public facilities in Munich, Berlin and Mannheim, as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view other people’s test results, one only needed to create an account for a COVID-19 test. The URL for the test result contained the sequential number of the test. By simply counting this number up or down, the “test certificates” of other people became freely accessible. In addition to the test result, a test certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned.
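
This class of weakness is commonly known as an insecure direct object reference (IDOR): a guessable identifier in the URL stands in for real access control. A minimal sketch of the flaw and a standard mitigation, using hypothetical URLs rather than the actual SafePlay endpoints, might look like this:

    # Sketch of the reported flaw: sequential test numbers in the URL make
    # every result enumerable. The URLs below are hypothetical examples.
    import secrets

    def result_url_insecure(test_number: int) -> str:
        # Changing 12345 to 12344 or 12346 reveals someone else's result.
        return f"https://testcentre.example/results/{test_number}"

    def result_url_mitigated() -> str:
        # Mitigation: address results by a long random token that cannot be
        # counted up or down; authorization must still be checked server-side.
        token = secrets.token_urlsafe(32)
        return f"https://testcentre.example/results/{token}"

    print(result_url_insecure(12345))
    print(result_url_mitigated())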

It remains unresolved whether the vulnerabilities were exploited prior to their discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. However, IT experts and privacy-focused NGOs commented that Medicus AI had acted irresponsibly and grossly negligently with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.

French Government seeks to disregard CJEU ruling on retention of surveillance data

9. March 2021

On March 3rd, POLITICO reported that the French government is seeking to bypass the Court of Justice of the European Union’s (CJEU) ruling limiting member states’ surveillance of phone and internet data, which holds that governments may only retain mass amounts of data when facing a “serious threat to national security”.

According to POLITICO, the French government has asked the country’s highest administrative court, the Council of State, not to follow the CJEU’s ruling in the matter.

In October of last year, the CJEU ruled that several national data retention rules were not compliant with EU law. This ruling covered retention periods set forth by the French government in matters of national security.

The French case in question pits the government against the digital rights NGOs La Quadrature du Net and Privacy International. Following the CJEU’s ruling, the matter is now in the hands of the Council of State in France, which will have to decide on it.

A hearing date has not yet been set; however, POLITICO’s sources state that the French government is trying to bypass the CJEU’s ruling by arguing that the ruling goes against the country’s “constitutional identity”. This argument, first used back in 2006, is seldom invoked, but can be relied on to avoid applying EU law at the national level.

In addition, the French government accuses the CJEU of having ruled outside its competence, as matters of national security remain solely a national competence.

The French government declined to comment on the ongoing proceedings, but it has a history of refusing to implement EU court rulings at the national level.

AEPD issues its highest fine yet for GDPR violations

5. March 2021

The Spanish Data Protection Authority, the Agencia Española de Protección de Datos (AEPD), imposed a fine of EUR 6,000,000 on CaixaBank, Spain’s leading retail bank, for unlawfully processing customers’ personal data and not providing sufficient information regarding the processing. It is the largest financial penalty ever issued by the AEPD under the GDPR, surpassing the EUR 5,000,000 fine imposed on BBVA in December 2020 for information and consent failures.

In the opinion of the AEPD, CaixaBank violated Art. 6 GDPR in several respects. The bank had not provided sufficient justification of the legal basis for its processing activities, in particular for those based on the company’s legitimate interest. Furthermore, deficiencies had been identified in the processes for obtaining customers’ consent to the processing of their personal data. The bank had also failed to comply with the requirements established for obtaining valid consent as a specific, unequivocal and informed expression of intention. Moreover, the AEPD considered the transfer of personal data to companies within the CaixaBank Group an unauthorized disclosure. Pursuant to Art. 83 (5) lit. a GDPR, an administrative fine of EUR 4,000,000 was issued.

Additionally, the AEPD found that CaixaBank violated Art. 13, 14 GDPR. The bank had not complied with its information obligations: the information regarding the categories of personal data concerned had not been sufficient, and the information concerning the purposes of and legal basis for the processing had been missing entirely. What is more, the information provided in different documents and channels had not been consistent. The diverging information concerned data subjects’ rights, the possibility of lodging a complaint with the AEPD, the existence of a data protection officer and their contact details, as well as data retention periods. The AEPD also criticized the use of imprecise terminology in the privacy policy. Pursuant to Art. 83 (5) lit. b GDPR, a fine of EUR 2,000,000 was imposed.

In conclusion, the AEPD ordered CaixaBank to bring its data processing operations into compliance with the aforementioned legal requirements within six months.

Data protection authorities around the world are taking action against the facial recognition software Clearview AI

25. February 2021

The business model of the US company Clearview AI is coming under increasing pressure worldwide. The company has collected billions of facial photos from publicly available sources, especially from social networks such as Facebook, Instagram, YouTube and similar services. Data subjects were not informed of the collection and use of their facial photos. Using the photos, Clearview AI created a comprehensive database and used it to develop an automated facial recognition system. The system’s customers are, in particular, law enforcement agencies and prosecutors in the US, but companies can also make use of it. In total, Clearview AI has around 2,000 customers worldwide and a database of around 3 billion images.

After a comprehensive investigation by the New York Times drew attention to the company in January 2020, data protection authorities in various countries are now also voicing opposition to its business practices.

The Hamburg Data Protection Commissioner had already issued an order against Clearview AI in January 2021, requiring the company to delete the biometric data of a Hamburg citizen who had complained to the authority about the storage. The decision reasoned that there was no legal basis for processing the sensitive data and that the company was creating profiles by collecting photos over an extended period of time.

Now, several Canadian data protection authorities have also deemed Clearview AI’s actions illegal. In a statement, the Canadian Privacy Commissioner describes the activities as mass surveillance and an affront to the privacy rights of data subjects. The Canadian federal authority published a final report on its investigation into Clearview AI, finding that the company had violated Canadian federal and provincial privacy laws.

Interestingly, the Canadian authorities consider the data collection unlawful even if Clearview AI were to obtain consent from the data subjects: they argue that the very purpose of the data processing is unlawful. They demand that Clearview AI cease its service in Canada and delete the data already collected from Canadian citizens.

The pressure on Clearview AI is also growing because the companies from whose platforms the data was collected are opposing the practice as well. In addition, the association “noyb”, founded by data protection activist Max Schrems, is dealing with Clearview AI, and various European data protection authorities have announced that they will take action against the facial recognition system.

European Commission publishes draft UK adequacy decisions

On February 19th, 2021, the European Commission (EC) published the drafts of two adequacy decisions for the transfer of personal data to the United Kingdom (UK), one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive. If approved, the decisions would confer adequacy status on the UK and ensure that personal data from the EU can continue to flow freely to the UK. In the EC’s announcement launching the process to adopt the newly drafted adequacy decisions, Didier Reynders, Commissioner for Justice, is quoted as saying:

We have thoroughly checked the privacy system that applies in the UK after it has left the EU. Now European Data Protection Authorities will thoroughly examine the draft texts. EU citizens’ fundamental right to data protection must never be compromised when personal data travel across the Channel. The adequacy decisions, once adopted, would ensure just that.

This adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, i.e. a level of protection for personal data that is essentially equivalent to the level of protection within the EU. Once it has been determined that a non-EU country provides an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. In the UK, the processing of personal data is governed by the “UK GDPR” and the Data Protection Act 2018, both of which are based on the EU GDPR. The UK is, and has committed to remain, a party to the European Convention on Human Rights and to “Convention 108” of the Council of Europe. “Convention 108” is a binding treaty under international law that protects individuals from abuses in the electronic processing of personal data and, in particular, provides for restrictions on cross-border data flows where data is to be transferred to states without comparable protection.

The GDPR adequacy decision draft addresses several areas of concern. One of these is the power of intelligence services in the UK. In this respect, the draft focuses on the legal bases, restrictions and safeguards for the collection of information for national security purposes. It also details the oversight structure over the intelligence services and the remedies available to those affected. Another aspect discussed is the limitation of data subjects’ rights in the context of UK immigration law. The EC concludes that interference with individuals’ fundamental rights is limited to what is strictly necessary to achieve a legitimate purpose and that there is effective legal protection against such interference. As the UK GDPR is based on the GDPR, UK privacy laws should currently provide an adequate level of protection for data subjects; the main risks for EU data subjects therefore lie not in the current status of these laws, but in possible future changes. For this reason, the EU Commission has built a fixed period of validity into the draft adequacy decision. If adopted, the decision would be valid for four years, and the adequacy finding could be extended for a further four years if the level of protection in the UK remains adequate. This extension would not be automatic, however, but subject to a thorough review. The draft marks the first time the EU has imposed a time limit on an adequacy decision; other adequacy decisions are subject to monitoring and regular review but are not time-limited by default.

The UK government welcomed the EC’s draft in a statement, while also calling on the EU to “swiftly complete” the process for adopting and formalizing the adequacy decisions, as the “bridging mechanism” will only remain in force until June 30th. Under the EU-UK Trade and Cooperation Agreement, the EU and the UK agreed on a transition period of up to six months from January 1st, 2021, during which the UK is treated as an adequate jurisdiction (please see our blog post). The draft adequacy decisions address the flow of data from the EU to the UK. The flow of data from the UK to the EU is governed by UK legislation that has applied since January 1st, 2021: the UK has decided that the EU ensures an adequate level of protection, so data can flow freely from the UK to the EU.

Next, the non-binding opinion of the European Data Protection Board will be sought (Art. 70 GDPR). After hearing this opinion, the representatives of the member states must confirm the draft in the so-called comitology procedure. This procedure is used when the EC is given the power to implement legal acts that lay down conditions for the uniform application of a law; a series of procedures ensures that EU countries have a say in the implementing act. After the comitology procedure, the EC is free to adopt the drafts.

Data Protection and Clinical Trials – Part 1

10. February 2021

In the two and a half years since the General Data Protection Regulation (GDPR) came into effect, many organizations have become accustomed to the new rules and standards it established. However, there are still many unanswered questions in certain industries, one of them being life sciences, and more specifically clinical trials.

The GDPR and the guidance of the European Data Protection Board (EDPB) leave considerable room for interpretation, as they cannot fully specify the reach of, and the definitive approach to, data protection in every industry.

This short series aims to give an overview of the handling of clinical trials from a data protection point of view, as well as answers to important questions that come up in the industry’s day-to-day business.

In general, clinical trials constitute a processing activity according to Art. 4 (2) GDPR; therefore, the basic data protection obligations apply to clinical trials, such as:

  • Following the basic GDPR principles laid out in Art. 5 GDPR, namely lawfulness, fairness and transparency, purpose limitation, data minimisation, data accuracy, storage limitation, integrity and confidentiality, and accountability
  • Information obligations of the controller according to Art. 13, 14 GDPR
  • Data subjects’ rights according to Art. 15 to Art. 21 GDPR
  • The obligation to keep a record of processing activities according to Art. 30 para. 1, 2 GDPR
  • Security measures in compliance with Art. 32 GDPR
  • Data breach notifications to the supervisory authority as well as the data subjects according to Art. 33, 34 GDPR
  • A Data Protection Impact Assessment prior to the start of the clinical trial, according to Art. 35 GDPR

However, the first and most important question regarding the processing of personal data for clinical trials is:

Which legal basis is applicable to the processing?

The EDPB addressed this issue in its Opinion on the Interplay between Clinical Trials and the GDPR, in which it first differentiated between the processing of personal data under the clinical trial protocol as the primary purpose of the processing and, on the other hand, clinical trials as a secondary purpose alongside, for example, patient care.

According to the EDPB’s opinion, the applicable legal basis is to be determined by the controller on a case-by-case basis. However, the EDPB does give its own general assessment of the legal bases applicable to the different scenarios it has identified (summarized in the sketch after this list):

  • Primary use of the processed personal data for clinical trials
    a. Processing activities related to reliability and safety
    -> Legal obligations of the controller, Art. 6 para. 1 (c) GDPR in conjunction with Art. 9 para. 2 (i) GDPR
    b. Processing activities purely related to research activities
    -> Task carried out in the public interest, Art. 6 para. 1 (e) GDPR in conjunction with Art. 9 para. 2 (i) or (j) GDPR
    -> Legitimate interest of the controller, Art. 6 para. 1 (f) GDPR in conjunction with Art. 9 para. 2 (j) GDPR
    -> In specific circumstances, explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR
  • Secondary use of the clinical trial data outside the clinical trial protocol for scientific purposes
    -> Explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR
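
For illustration only, the EDPB’s assessment above can be restated as a simple lookup table. This sketch merely encodes the list; it is no substitute for the case-by-case determination the EDPB requires:

    # Illustrative restatement of the EDPB's assessment above; the controller
    # must still determine the applicable legal basis case by case.
    EDPB_LEGAL_BASES = {
        ("primary use", "reliability and safety"): [
            "Art. 6 para. 1 (c) in conjunction with Art. 9 para. 2 (i) GDPR",
        ],
        ("primary use", "pure research activities"): [
            "Art. 6 para. 1 (e) in conjunction with Art. 9 para. 2 (i) or (j) GDPR",
            "Art. 6 para. 1 (f) in conjunction with Art. 9 para. 2 (j) GDPR",
            "Art. 6 para. 1 (a) and Art. 9 para. 2 (a) GDPR "
            "(explicit consent, in specific circumstances)",
        ],
        ("secondary use", "scientific purposes"): [
            "Art. 6 para. 1 (a) and Art. 9 para. 2 (a) GDPR (explicit consent)",
        ],
    }

    for (use, scenario), bases in EDPB_LEGAL_BASES.items():
        print(f"{use} / {scenario}:")
        for basis in bases:
            print(f"  - {basis}")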

While this guidance on assessing the legal basis for the processing is helpful, the EDPB does not address any further open issues regarding clinical trials in its opinion. Nonetheless, there are further subjects that cause confusion.

Some of these subjects will be addressed in the next part of this series, where we will take a closer look at clinical trial sponsorship from outside the EEA as well as the questions revolving around controllership roles in clinical trials.

University fined for failing to notify a data breach

4. February 2021

The President of the Personal Data Protection Office in Poland (UODO) imposed a fine of PLN 25,000 (approx. EUR 5,600) on the Medical University of Silesia. The university had suffered a data breach of which it should have notified the supervisory authority and the data subjects according to Articles 33, 34 GDPR, but failed to do so.

First indications of the data breach reached UODO in early June 2020. The breach was related to exams held at the end of May 2020 by videoconference on an e-learning platform; the exams were also recorded. Before each exam, students were identified by their IDs or student cards, so a large amount of their personal data was documented in the recordings. After the exam was completed, the recordings were made available on the platform. However, not only the examinees had access to the platform, but also a wider group of people, about which the students had not been informed. In addition, using a direct link, any external person could access the recordings and therefore the data of the examinees. Many students, fearing that the video would be deleted to cover up the incident, saved the file or took photographs of the computer screens to preserve evidence. Eventually, the chancellor (being the decision-making unit) took the position that an incident in which 200 people viewed the IDs of some 100-150 other people could not be considered a personal data breach.

The controller, which was requested by UODO to clarify the situation, did not dispute the data breach. Normally, the virtual room of the platform is available only to the exam group, and only those people have access to the recordings; the violation occurred because one of the employees did not close access to the virtual room after the exam. Nevertheless, the controller maintained that no notification was required, as in its opinion the risk to the rights and freedoms of the data subjects was low. Moreover, after the incident, the system was modified to prevent students from downloading the exam files. The controller also indicated that it had identified the individuals who had downloaded the files and informed them of their criminal liability for disseminating the data.
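
The root cause, access that had to be closed manually after each exam, points to a common design lesson: access rights should expire automatically rather than depend on a human remembering to revoke them. A minimal sketch of such a fail-safe design, with all names assumed, might look like this:

    # Sketch: the breach happened because room access had to be closed
    # manually. Time-bound access fails safe instead. All names are assumed.
    from datetime import datetime, timedelta, timezone

    class VirtualRoom:
        def __init__(self, exam_group: set, open_hours: int = 3):
            self.exam_group = exam_group
            self.expires_at = datetime.now(timezone.utc) + timedelta(hours=open_hours)

        def can_view_recording(self, user: str) -> bool:
            # Access requires both exam-group membership and an unexpired
            # window; nobody has to remember to "close the room" afterwards.
            in_group = user in self.exam_group
            still_open = datetime.now(timezone.utc) < self.expires_at
            return in_group and still_open

    room = VirtualRoom({"student_a", "student_b"})
    print(room.can_view_recording("student_a"))  # True while the window is open
    print(room.can_view_recording("outsider"))   # False: not in the exam group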

Despite several letters from UODO, the university still failed to report the data breach and notify the data subjects. Therefore, administrative proceedings were initiated. UODO found that the controller had failed to comply with its obligations to notify both the supervisory authority and the affected data subjects, and had improperly assessed the risk involved.

When imposing the fine, the President of UODO took into account the duration of the infringement (several months), the intentional action of the controller and its unsatisfactory cooperation with the supervisory authority. The fine is meant to serve not only a repressive but also a preventive function, as it shows that the obligations arising in connection with data breaches cannot be ignored, all the more so because an inappropriate approach to the obligations imposed by the GDPR may lead to negative consequences for those affected by the breaches.

EDPB published Guidelines on Data Breach Examples for Controllers

28. January 2021

On January 18th, 2021, the European Data Protection Board (EDPB) published its draft Guidelines 01/2021 on Examples regarding Data Breach Notification.

These Guidelines are intended to give further support to controllers, alongside the initial Guidelines on Personal Data Breach Notification under the GDPR adopted by the Article 29 Working Party in February 2018. The new Guidelines reflect the different types of situations that the supervisory authorities have come across in the two and a half years since the GDPR became applicable.

The EDPB’s intention is to assist controllers in deciding how to handle data breaches, namely by identifying the factors they must consider when conducting risk assessments to determine whether a breach must be reported to the relevant supervisory authority and whether the affected data subjects must be notified.

The draft Guidelines present examples of common data breach scenarios, including:

• ransomware attacks, where malicious code encrypts the personal data and the attacker subsequently asks the controller for a ransom in exchange for the decryption key
• data exfiltration attacks, which exploit vulnerabilities in online services offered by the controller and typically aim at copying, exfiltrating and abusing personal data for malicious purposes
• human errors resulting in data breaches, which are fairly common and can be both intentional and unintentional
• lost or stolen devices and paper documents
• “mispostal” scenarios, which arise from human error without malicious intent
• social engineering, such as identity theft and email exfiltration

The draft Guidelines further emphasize key elements of data breach management and response that organizations should consider, namely:

• proactively identifying system vulnerabilities in order to prevent data breaches from happening in the first place
• assessing whether a breach is likely to result in a risk to the rights and freedoms of the data subjects, the timing of this assessment, and the importance of controllers not delaying a notification because of unclear circumstances (the underlying decision rule is sketched after this list)
• implementing plans, procedures and guidelines for handling data breaches, with clear reporting lines and persons responsible for the recovery process
• organizing regular trainings for employees to raise awareness of data breach management and the latest developments in the area
• documenting breaches in each and every case, irrespective of the risk they pose
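
The notification logic these Guidelines build on follows directly from Art. 33 and 34 GDPR and can be sketched as a simple decision rule. The hard part, of course, is the risk assessment itself, which this sketch deliberately leaves to the controller:

    # Sketch of the Art. 33/34 GDPR notification decision. The risk level is
    # the output of the controller's own assessment; this encodes only the rule.
    def breach_obligations(risk: str) -> list:
        """risk: 'unlikely', 'risk' or 'high risk' to rights and freedoms."""
        obligations = ["document the breach internally (Art. 33 (5) GDPR)"]
        if risk in ("risk", "high risk"):
            obligations.append(
                "notify the supervisory authority without undue delay, "
                "where feasible within 72 hours (Art. 33 (1) GDPR)"
            )
        if risk == "high risk":
            obligations.append(
                "communicate the breach to the affected data subjects "
                "without undue delay (Art. 34 (1) GDPR)"
            )
        return obligations

    for level in ("unlikely", "risk", "high risk"):
        print(level, "->", breach_obligations(level))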

The Guidelines will be open for public consultation until March 2nd, 2021, during which the EDPB will gather feedback on the draft.
