Category: General

ICO fines companies a total of £330,000 for sending more than 2.7 million spam text messages

16. March 2021

The Information Commissioner’s Office (ICO) has sanctioned two firms for sending unlawful, nuisance text messages to their customers. The ICO took notice after receiving numerous complaints from the people affected; complaints about one of the companies alone totalled 10,000.

The two companies had sent the unwanted text messages during the coronavirus pandemic and have now been fined a total of £330,000 by the UK data protection authority.

Leads Works Ltd.

One of the companies, West Sussex-based Leads Works Ltd, was fined £250,000 for sending more than 2.6 million text messages to its customers without obtaining valid consent. Between 26 May and 26 June, the ICO received more than 10,000 complaints about the messages.

In addition, Leads Works Ltd has received an enforcement notice from the ICO requiring it to stop sending unlawful direct marketing messages.

Valca Vehicle Ltd

Valca Vehicle Ltd, a company based in Manchester, has been fined £80,000. Between June and July 2020, the company sent over 95,000 text messages, likewise without the valid consent of those affected. It has been ordered to stop sending further text messages without consent, and has also been criticised for using the pandemic as an excuse for its actions.

Category: General

Google plans to stop the use of cookie tracking

15. March 2021

Google has announced that it will stop supporting third-party cookies in its Chrome browser and that it will not build similar technologies to track individuals as they browse the web.

Cookies are small pieces of data used on almost every website. They are automatically stored by the browser when a user visits a website and are sent back to the website operator with every subsequent request. From this data, companies can create user profiles and personalize advertising based on the data collected. Originally, cookies were intended to give web browsers a “memory”: with cookies, online shops can save shopping carts and users can stay logged in to websites.
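
To make the mechanism concrete, here is a minimal sketch in Python of how a cookie travels between server and browser. It uses only the standard library, and the cookie name and value are purely illustrative.

    from http.cookies import SimpleCookie

    # Server response: asks the browser to store a small piece of data.
    cookie = SimpleCookie()
    cookie["cart_id"] = "abc123"          # e.g. identifies a saved shopping cart
    cookie["cart_id"]["max-age"] = 3600   # keep it for one hour
    print(cookie.output())                # Set-Cookie: cart_id=abc123; Max-Age=3600

    # On every later request to the same site, the browser sends it back.
    # This round trip is what gives the website its "memory" of the user:
    request_header = "Cookie: cart_id=abc123"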

In a blog post published on March 3rd, 2021, David Temkin, Director of Product Management, Ads Privacy and Trust at Google, announced that the next Google Chrome update in April will allow cookie tracking to be turned off completely. In Chrome, only so-called “first-party cookies” set by the respective website operator will remain permitted. The decision will have lasting consequences, as Google Chrome has been the most widely used browser since 2012. The move comes after Google’s competitors Apple and Mozilla announced similar mechanisms for their Safari and Firefox browsers (please see our blog post). Temkin writes:

Keeping the internet open and accessible for everyone requires all of us to do more to protect privacy — and that means an end to not only third-party cookies, but also any technology used for tracking individual people as they browse the web.

Since personalized, data-driven advertising is Google’s core business, the company will stop neither the data collection nor the personalization of advertising. Instead of individual profiles, Google will form cohorts of people with similar interests, to which advertising will be tailored. These cohorts are said to be broad enough to preserve the anonymity of individual users. This concept is called “Federated Learning of Cohorts” (FLoC). FLoC-based advertising in Google Ads is said to start in the second quarter of 2021.

Data will then be collected by the browser and stored locally rather than in cookies. Every URL visited and every piece of content accessed can then feed into Google’s targeting algorithm. Algorithms on the end device calculate, for example, hash values from the browser history, which allow the user to be assigned to such a cohort. Google sends a selection of ads to the browser, which picks out the ads matching the cohort and shows them to the user.
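
The following heavily simplified sketch illustrates the general idea of such on-device cohort assignment. It is not Google’s actual implementation: Google’s public FLoC proposal describes a SimHash-style locality-sensitive hash over the browsing history, and the domain names, the 16-bit cohort space and the unweighted hashing below are assumptions made purely for illustration.

    import hashlib

    COHORT_BITS = 16  # assumed size of the cohort ID space

    def cohort_id(visited_domains: set[str]) -> int:
        """Derive a coarse, SimHash-style cohort ID locally from visited domains."""
        counts = [0] * COHORT_BITS
        for domain in visited_domains:
            h = int.from_bytes(hashlib.sha256(domain.encode()).digest()[:2], "big")
            for bit in range(COHORT_BITS):
                counts[bit] += 1 if (h >> bit) & 1 else -1
        return sum(1 << bit for bit in range(COHORT_BITS) if counts[bit] > 0)

    # Only this coarse ID would ever leave the device, not the raw history.
    print(cohort_id({"news.example", "sports.example", "cooking.example"}))

Because similar histories flip only a few bits of such a hash, users with similar interests land in the same or nearby cohorts, while the raw browsing history stays on the device.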

While third-party cookies are gradually becoming obsolete, Google is replacing them with a system it can control completely on its own. This will make things more difficult for competitors such as Facebook Ads in the future, as they will have to rely primarily on first-party data and on data obtained from cookies in smaller browsers.

French government seeks to disregard CJEU ruling on retention of surveillance data

9. March 2021

On March 3rd, POLITICO reported that the French government is seeking to bypass the Court of Justice of the European Union’s (CJEU) ruling limiting member states’ surveillance of phone and internet data, which held that governments may only retain data in bulk when facing a “serious threat to national security”.

According to POLITICO, the French government has asked the country’s highest administrative court, the Council of State, not to follow the CJEU’s ruling in the matter.

In October last year, the CJEU ruled that several national data retention rules were not compliant with EU law, including retention periods set forth by the French government in matters of national security.

The French case in question pits the government against the digital rights NGOs La Quadrature du Net and Privacy International. Following the CJEU’s ruling, the matter is now in the hands of the Council of State in France, which will have to decide on it.

A hearing date has not yet been set, but POLITICO’s sources state that the French government is trying to bypass the CJEU’s ruling by arguing that it goes against the country’s “constitutional identity”. This argument, first used back in 2006, is seldom invoked, but it can be relied on to avoid applying EU law at the national level.

In addition, the French government accuses the CJEU of having ruled outside its competence, as matters of national security remain solely a national competence.

The French government declined to comment on the ongoing proceedings, but it has a history of refusing to implement EU court rulings in national law.

Data protection authorities around the world are taking action against the facial recognition software Clearview AI

25. February 2021

The business model of the US company Clearview AI is coming under increasing pressure worldwide. The company collected billions of facial photos from publicly available sources, especially from social networks such as Facebook, Instagram, YouTube and similar services. Data subjects were not informed of the collection and use of their facial photos. Using the photos, Clearview AI created a comprehensive database and used it to develop an automated facial recognition system. The system’s customers are primarily law enforcement agencies and other prosecuting authorities in the US, but companies can also make use of it. In total, Clearview AI has around 2,000 customers worldwide and a database of around 3 billion images.

After a comprehensive investigation by the New York Times drew attention to the company in January 2020, data protection authorities in various countries are now also voicing opposition to its business practices.

The Hamburg Data Protection Commissioner had already issued an order against Clearview AI in January 2021. According to the order, the company was to delete the biometric data of a Hamburg citizen who had complained to the authority about the storage. The decision was based on the grounds that there was no legal basis for processing such sensitive data and that the company was creating profiles by collecting photos over an extended period of time.

Now, several Canadian data protection authorities have also deemed Clearview AI’s actions illegal. In a statement, the Canadian Privacy Commissioner describes the activities as mass surveillance and an affront to the privacy rights of data subjects. The Canadian federal authority published a final report on its investigation into the Clearview AI case, finding that the company had violated several Canadian federal privacy laws.

Notably, the Canadian authorities consider the data collection unlawful even if Clearview AI were to obtain consent from the data subjects, arguing that the very purpose of the data processing is unlawful. They demand that Clearview AI cease its service in Canada and delete the data already collected from Canadian citizens.

The pressure on Clearview AI is also growing because the companies from whose platforms the data was collected are opposing the practice. In addition, the association “noyb”, founded by data protection activist Max Schrems, is looking into Clearview AI, and various European data protection authorities have announced that they will take action against the facial recognition system.

European Commission publishes draft UK adequacy decisions

On February 19th, 2021, the European Commission (EC) published drafts of two adequacy decisions for the transfer of personal data to the United Kingdom (UK), one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive. If approved, the decisions would confer adequacy status on the UK and ensure that personal data can continue to flow freely from the EU to the UK. In the EC’s announcement launching the process to adopt the draft adequacy decisions, Didier Reynders, Commissioner for Justice, is quoted:

We have thoroughly checked the privacy system that applies in the UK after it has left the EU. Now European Data Protection Authorities will thoroughly examine the draft texts. EU citizens’ fundamental right to data protection must never be compromised when personal data travel across the Channel. The adequacy decisions, once adopted, would ensure just that.

Under the GDPR, this adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, meaning a level of protection for personal data substantially equivalent to that within the EU. Once it has been determined that a non-EU country provides an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. In the UK, the processing of personal data is governed by the “UK GDPR” and the Data Protection Act 2018, which are based on the EU GDPR. The UK is, and has committed to remain, a party to the European Convention on Human Rights and to “Convention 108” of the Council of Europe. “Convention 108” is a binding treaty under international law that protects individuals against abuses in the electronic processing of personal data and, in particular, provides for restrictions on cross-border data flows where data is to be transferred to states without comparable protection.

The GDPR adequacy decision draft addresses several areas of concern. One of these is the power of intelligence services in the UK. In this respect, the draft focuses on legal bases, restrictions and safeguards for the collection of information for national security purposes. It also details the oversight structure over the intelligence services and the remedies available to those affected. Another aspect discussed is the limitation of data subjects’ rights in the context of UK immigration law. The EC concludes that interference with individuals’ fundamental rights is limited to what is strictly necessary to achieve a legitimate purpose and that there is effective legal protection against such interference. As the UK GDPR is based on the GDPR, UK privacy law should currently provide an adequate level of protection for data subjects; the main risk for EU data subjects therefore lies not in the current state of these laws but in possible future changes to them. For this reason, the EU Commission has built a fixed period of validity into the draft adequacy decision. If adopted, the decision would be valid for a period of four years, and the adequacy finding could be extended for a further four years if the level of protection in the UK remains adequate. However, this extension would not be automatic, but subject to a thorough review. This draft marks the first time that the EU has imposed a time limit on an adequacy decision. Other adequacy decisions are subject to monitoring and regular review but are not time-limited by default.

The UK government welcomed the EC’s draft in a statement, while also calling on the EU to “swiftly complete” the process for adopting and formalizing the adequacy decisions, as the “bridging mechanism” will only remain in force until June 30th. Under the EU-UK Trade and Cooperation Agreement, the EU and UK agreed on a transition period of up to six months from January 1st, 2021, during which the UK is treated as an adequate jurisdiction (please see our blog post). The draft adequacy decisions address the flow of data from the EU to the UK. The flow of data from the UK to the EU is governed by UK legislation that has applied since January 1st, 2021. The UK has decided that the EU ensures an adequate level of protection and that data can therefore flow freely from the UK to the EU.

Next, the non-binding opinion of the European Data Protection Board is sought (Art. 70 GDPR). After hearing the opinion of the European Data Protection Board, the representatives of the member states must then confirm the draft in the so-called comitology procedure. This procedure is used when the EC is given the power to implement legal acts that lay down conditions for the uniform application of a law. A series of procedures ensure that EU countries have a say in the implementing act. After the comitology procedure, the EC is free to adopt the drafts.

CNIL imposes fine of 225,000 euros

A data controller and its processor have been fined a total of €225,000 by the French data protection authority for breaching security requirements related to credential stuffing.

On January 27, 2021, the French data protection authority (CNIL) announced that it had fined a data controller €150,000 and its processor €75,000. Both had failed to take adequate security measures to protect the controller’s customers’ personal data against credential stuffing attacks on the controller’s website.

The names of the sanctioned companies are not known, as CNIL chose not to make its decisions public.

Following several reports of data breaches on the data controller’s website between June 2018 and January 2020, CNIL investigated the data processing activities of the company concerned. The processing practices of the service provider involved (the data processor) were also examined. The affected website is used by several million customers to make online purchases.

Vulnerability to credential stuffing attacks

Investigations revealed that the affected website was the victim of numerous credential stuffing attacks. In this kind of attack, the attacker uses user credentials found on the dark web, exploiting the fact that many people use the same password and username for different web services. With the help of automated tools, the attacker then launches login requests against several websites at the same time. In the worst case, the attacker can view the account information and misuse the respective data for his own purposes. In this case, data such as first and last names, email addresses, dates of birth, customer card numbers and credit balances as well as details of orders placed on the website were affected. Between March 2018 and February 2019, around 40,000 customer accounts were allegedly made accessible to unauthorized third parties.

The investigation revealed, however, that the data controller and the service provider were also at fault: both had failed to take precautions, through appropriate technical and organizational measures, to prevent or mitigate such attacks. According to the authority, both companies waited too long to implement measures to effectively combat the repeated credential stuffing attacks. Although the companies had decided to detect and block the attacks by developing a specific tool, this solution was not developed until a year after the initial attacks. The companies should have used this year to take interim measures, for example limiting the number of requests allowed per IP address on the website or using a CAPTCHA when users first try to log in to their accounts.
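
For illustration, a minimal sketch of the kind of per-IP limiting mentioned above might look as follows. The thresholds and the in-memory store are assumptions made for the example; a production system would typically keep the counters in a shared store such as Redis and combine rate limiting with a CAPTCHA.

    import time
    from collections import defaultdict, deque

    MAX_ATTEMPTS = 10    # assumed: login attempts allowed per IP...
    WINDOW_SECONDS = 60  # ...within a sliding one-minute window

    _attempts: dict[str, deque] = defaultdict(deque)

    def allow_login_attempt(ip: str) -> bool:
        """Return False (e.g. to trigger a CAPTCHA) once an IP exceeds the limit."""
        now = time.monotonic()
        attempts = _attempts[ip]
        while attempts and now - attempts[0] > WINDOW_SECONDS:
            attempts.popleft()           # drop attempts outside the window
        if len(attempts) >= MAX_ATTEMPTS:
            return False                 # likely automated credential stuffing
        attempts.append(now)
        return True

    # Example: the 11th rapid attempt from the same address is refused.
    for attempt in range(11):
        print(attempt + 1, allow_login_attempt("203.0.113.7"))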

Controllers are required by Article 32 GDPR to protect the security of customers’ personal data as best they can. Merely holding out the prospect of security measures is therefore not enough: if an attack on user data takes place, remedial measures must be taken as soon as possible.

Sanctions

CNIL decided to impose a fine on both the data controller and the data processor. It emphasized that the data controller must implement appropriate security measures and provide documented instructions to its data processor. At the same time, the data processor itself must work out the most appropriate technical and organizational solutions to ensure data security and propose these solutions to the data controller.

Category: General

EU Member States agree on EU Council’s Draft for the ePrivacy Regulation

22. February 2021

On February 10, 2021, representatives of the EU Member States reached an agreement on a negotiating mandate for the draft ePrivacy Regulation.

The text of the Council of the European Union (the Council), approved by the EU Member States, was prepared under Portugal’s presidency. It will form the basis of the Council’s negotiations with the European Parliament in the trilogue process on the final terms of the ePrivacy Regulation, which will replace the current ePrivacy Directive.

The key elements of the new draft, as highlighted by the Council, encompass the following points:

  • Coverage of both electronic communications content and communications metadata – the text sticks with the general principle that electronic communications data is confidential, meaning that any interference by anyone other than the parties involved in the communication is prohibited, except where permitted by the ePrivacy Regulation
  • Coverage of machine-to-machine data transmitted via a public network, as this is deemed necessary to protect privacy rights in the context of Internet of Things applications
  • The scope of application includes users located in the EU, regardless of whether the processing of their data takes place outside the EU or the service provider is located in a non-EU jurisdiction
  • Regarding the use of cookies and other technologies involving the storage of information on or collection of information from a user’s device, the Council’s text provides that the use of these technologies will only be legitimate if the user has consented or for specific purposes laid down in the ePrivacy Regulation; in any case, users should have a genuine choice

In addition to broadening the scope of the current directive, the proposed regulation would most likely affect an advertising technology market that is already undergoing significant changes. In parallel, the European Commission is also working on the proposed Digital Services Act, Data Governance Act and Digital Markets Act.

However, the draft is expected to provoke argument as it moves into the next stage. Compared with previous drafts, there are differences that will need to be reconciled. The two sides are divided in particular over the permissions for accessing the content and metadata of electronic communications: where the European Parliament is pushing primarily for consent, the Council has added further permissions and exceptions to the consent rule. The provisions on data retention are another point where intense argument is predicted.

Criticism also comes from some countries, for example from the German Federal Commissioner for Data Protection, Ulrich Kelber. In a press release he attacked the new draft as “a severe blow to data protection”, saying he is concerned by the “interference with the fundamental rights of European citizens”.

Although the new draft brings the ePrivacy Regulation back to life, there is still a long road ahead before agreement on its text is reached. Intense discussion will certainly continue in the upcoming trilogue process, and the outcome will be closely watched by many.

GDPR fines and data breach reports increased in 2020

12. February 2021

In 2020, a total of €158.5 million in fines was imposed, research by DLA Piper shows. This represents a 39% increase over the preceding 20 months during which the GDPR had been in force since May 25th, 2018.

Since that date, a total of €272.5 million in fines has been imposed across Europe under the General Data Protection Regulation (“GDPR”). Italian authorities imposed a total of €69.3 million, German authorities €69.1 million, and French authorities €54.4 million. This calculation does not include two fines against Google LLC and Google Ireland Limited totalling €100 million (€60 million + €40 million) and a fine of €35 million against Amazon Europe Core, issued by the French data protection authority “Commission nationale de l’informatique et des libertés” (“CNIL”) on December 10th, 2020 (please see our respective blog post), as proceedings on these fines are pending before the Conseil d’Etat.

A total of 281,000 data breaches were reported during this period, although the countries that imposed the highest fines were not necessarily those where the most data breaches were reported. While Germany and the UK rank high on both lists (77,747 data breaches were reported in Germany, 30,536 in the UK, and 66,527 in the Netherlands), only 5,389 data breaches were reported in France and just 3,460 in Italy.

Although the biggest fine imposed to date remains the €50 million issued by CNIL against Google LLC in January 2019 (please see our respective blog post), a number of high-profile fines were imposed in 2020, with six of the top ten all-time fines being issued in 2020 and one in 2021.

1. H&M Hennes & Mauritz Online Shop A.B. & Co. KG was fined € 35 million for monitoring several hundred employees (please see our respective blog post).

2. TIM (Italian telecommunications operator) was fined € 27 million for making unwanted promotion calls.

3. British Airways was fined € 22 million for failing to protect personal and financial data of more than 400,000 customers (please see our blog post).

4. Marriott International was fined € 20 million for a data breach affecting up to 383 million customers (please see our respective blog post).

5. Wind Tre S.p.A. was fined € 17 million for unsolicited marketing communications.

A comparison of the highest fines shows that most of them were imposed due to an insufficient legal basis for the processing of personal data (Art. 5 & 6 GDPR) or due to insufficient technical and organizational measures to ensure an appropriate level of security (Art. 32 GDPR).

While the European authorities have shown their willingness to enforce the GDPR, they have also shown leniency due to the impact the COVID-19 pandemic has had on businesses. At least partly because of the pandemic, penalties planned by the UK ICO were softened: a planned fine of €205 million for British Airways was reduced to €22 million, and a planned fine of €110 million for Marriott International was reduced to €20 million. GDPR investigations are also often lengthy and contentious, so the increase in fines may partly reflect more investigations having had sufficient time to conclude; the disputes over the above fines for British Airways and Marriott International, for example, had already started in 2019.

Not only the fines but also the number of data breach notifications increased in 2020: 121,165 data breaches were reported, an average of 331 notifications per day, compared with 278 per day in 2019. In terms of reported data breaches per 100,000 inhabitants, there is a stark contrast between Northern and Southern European countries. In 2020, Denmark recorded 155.6 data breaches per 100,000 inhabitants, the Netherlands 150 and Ireland 127.8, while Greece, Italy and Croatia reported the lowest numbers per inhabitant.

The trend shows that the GDPR is being taken more and more seriously by companies and authorities, and this trend is likely to continue as authorities become more confident in enforcing the GDPR. Fines are only likely to increase, especially as none of the fines imposed so far even come close to the maximum possible amount of 4% of a company’s global annual turnover. The figures also show that while the laws are in principle the same and are supposed to be applied the same in all EEA countries, nations have different approaches to interpreting and implementing them. In the near future, we can expect to see the first penalties resulting from the GDPR restrictions on data transfers to third countries, especially in the aftermath of the Schrems II ruling on data transfers to the USA.

Data Protection and Clinical Trials – Part 1

10. February 2021

In the two and a half years since the General Data Protection Regulation (GDPR) came into effect, many organizations have gotten used to the new laws and standards it has established. However, there are still many unanswered questions in certain industries, one of those industries being life sciences, and more specifically clinical trials.

The GDPR and the guidance of the European Data Protection Board (EDPB) leave considerable room for speculation, because they cannot fully specify the scope of, and definitive approach to, data protection in every industry.

This short series aims to give an overview of the handling of clinical trials from a data protection point of view, as well as answers to important questions that come up in the industry’s day-to-day business.

In general, clinical trials constitute a processing activity within the meaning of Art. 4 (2) GDPR, so the basic data protection obligations apply to them, including:

  • Following the basic GDPR principles laid out in Art. 5 GDPR, namely lawfulness, fairness and transparency, purpose limitation, data minimisation, data accuracy, storage limitation, integrity, confidentiality and accountability
  • Information obligations of the controller according to Art. 13, 14 GDPR
  • Data subjects’ rights according to Art. 15 to Art. 21 GDPR
  • The obligation to keep a record of processing activities according to Art. 30 para. 1, 2 GDPR
  • Security measures in compliance with Art. 32 GDPR
  • Data breach notifications to the supervisory authority as well as to the data subjects according to Art. 33, 34 GDPR
  • A Data Protection Impact Assessment, to be carried out prior to the start of the clinical trial, according to Art. 35 GDPR

However, the first and most important question regarding the processing of personal data for clinical trials is:

Which legal basis is applicable to the processing?

The EDPB addressed this issue in its Opinion on the interplay between the Clinical Trials Regulation and the GDPR and, in the first instance, differentiated between the processing of personal data for clinical trial protocols as the primary purpose of the processing and, on the other hand, clinical trials as a secondary purpose alongside, for example, patient care.

According to the EDPB’s opinion, the applicable legal basis is to be determined by the controller on a case-by-case basis. However, the EDPB does give its own general assessment of the legal bases applicable to the different scenarios that have crystallized in its view:

  • Primary use of the processed personal data for clinical trials
    a. Processing activities related to reliability and safety
    -> Legal obligations of the controller, Art. 6 para. 1 (c) GDPR in conjunction with Art. 9 para. 2 (i) GDPR
    b. Processing activities purely related to research activities
    -> Task carried out in the public interest, Art. 6 para. 1 (e) GDPR in conjunction with Art. 9 para. 2 (i) or (j) GDPR
    -> Legitimate interest of the controller, Art. 6 para. 1 (f) GDPR in conjunction with Art. 9 para. 2 (j) GDPR
    -> In specific circumstances, explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR
  • Secondary use of the clinical trial data outside the clinical trial protocol for scientific purposes
    -> Explicit consent of the data subject, Art. 6 para. 1 (a) GDPR and Art. 9 para. 2 (a) GDPR

While this guidance on assessing the legal basis for the processing is helpful, the EDPB does not address any further open issues regarding clinical trials in its opinion. Nonetheless, there are further subjects that cause confusion.

Some of these subjects will be treated in the next part of this series, where we will take a closer look at clinical trial sponsorship from outside the EEA as well as the questions revolving around controllership roles in clinical trials.

Giant database leak exposes data on 220 million Brazilians

28. January 2021

On January 19th, 2021, the dfndr lab, PSafe’s cybersecurity laboratory, reported a leak of a Brazilian database that may have exposed the CPF (individual taxpayer registry) numbers and other confidential information of millions of people.

According to the cybersecurity experts, who use artificial intelligence techniques to identify malicious links and fake news, the leaked data they have found contains detailed information on 104 million vehicles and about 40 million companies. Overall, the leak poses a risk to close to 220 million Brazilians.

The personal data contained in the affected database includes names, birthdates and individual taxpayer registry identification numbers, along with detailed vehicle information including license plate numbers, municipality, colour, make, model, year of manufacture, engine capacity and even the type of fuel used. The breach affects almost all Brazilian citizens, as well as authorities.

In a press release, the director of the dfndr lab, Emilio Simoni, explained that the biggest risk following this data leak is that this data will be used in phishing scams, in which a person is induced to provide more personal information on a fake page.

In its statement, PSafe discloses neither the name of the company involved nor how the information was leaked, whether through a security flaw, a hacker attack or simply easy access. Regardless of the cause of the leak, however, the new Brazilian General Data Protection Law (LGPD) provides for fines of up to R$50 million for an infraction of this type.
