Category: GDPR

The Commission’s Proposal for the European Health Data Space raises data protection concerns

21. July 2022

On May 3, 2022, the European Commission (EC) published its proposal for the creation of the European Health Data Space (EHDS). If adopted, the proposal would create an EU-wide infrastructure that allows practitioners, researchers, and industry to link health data sets. In its communication, the EC points to the need to promote “the free, cross-border flows of personal data” with the aim of creating an “internal market for personal health data and digital health products and services”.

Doctors in Germany, for example, would then be able to access the medical file of a Spanish patient currently undergoing treatment in Germany. In this context, it is worth noting that not all Member States maintain electronic patient records, meaning that the proposal would require some of them to take steps towards digitalization. With regard to researchers and industry, the underlying incentive of the proposal is to enable them to draw on available health data to create new solutions and drive innovation.

Nevertheless, health data are sensitive data within the meaning of the GDPR, which means that access to them is possible only by way of exception. This raises the question of whether and how the access to personal health data that the proposal intends to enable can be reconciled with the GDPR. Recently, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on this new legislative initiative, expressing several concerns about the proposal from a data protection perspective.

Taking the example of health data processed while accessing healthcare, the legal ground of art. 9 (2) (h) GDPR, namely medical diagnosis or the provision of health care, would be applicable. Further processing for any other purpose, however, would then require the data subject’s consent.

In the words of EDPB Chair Andrea Jelinek: “The EU Health Data Space will involve the processing of large quantities of data which are of a highly sensitive nature. Therefore, it is of the utmost importance that the rights of the European Economic Area’s (EEA) individuals are by no means undermined by this Proposal. The description of the rights in the Proposal is not consistent with the GDPR and there is a substantial risk of legal uncertainty for individuals who may not be able to distinguish between the two types of rights. We strongly urge the Commission to clarify the interplay of the different rights between the Proposal and the GDPR.”

Diving into the details of the joint opinion, the EDPB and EDPS strongly recommend making the secondary use of personal data stemming from wellness applications, such as wellness and behavioral data, subject to the prior consent of the data subject, should these data, contrary to the EDPB and EDPS’ recommendation, not be excluded from the scope of the proposal altogether.

That would not only be in line with the GDPR, but would also make it possible to differentiate between health data generated by wellness applications on the one hand and health data generated by medical devices on the other.

The fundamental difference between the two lies in their differing degrees of quality and in the fact that wellness applications also process, for instance, eating habits, which allows conclusions to be drawn about data subjects’ daily activities, habits, and practices.

DPC sends draft decision on Meta’s EU-US data transfers to other European DPAs

14. July 2022

On July 7, 2022, it became known that the Irish Data Protection Commission (DPC) had forwarded a draft decision concerning Meta’s EU-US data transfers to other European DPAs for consultation. The European DPAs have four weeks to comment on the draft or to raise objections to it. In that event, the DPC would be given an additional month to respond to the objections raised (Article 60 GDPR).

According to information available to Politico, the DPC intends to halt Meta’s EU-US transfers. The DPC is said to have concluded in its “own volition” draft decision that Meta can no longer rely on the SCCs when transferring its users’ personal data to US-based servers. In other words, even though Meta has implemented the EU’s SCCs, it cannot be ruled out that US intelligence services may gain access to the personal data of data subjects using Facebook, Instagram, and other Meta products.

Following the striking down of both the Safe Harbour Agreement in 2015 and the EU-US Privacy Shield in 2020 by the Court of Justice of the European Union, this draft decision appears to question the legality of EU-US data transfers and their compatibility with the GDPR for a third time.

In this context, it is worth considering a statement Meta made in its annual report to the United States Securities and Exchange Commission (SEC):

“If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe, which would materially and adversely affect our business, financial condition, and results of operations.”

Despite the possibility of a halt of Meta’s EU-US data transfers, there is reason to believe that this DPC-initiated procedure will continue well beyond the aforementioned four-week timeline. “We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC’s draft. This will lead to another draft and then a vote”, says NOYB’s Max Schrems, who filed the original complaint with the DPC. Hence, an immediate stop of Meta’s EU-US transfers seems rather unlikely. Instead, we can expect Article 65 GDPR to be triggered, meaning that the EDPB would be required to issue a final decision, including a vote, on the matter.

With no concrete EU-US transfer agreement in sight and ongoing uncertainty as to whether the DPC will eventually succeed with its draft decision, this matter remains of great interest.

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

7. July 2022

Over the last decades, AI has developed impressively in various fields. At the same time, with each step forward, the new systems and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is: how can the rise of AI be reconciled with the principle of data minimization contained in Art. 5 para. 1 lit. c) GDPR? At first glance, a reconciliation seems contradictory: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the wide scope conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures such as pseudonymization. This means that the data collected by the AI are stripped of any information that could link them to a specific individual without additional information, thus lowering the risks for individuals.
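To make the idea concrete, here is a minimal sketch of pseudonymization in Python. It is illustrative only: the GDPR does not prescribe any particular technique, and the key name and record fields below are invented for the example. The point is that re-identification requires the separately stored key, i.e. the “additional information” mentioned above.

```python
import hashlib
import hmac

# Secret key, stored separately from the data set (placeholder value).
# Without it, the pseudonyms below cannot be traced back to a person.
PSEUDONYMIZATION_KEY = b"keep-this-key-somewhere-else"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a repeatable, keyed pseudonym."""
    digest = hmac.new(PSEUDONYMIZATION_KEY, identifier.encode("utf-8"),
                      hashlib.sha256)
    return digest.hexdigest()[:16]

# A record is stripped of its direct identifier before entering an
# AI training pipeline; the analytically useful value is preserved.
record = {"patient": "jane.doe@example.com", "blood_pressure": 128}
record["patient"] = pseudonymize(record["patient"])
print(record)
```

Because the same identifier always maps to the same pseudonym, data sets can still be linked for analysis, while the risk to the individual is reduced to the risk of the key being compromised.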

The main issue with the European Union’s current legal framework for personal data protection is that certain parts have been left vague, which also causes uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.

One of the main features of this Act is that it divides applications of artificial intelligence into three main risk categories:

  1. Unacceptable risk: such AI systems are prohibited (e.g. systems that violate fundamental rights).
  2. High risk: subject to specific regulation.
  3. Low or minimal risk: no further regulation.

Regarding high-risk AIs, the AIA foresees the creation of post-market monitoring obligations. If the AI in question violates any part of the AIA, it can then be forcibly withdrawn from the market by the regulator.

This approach was welcomed in the joint opinion of the EDPB and EDPS, although the two bodies stated that the draft still needs to be aligned more closely with the GDPR.

Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the coming years as the proposal undergoes the EU’s legislative processes.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Currently, some amendments have been formulated and the draft is still under review by the Parliament. Once the Act has passed this scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act’s implementation? It is foreseen that national supervisory authorities will be established in each EU Member State. Furthermore, the AIA aims to establish a special European AI Board made up of representatives of both the Member States and the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.

Garante statement: use of Google Analytics violates GDPR

29. June 2022

On June 23, 2022, the Italian Data Protection Authority (Garante) released a statement on the use of Google Analytics (GA), holding the view that the use of GA by Italian websites without adequate safeguards violates the GDPR.

The Garante is thus the third data protection authority within the EU to declare the transfer of personal data through GA unlawful. Earlier this year, the French CNIL and the Austrian data protection authority each delivered a decision reaching the same conclusion, namely that the use of GA violates the GDPR.

The statement was prompted by a number of complaints the Garante had received. However, it is also the product of coordination with other European privacy authorities.

In its reasoning, the Garante assigns a special role to the cookies that help GA collect personal data, such as the IP address, the pages visited, the type of browser, and the kind of operating system. The Garante considers it proven that personal data are transferred to the US when GA is used. It reiterates that IP addresses qualify as personal data and that the pseudonymisation undertaken by GA is not sufficient to protect personal data from being accessed by US governmental agencies.
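For illustration, GA’s IP-masking feature truncates addresses rather than deleting them, roughly as in the sketch below (the exact prefix lengths are an assumption based on Google’s public documentation of the feature). Because a truncated address can still be combined with the other data points listed above to single out a visitor, it remains pseudonymised, not anonymised, personal data.

```python
import ipaddress

def truncate_ip(ip: str) -> str:
    """Zero out the host portion of an IP address: the last octet
    for IPv4, the last 80 bits for IPv6."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(truncate_ip("198.51.100.73"))        # -> 198.51.100.0
print(truncate_ip("2001:db8:abcd:12::1"))  # -> 2001:db8:abcd::
```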

The Garante called on all controllers and processors involved in operating Italian websites to achieve compliance and granted them a period of 90 days to meet their obligations under the GDPR. The statement further reads: “The Italian SA calls upon all controllers to verify that the use of cookies and other tracking tools on their websites is compliant with data protection law; this applies in particular to Google Analytics and similar services.”

EDPB adopts new guidelines on certification as a tool for transfers

23. June 2022

On June 16, 2022, the European Data Protection Board (EDPB) announced on its website that it had adopted guidelines on certification as a tool for transfers of personal data (publication is yet to take place, pending linguistic checks). Once published, these guidelines will undergo public consultation until September 2022.

First, these guidelines can be placed within the broader context of international data transfers, as envisioned by art. 46 (2) (f) GDPR. The certification mechanism comes into play only when an adequacy decision is absent. As is probably well known, art. 46 (2) GDPR outlines several safeguards that may be resorted to when personal data are transferred to third countries.

One of these is the voluntary certification mechanism laid down in arts. 42 and 43 GDPR, which allows accredited certification bodies or supervisory authorities to issue certifications, provided, of course, that controllers or processors have made binding and enforceable commitments. The EU legislators’ hope was to assist data subjects in quickly assessing “the level of data protection of relevant products and services” (Recital 100 GDPR) by way of certifications, seals, and marks.

In accordance with art. 42 (5) GDPR and Guidelines 1/2018 on certification, which the new guidelines are to complement, accredited certification bodies or supervisory authorities are competent to issue such certifications. It is important to note that these accredited certification bodies may very well be private bodies, subject to certain requirements and to prior approval by the Board or the supervisory authorities. The criteria on the basis of which certifications are issued are to be determined and approved by the Board or by the competent supervisory authorities (art. 42 (5) GDPR).

EDPB Deputy Chair Ventsislav Karadjov has called these yet-to-be-published guidelines “ground-breaking” while providing an outlook on their content. Among the most important aspects to be addressed are the accreditation requirements that certification bodies have to comply with, as well as the certification criteria attesting that appropriate safeguards for transfers are in place. It remains to be seen whether the guidelines will indeed provide more guidance on those aspects.

UK announces Data Reform Bill

31. May 2022

In 2021, the Department for Digital, Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country will remain aligned with the General Data Protection Regulation (GDPR).

Other than the statement itself, not much detail was provided. The accompanying briefing notes offered more information, setting out the main purposes of the Bill, namely:

  • Establishing a new pro-growth and trusted data protection framework
  • Reducing the burdens on businesses
  • Creating a world-class data rights regime
  • Supporting innovation
  • Driving industry participation in schemes that give citizens and small businesses more control over their data, particularly in relation to health and social care
  • Modernizing the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the stated goals remain rather superficial. Another concern is that the new bill could deviate too far from the GDPR; the new regime might then fail to retain the adequacy status with the EU that allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers (‘Q&A’) on the two sets of Standard Contractual Clauses (‘SCCs’) it adopted on 4 June 2021, covering controllers and processors (‘the Controller-Processor SCCs’) and third-country data transfers (‘the Data Transfer SCCs’) respectively. The Q&A are intended to provide practical guidance on the use of the SCCs and are based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption.

Specifically, 44 questions are addressed, including questions on contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause. In addition, the Q&A contains a dedicated section for each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. On this point, the Q&A notes that the Commission is developing an additional set of SCCs for this scenario, which will take into account the requirements that already apply directly to those controllers and processors under the GDPR.

In addition, the Q&A includes a section on the obligations of data importers and exporters, addressing in particular the SCC liability scheme. It states that other provisions in the broader (commercial) contract (e.g., specific rules for the allocation of liability or caps on liability between the parties) may not contradict or undermine the liability schemes of the SCCs.

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

Twitter fined $150m for handing users’ contact details to advertisers

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was ostensibly collected for purposes such as two-factor authentication. However, in addition to the above-mentioned purposes, Twitter used the data to allow advertisers to target specific groups of users by matching phone numbers and email addresses with the advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 


CJEU considers representative actions admissible

29. April 2022

According to a press release of the Court of Justice of the European Union (CJEU), associations can bring legal proceedings against companies for infringements of data protection law.

This is the conclusion the Court reached in a decision on proceedings brought by the Federation of German Consumer Organisations (vzbv), which challenged Facebook’s data protection practices. Accordingly, a consumer protection association may bring legal proceedings, in the absence of a mandate conferred on it for that purpose and independently of the infringement of specific rights of the data subjects, against the person allegedly responsible for an infringement of the laws protecting personal data. The vzbv is an institution entitled to bring such proceedings under the GDPR because it pursues an objective in the public interest.

Specifically, the case concerns third-party games on Facebook, in which users must agree to the use of their data in order to play. According to the association, Facebook did not inform the data subjects in a precise, transparent, and understandable form about the use of the data, as prescribed by the General Data Protection Regulation (GDPR). The German Federal Court of Justice (BGH) had already come to this conclusion in May 2020; however, it was not considered sufficiently clarified whether the association was entitled to bring legal proceedings in this case.

The CJEU’s Advocate General had previously reached the same conclusion in a legally non-binding opinion.

The CJEU has thus confirmed this view, so the BGH must now issue a final decision in the case of vzbv v. Facebook. Importantly, this decision also opens the door to similar collective actions against other companies.

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date of ca. €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recorded audio of customer calls using artificial intelligence-based speech signal processing software. The software evaluated each call against a list of keywords, assessed the emotional state of the caller, and on that basis established a ranking of the calls serving as a recommendation as to which customers should be called back as a priority.
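The annual report does not disclose how the software actually computed its ranking. Purely to make the mechanism concrete, the following Python sketch shows how a keyword list and an emotion score could, hypothetically, be combined into a callback priority; all names, weights, and scores are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical keyword weights; the real software's parameters
# were not disclosed in the NAIH report.
KEYWORDS = {"cancel": 3.0, "complaint": 2.5, "refund": 1.5}

@dataclass
class Call:
    call_id: str
    transcript: str
    negative_emotion_score: float  # 0.0 (calm) .. 1.0 (upset)

def priority(call: Call) -> float:
    """Combine keyword hits and emotion score into a callback priority."""
    keyword_score = sum(weight for kw, weight in KEYWORDS.items()
                        if kw in call.transcript.lower())
    return keyword_score + 5.0 * call.negative_emotion_score

calls = [
    Call("A-1", "I want a refund for this charge", 0.4),
    Call("A-2", "Please cancel my account, this is a complaint", 0.9),
]
# Rank calls so the customers deemed most at risk are called back first.
for call in sorted(calls, key=priority, reverse=True):
    print(call.call_id, round(priority(call), 2))
```

Even such a simple scheme illustrates why the Authority saw profiling at work: the ranking is an automated assessment of each individual caller’s state and behavior.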

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank, this procedure aimed at quality control, and in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities only in general terms and that no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its capacity for profiling and assessment, the impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that the legal basis of legitimate interest cannot serve as a “last resort” when all other legal bases are inapplicable; data controllers cannot rely on it at any time and for any reason. Consequently, the Authority not only imposed a record fine but also required the bank to stop analyzing emotions in the context of speech analysis.

