Category: GDPR

265 million euro fine for Meta

29. November 2022

The Irish Data Protection Commission (DPC) imposed an administrative fine of 265 million euros on Facebook's parent company Meta as a result of the unlawful publication of personal data.

Investigation proceedings

After personal data of up to 533 million Facebook and Instagram users from over 100 countries appeared online in April 2021, the DPC launched an investigation. As part of the investigation process, it cooperated with the other European data protection authorities and examined the Facebook Search, Facebook Messenger Contact Importer and Instagram Contact Importer tools. With the help of these tools, contacts stored on a smartphone can be imported into the Instagram or Facebook app in order to find friends or acquaintances.

Lack of technical and organisational measures to protect data

As part of its investigation, the DPC dealt with the so-called technical and organisational measures according to Article 25 GDPR. According to data protection law, data controllers must use such measures to ensure that the rights of data subjects are extensively protected. These include, for example, pseudonymisation and encryption of personal data, but also physical protection measures or the existence of reliable backups.
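What such measures can look like in practice is sketched below in Python. This is an illustrative sketch only: the field names and key handling are assumptions, since Articles 25 and 32 GDPR do not prescribe any particular technique.

# Illustrative sketch: keyed pseudonymisation plus field-level encryption.
# Key material would normally live in a key management system, stored
# separately from the pseudonymised data set.
import hmac
import hashlib
from cryptography.fernet import Fernet  # third-party package: cryptography

PSEUDONYM_KEY = b"example-secret-kept-elsewhere"  # assumption: managed out of band
fernet = Fernet(Fernet.generate_key())

def pseudonymise(identifier: str) -> str:
    # Replace a direct identifier with a keyed hash; without the key,
    # the pseudonym cannot be linked back to the individual.
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def encrypt_field(value: str) -> bytes:
    # Encrypt a field so that a leaked database extract alone is unreadable.
    return fernet.encrypt(value.encode())

record = {
    "user": pseudonymise("alice@example.com"),
    "phone": encrypt_field("+353 1 234 5678"),
}
print(record)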

The DPC did not consider Meta’s technical and organisational measures to be sufficient. Therefore, in addition to the aforementioned fine of 265 million euros, it issued a reprimand as well as an order to bring the processing operations into compliance with data protection law within a certain period of time and to implement a number of specific remedial measures to this end.

Not the first fine for Meta

Meta is by now familiar with fines from European data protection authorities. In total, the company has already been fined almost one billion euros, most recently in September in the amount of 405 million euros for serious data protection violations involving underage Instagram users. The reason for the considerable amount of the individual sanctions is Article 83 GDPR, according to which fines can amount to up to four percent of a company's total worldwide annual turnover. Meta has appealed against each of the previous decisions, so it can be assumed that in this case, too, Meta will not accept the fine without a judicial review.
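For a sense of scale, the Article 83(5) ceiling can be sketched in a few lines of Python; the turnover figure below is a purely hypothetical assumption, not Meta's actual accounts.

# Hypothetical illustration of the Art. 83(5) GDPR ceiling: the higher of
# EUR 20 million or 4 % of total worldwide annual turnover of the preceding
# financial year.
annual_turnover_eur = 100_000_000_000  # assumed turnover, for illustration only
max_fine_eur = max(20_000_000, 0.04 * annual_turnover_eur)
print(f"Theoretical maximum fine: EUR {max_fine_eur:,.0f}")  # EUR 4,000,000,000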

Italian DPA launches investigation into cookie walls and paywalls

27. October 2022

On October 21st, 2022, the Italian Data Protection Authority launched an investigation into the use of cookie walls by several online newspapers. Although the GDPR allows the implementation of cookie walls and paywalls (withholding a website's content unless cookies have been accepted or a certain amount of money has been paid), the Italian watchdog will take a closer look at whether these have been implemented correctly and do not violate the European regulation.

Further information is yet to be released by the authorities.

TikTok faces huge fine from Britain’s ICO

12. October 2022

The Chinese social media company has lately been the subject of an investigation by the British data protection watchdog, the Information Commissioner's Office (ICO). The investigation has so far concluded that the network breached the United Kingdom's data protection laws, in particular the rules concerning children's personal data. The Authority therefore issued a notice of intent, a potential precursor to a fine of up to 27 million pounds.

In particular, the Authority found that the platform may have processed personal data of children under the age of 13 without obtaining parental consent. These data allegedly also include special category data, which enjoy special protection under Art. 9 GDPR.

Furthermore, in the ICO's opinion the platform did not respect the principle of transparency, as it failed to provide complete or transparent information on how the data were collected and processed.

The ICO's investigation is still ongoing, as the Commissioner's Office has yet to decide whether there has been a breach of data protection law and whether to impose the fine.

The protection of teenagers and children is the top priority of the ICO according to the current Information Commissioner, John Edwards. Under his guidance, the ICO has several ongoing investigations targeting various tech companies that may be breaking the UK's data protection laws.

This is not the first time TikTok has come under scrutiny from data protection watchdogs. In July, a US-Australian cybersecurity firm found that TikTok gathers excessive amounts of information from its users and voiced concern over its findings. Based on these precedents, local data protection authorities may well step up their efforts to monitor TikTok's compliance with local laws and, in Europe, with the GDPR.

Danish watchdogs ban Google Chromebooks and Google Workspace in municipality

26. August 2022

In July 2022, after an investigation related to a data breach was carried out by the Danish Data Protection Authority (Datatilsynet), Google Chromebooks and Google Workspace were banned in schools in the municipality of Helsingor. The DPA ruled that the risk assessment carried out by city officials shows that the processing of personal data by Google does not meet GDPR requirements. In particular, data transfers were targeted by the Authority: the Data Processing Agreement allows data transfers to third countries for analytical and statistical support, even though the data are primarily stored in Google's European facilities.

This decision comes at a moment of tension between Europe and the United States of America over personal data: other notorious cases (some still ongoing) are the Irish Data Protection Authority vs. Facebook (now part of Meta Inc.) and the German Federal Cartel Office vs. Facebook. European watchdogs have found that in many cases the American tech giants' policies do not meet the requirements established by the GDPR. This can be traced back to the lack of a comprehensive legal framework for privacy and personal data protection in the United States, where these companies are based.

This decision was taken in the aftermath of the Schrems II ruling by the European Court of Justice, which held that the pre-existing agreement on data transfers between Europe and the US (the so-called Privacy Shield) was not compatible with the GDPR. A new agreement is on the table, but it has been neither approved nor put into effect.

Google is becoming the target of various investigations by European data watchdogs, above all because of its tool Google Analytics. In January the Austrian Data Protection Authority published an opinion in which it stated that companies using Google Analytics inadvertently transferred customers’ personal data such as IP addresses to the United States, in breach of the GDPR. Italy’s Garante per la Protezione dei Dati Personali published a similar opinion a few weeks later, stating that “the current methods adopted by Google do not guarantee an adequate level of protection of personal data”.

European Data Protection Board adopts a dispute resolution decision in the context of Instagram

17. August 2022

In early December 2021, the Irish Data Protection Commission (DPC), in its capacity as lead supervisory authority responsible for overseeing Instagram (Meta), sent a draft decision to the other European supervisory authorities in line with Art. 60(3) GDPR. In this draft decision, the DPC expressed its concerns regarding Instagram's compliance with several GDPR provisions, notably Art. 5(1)(a) and (c), 6(1), 12(1), 13, 24, 25 and 35 GDPR.

The lead supervisory authority specifically raised the issue of the public disclosure of children's personal data, such as e-mail addresses and phone numbers, resulting from their use of the Instagram business account feature.

The other supervisory authorities, however, did not fully agree with the draft decision and issued objections in accordance with Art. 60(4) GDPR. As no consensus could be reached on some of the objections, Art. 65(1)(a) GDPR, which lays down the dispute resolution procedure, became applicable. Consequently, the lead supervisory authority, the DPC, was required to ask the European Data Protection Board (EDPB) to adopt a binding decision.

On July 29, 2022, the EDPB announced that it had adopted a dispute resolution decision following these objections. It is now up to the DPC to adopt its final decision and communicate it to the controller. The DPC has one month to issue this final decision, which must be based on the EDPB's decision.

The Commission’s Proposal for the European Health Data Space raises data protection concerns

21. July 2022

On May 3, 2022, the European Commission (EC) published its proposal for the creation of the European Health Data Space (EHDS). If adopted, the proposal would create an EU-wide infrastructure that allows health data sets to be linked for practitioners, researchers, and industry. In its communication, the EC points to the necessity of promoting “the free, cross-border flows of personal data” with the aim of creating an “internal market for personal health data and digital health products and services”.

Doctors in Germany, for example, would then be able to access the medical file of a Spanish patient who is currently undergoing medical treatment in Germany. In this context, it is worth noting that not all Member States maintain electronic patient records, with the consequence that the proposal would require certain Member States to take steps towards digitalisation. With regard to researchers and industry, the underlying aim of the proposal is to enable them to draw on the health data available in order to create new solutions and drive innovation.

Nevertheless, health data are sensitive data within the meaning of the GDPR, which means that access to such data is possible only in exceptional cases. This raises the question of whether and how the access to personal health data that the proposal intends to enable can be reconciled with the GDPR. Recently, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on this new legislative initiative, expressing several concerns about the proposal from a data protection perspective.

If one takes the example of health data processed while accessing healthcare, the legal ground of Art. 9(2)(h) GDPR, namely medical diagnosis or the provision of health care, would be applicable. Further processing for any other purpose, however, would then require the data subject's consent.

In the words of EDPB Chair Andrea Jelinek: “The EU Health Data Space will involve the processing of large quantities of data which are of a highly sensitive nature. Therefore, it is of the utmost importance that the rights of the European Economic Area’s (EEA) individuals are by no means undermined by this Proposal. The description of the rights in the Proposal is not consistent with the GDPR and there is a substantial risk of legal uncertainty for individuals who may not be able to distinguish between the two types of rights. We strongly urge the Commission to clarify the interplay of the different rights between the Proposal and the GDPR.”

Diving into the details of the joint opinion, the EDPB and EDPS strongly recommend that any secondary use of personal data stemming from wellness applications, such as wellness and behavioural data, be made subject to the prior consent of the data subject, should these data, contrary to the EDPB and EDPS' primary recommendation, not be excluded from the scope of the proposal altogether.

That would not only be in line with the GDPR, but would also make it possible to differentiate between health data generated by wellness applications, on the one hand, and health data generated by medical devices, on the other.

The fundamental difference between the two types of data lies in their differing quality and in the fact that wellness applications also process, for instance, eating habits, which allows conclusions to be drawn about data subjects' daily activities, habits, and practices.

DPC sends draft decision on Meta’s EU-US data transfers to other European DPAs

14. July 2022

On July 7, 2022, it became known that the Irish Data Protection Commission (DPC) had forwarded a draft decision concerning Meta's EU-US data transfers to the other European DPAs for consultation. Within a four-week period, the European DPAs may comment on this draft or formulate objections to it. In that event, the DPC would be given an additional month to respond to the objections raised (Article 60 GDPR).

According to information available to Politico, the DPC intends to halt Meta's EU-US transfers. The DPC is said to have concluded in its own-volition draft decision that Meta can no longer rely on the SCCs when it transfers its users' personal data to US-based servers. In other words, even though Meta has implemented the EU's SCCs, it cannot be ruled out that US intelligence services may gain access to the personal data of data subjects using Facebook, Instagram, and other Meta products.

Following the striking down of both the Safe Harbour Agreement in 2015 and the EU-US Privacy Shield in 2020 by the Court of Justice of the European Union, this draft decision seems to question the legality and compatibility of EU-US data transfers with the GDPR for a third time.

In this context, it is worth considering a statement Meta made in its annual report to the United States Securities and Exchange Commission (SEC):

“If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe, which would materially and adversely affect our business, financial condition, and results of operations.”

Despite the possibility of a halt to Meta's EU-US data transfers, there is reason to believe that this DPC-initiated procedure will continue and will extend beyond the four-week timeline mentioned above. “We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC's draft. This will lead to another draft and then a vote”, says NOYB's Max Schrems, who filed the original complaint with the DPC. Hence, an immediate stop to EU-US transfers seems rather unlikely. Instead, Article 65 GDPR can be expected to be triggered, meaning that the EDPB would be required to issue a final decision, including a vote, on the matter.

With no concrete EU-US transfer agreement in sight and ongoing uncertainty over whether the DPC will eventually succeed with its draft decision, this matter continues to be of great interest.

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

7. July 2022

In recent decades, AI has developed impressively in various fields. At the same time, with each step forward, the new systems and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is how the rise of AI can be reconciled with the principle of data minimisation contained in Art. 5 para. 1 lit. c) GDPR. At first glance, the two seem hard to reconcile: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the wide scope conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures such as pseudonymisation. This means that the data collected by the AI are stripped of any information that could link them to a specific individual without additional information, thus lowering the risks for individuals.
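As a purely illustrative sketch (the column names and token scheme are assumptions, not taken from the study), pseudonymising records before they feed a model could look roughly like this in Python:

# Minimal sketch: direct identifiers are swapped for random tokens before the
# data are used; the lookup table is kept separately, under access control.
import secrets

lookup = {}  # identifier -> token; stored apart from the training data

def tokenise(identifier: str) -> str:
    # Re-identification is only possible with the separately stored lookup table.
    if identifier not in lookup:
        lookup[identifier] = secrets.token_hex(8)
    return lookup[identifier]

raw_rows = [
    {"email": "alice@example.com", "age_band": "30-39", "clicks": 12},
    {"email": "bob@example.com", "age_band": "40-49", "clicks": 7},
]

# The training set keeps only the pseudonym and the features the model needs.
training_rows = [
    {"user": tokenise(r["email"]), "age_band": r["age_band"], "clicks": r["clicks"]}
    for r in raw_rows
]
print(training_rows)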

The main issue with the European Union's current legal framework for personal data protection is that certain parts have been left vague, which also causes uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.

One of the main features of this Act is that it divides applications of artificial intelligence into three main risk categories:

  1. Creating an unacceptable risk, thus prohibited AIs (e.g. systems that violate fundamental rights).
  2. Creating a high risk, subject to specific regulation.
  3. Creating a low or minimum risk, with no further regulation.

Regarding high-risk AIs, the AIA foresees the creation of post-market monitoring obligations. If the AI in question violates any part of the AIA, it can then be forcibly withdrawn from the market by the regulator.

This approach has been welcomed by the Joint Opinion of the EDPB – EDPS, although the two bodies stated that the draft still needs to be more aligned with the GDPR.

Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the course of the next years as the proposal is undergoing the legislative processes of the EU.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny from the European Parliament and the Council of the European Union. Currently, some amendments have been formulated and the draft is still under review by the Parliament. Once the Act has passed this scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act's implementation? It is foreseen that national supervisory authorities shall be established in each EU Member State. Furthermore, the AIA aims at establishing a special European AI Board made up of representatives of both the Member States and the European Commission, which will also chair the Board. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.

Garante statement: use of Google Analytics violates GDPR

29. June 2022

On June 23, 2022, the Italian Data Protection Authority (Garante) released a statement on the use of Google Analytics (GA), holding the view that the use of GA by Italian websites without additional safeguards violates the GDPR.

The Garante is the third data protection authority within the EU to declare the transfer of personal data through GA unlawful. Earlier this year, the French CNIL and the Austrian data protection authority each delivered a decision coming to the same conclusion, namely that the use of GA violates the GDPR.

The statement was prompted by a number of complaints received by the Garante, but it is also the product of coordination with other European privacy authorities.

In its reasoning, the Garante assigns a special role to the cookies through which GA collects personal data such as the IP address, pages visited, browser type, and operating system. The Garante considers it proven that personal data are transferred to the US when GA is used. It reiterates that IP addresses qualify as personal data and that the pseudonymisation undertaken by GA is not sufficient to protect personal data from being accessed by US governmental agencies.
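For illustration only, the kind of last-octet IP truncation commonly associated with GA's “IP anonymisation” can be sketched in a few lines of Python; the details of GA's actual processing are an assumption here, not taken from the Garante's statement, whose view is that such pseudonymisation is not sufficient.

# Minimal sketch of last-octet IPv4 truncation.
def truncate_ipv4(ip: str) -> str:
    # Zero the last octet, e.g. 203.0.113.42 -> 203.0.113.0
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

print(truncate_ipv4("203.0.113.42"))  # still narrows a visitor down to a /24 network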

The Garante called on all controllers and processors involved in operating Italian websites to ensure compliance and set a period of 90 days for them to bring their processing into line with their obligations under the GDPR. The statement further notes: “The Italian SA calls upon all controllers to verify that the use of cookies and other tracking tools on their websites is compliant with data protection law; this applies in particular to Google Analytics and similar services.”

EDPB adopts new guidelines on certification as a tool for transfers

23. June 2022

On June 16, 2022, the European Data Protection Board (EDPB) announced on its website that it had adopted guidelines on certification as a tool for transfers of personal data (publication is yet to take place following linguistic checks). Once published, these guidelines will undergo public consultation until September 2022.

First, these guidelines should be placed within the broader context of international data transfers, as envisaged by Art. 46(2)(f) GDPR. The certification mechanism only comes into play when an adequacy decision is absent. As is well known, Art. 46(2) GDPR outlines several safeguards that may be relied upon when personal data are transferred to third countries.

One of these is the voluntary certification mechanism, as laid down by art. 42/43 GDPR, that allows accredited certification bodies or supervisory authorities to issue certifications, provided, of course, that controllers or processors have made binding and enforceable commitments. What the EU legislators hoped was to assist data subjects in quickly assessing “the level of data protection of relevant products and services” (Recital 100 GDPR) by way of certifications, seals, and marks.

In accordance with Art. 42(5) GDPR and Guidelines 1/2018 on certification, which the new guidelines are to complement, accredited certification bodies or supervisory authorities are competent to issue such certifications. It is important to note that these accredited certification bodies may very well be private bodies, subject to certain requirements and prior approval by the Board or the supervisory authorities. The criteria on the basis of which certifications are issued are to be determined and approved by the Board or by the competent supervisory authorities (Art. 42(5) GDPR).

According to EDPB Deputy Chair Ventsislav Karadjov, these yet-to-be-published guidelines are “ground-breaking”; he also provided an outlook on their content. One of the most important aspects to be covered is the accreditation requirements that certification bodies have to comply with, as well as the certification criteria attesting that appropriate safeguards for transfers are in place. It remains to be seen whether these guidelines will indeed provide more guidance on those aspects.
