Category: General

265 million euro fine for Meta

29. November 2022

The Irish Data Protection Commission (DPC) imposed an administrative fine of 265 million euros on Facebook's parent company Meta as a result of the unlawful disclosure of personal data.

Investigation proceedings

After the personal data of up to 533 million Facebook and Instagram users from over 100 countries appeared online in April 2021, the DPC launched an investigation. As part of the investigation, it cooperated with the other European data protection authorities and examined the Facebook Search, Facebook Messenger Contact Importer and Instagram Contact Importer tools. With the help of these tools, contacts stored on a smartphone can be imported into the Instagram or Facebook app in order to find friends or acquaintances.

Lack of technical and organisational measures to protect data

As part of its investigation, the DPC dealt with the so-called technical and organisational measures under Article 25 GDPR. Under data protection law, data controllers must use such measures to ensure that the rights of data subjects are comprehensively protected. These include, for example, pseudonymisation and encryption of personal data, but also physical protection measures or the existence of reliable backups.
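By way of illustration only, here is a minimal Python sketch of one such technical measure, encryption of a personal data field at rest; the field name and the use of the third-party cryptography package are assumptions made for the example and are not taken from the DPC's decision.

```python
# Illustrative sketch: encrypting a single personal data field at rest.
# Assumes the third-party "cryptography" package is installed (pip install cryptography).
from cryptography.fernet import Fernet

# In practice, the key would be kept in a key management system, separate from the data store.
key = Fernet.generate_key()
fernet = Fernet(key)

email_plain = "jane.doe@example.com"  # hypothetical personal data
email_encrypted = fernet.encrypt(email_plain.encode("utf-8"))

# Only holders of the key can recover the original value.
assert fernet.decrypt(email_encrypted).decode("utf-8") == email_plain
```

A stolen copy of the database alone would then not expose the field; the effectiveness of such a measure stands and falls with where and how the key is stored.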

The DPC did not consider Meta’s technical and organisational measures to be sufficient. Therefore, in addition to the aforementioned fine of 265 million euros, it issued a reprimand as well as an order to bring the processing operations into compliance with data protection law within a certain period of time and to implement a number of specific remedial measures to this end.

Not the first fine for Meta

Meta is by now familiar with fines from European data protection authorities. In total, the company has already been fined almost one billion euros, most recently in September in the amount of 405 million euros for serious data protection violations involving underage Instagram users. The considerable size of the individual sanctions is made possible by Article 83 GDPR, under which fines can amount to up to four percent of a company’s total worldwide annual turnover. Meta has appealed against each of the previous decisions, so it can be assumed that in this case, too, Meta will not accept the fine without judicial review.

Spanish DPA publishes new tool for assessing the notifiability of data breaches

2. November 2022

A few days ago, the Spanish Data Protection Authority launched a new tool called “Asesora Brecha” to simplify the assessment of whether data breaches must be notified. This was deemed necessary due to the large number of data breaches reported in the country.

This tool helps data controllers as well as data protection officers to decide whether they should notify a personal data breach to the supervisory authority and how such breaches can be avoided. Specifically, it helps answer:

  • Who has to notify the supervisory authority
  • Which situations constitute a data breach and which do not
  • Which is the competent authority

The tool is described as free and easy to use. It has also been added to the Decalogue of AEPD help resources in order to promote and facilitate compliance with the GDPR. With regard to the principle of storage limitation, the tool itself is GDPR-compliant: once the procedure is complete, all data provided are automatically deleted.

However, the Spanish DPA clearly stated that the use of “Asesora Brecha” does not automatically mean that the obligations imposed by the GDPR are fulfilled. The responsible party still needs to complete the relevant documentation and, if necessary, report the data breach to the authorities.

KINAST is ranked among the Top 5 Data Protection Law Firms in Germany

28. October 2022

We are very pleased about our renewed top placement in this year’s ranking of the Kanzleimonitor* study 2022-23 and would like to thank all clients who recommended us!

In the field of Data Protection Law, we achieved 5th place with numerous direct recommendations. Our firm thus once again holds its own in the absolute top group in Data Protection Law, in a strong field of competitors that includes several large law firms (such as Taylor Wessing and Osborne Clarke).

Three of our Attorneys are also mentioned by name in the current ranking of personal recommendations: Kristin Bauer, Dr. Karsten Kinast and Benjamin Schuh.

We are particularly pleased with this study result, as it is a transparent, direct evaluation by our clients and is carried out within our own professional community of lawyers.

Many thanks again to all clients who have recommended us (again)!

*The German Kanzleimonitor study (law firm monitor) (“kanzleimonitor.de – recommendation is the best reference”) provides an annual comprehensive ranking of the 100 most recommended lawyers and law firms in each legal field in Germany. This overview is intended to serve corporate lawyers in all industries as a selection criterion for mandating commercial law firms.

Another 20 million Euro fine for Clearview AI

The French data protection authority CNIL imposed a fine of 20 million euros on Clearview AI, the latest in a line of authorities to deem the biometrics company’s processing activities unlawful under data protection law.

Clearview AI is a US company that extracts photographs and videos that are directly accessible online, including on social media, in order to feed its biometric image database, which it prides itself on being the biggest in the world. Access to the search engine based on this database is offered to law enforcement authorities.

The case

The decision followed several complaints from data subjects in 2020, which led to the CNIL’s investigations and a formal notice to Clearview AI in November 2021 to “cease the collection and use of data of persons on French territory in the absence of a legal basis” and “facilitate the exercise of individuals’ rights and to comply with requests for erasure.” However, the company did not react to this notice within the two-month deadline imposed by the CNIL. Therefore, the authority imposed not only the fine but also an order to Clearview AI “to stop collecting and processing data of individuals residing in France without a legal basis and to delete the data of these persons that it had already collected, within a period of two months.” In addition, it set a “penalty of 100,000 euros per day of delay beyond these two months.”

CNIL based its decision on three breaches. First, Clearview AI had processed the data without a legal basis. Given the “intrusive and massive nature of the process which makes it possible to retrieve the images present on Internet of the millions of internet users in France”, Clearview AI had no legitimate interest in the data processing. Second, the CNIL sanctioned Clearview AI’s inadequate handling of data subjects’ requests. Lastly, it penalized the company’s failure to cooperate with the CNIL.

The impact of the decision

For over two years, Clearview AI has been under the scrutiny of data protection authorities (“DPA”s) all over the world. So far, it has been fined more than 68 million Euros in total. Apart from CNIL’s fine, there have been fines of 20 million Euros by Greece’s Hellenic DPA in July 2022, over 7.5 million pounds by the UK Information Commissioner’s Office in May 2022 and 20 million Euros by the Italian Garante in March 2022.

CNIL’s decision is likely not the last one, considering the all-encompassing nature of Clearview AI’s collection of personal data, which, given the company’s business model, inevitably concerns EU data subjects. Whether the company will comply within the two-month period remains to be seen.

UN Report on privacy and data protection as an increasingly precious asset in the digital era

UN Special Rapporteur on the right to privacy Ana Brian Nougrères published a report in which she laid out ten guiding principles “as a key structural part of every national legal system that regulate the actions of controllers and processors in the processing of personal data”.

According to the Special Rapporteur, “privacy is a human right that enables the free development of personality and the exercise of rights in accordance with the dignity of the human being […]. But today, we live in a world where participating in public and private activity at the national and international level requires more and more personal data to be processed”. Her goal is to achieve “cooperation and regulatory harmonization at the international level”. While many States regulate data protection and privacy issues nationally, international law enshrines the right to privacy in Article 12 of the Universal Declaration of Human Rights. The Special Rapporteur indicated that national legislation already has much in common regarding the principles of privacy and data protection which can “serve as a basis for progressing towards a global consensus that will make it possible to address various challenges that arise in the processing and international transfer of data concerning individuals to ensure that their right to privacy is safeguarded in both virtual and face-to-face environments”.

The ten key principles analyzed are legality, consent, transparency, purpose, loyalty, proportionality, minimization, quality, responsibility, and security – hardly news from an EU perspective. This is no coincidence, as the Special Rapporteur used several supranational legal frameworks, including the GDPR, as a basis for her analysis. This shows once more that a purely Eurocentric view of privacy and data protection is ill-advised, as other parts of the world may not find these principles quite as self-evident. With her report, the Special Rapporteur wishes to encourage and guide States “to strike a balance between the different conflicting interests in the processing of personal data and the right to privacy in the global and digital era”.

Microsoft data leak allegedly affected over 65,000 entities worldwide

Sensitive customer data was openly accessible on the internet via an incorrectly configured Microsoft server. After security researchers from the threat intelligence firm SOCRadar informed the company about the data leak on September 24, 2022, the server was secured, Microsoft announced on October 19, 2022. 

According to Microsoft, an “unintentional misconfiguration on an endpoint that is not in use across the Microsoft ecosystem” “resulted in the potential for unauthenticated access to some business transaction data corresponding to interactions between Microsoft and prospective customers, such as the planning or potential implementation and provisioning of Microsoft services.” The business transaction data that was leaked included “names, email addresses, email content, company name, and phone numbers, and may have included attached files relating to business between a customer and Microsoft or an authorized Microsoft partner.” 

While SOCRadar claims that the breach affected data of over 65,000 entities in 111 countries and covers data from 2017 to 2022, Microsoft stated that the scope of the issue had been “greatly exaggerated”. Furthermore, Microsoft does not appreciate SOCRadar’s release of a public search tool and suggests that the tool does not meet basic data protection and privacy standards.

Whether those numbers were indeed exaggerated or if Microsoft is trying to downplay the breach is difficult to judge from the outside. 

EDPS takes legal action against Europol’s new regulation

27. September 2022

On June 28th 2022, two new provisions of the amended Europol Regulation came into force. These changes are considered worrying by the European Data Protection Supervisor (EDPS), as they have a direct impact on the data processing of individuals in the European Union: based on these provisions, the new regulation allows Europol to retroactively process large volumes of data, even of individuals with no links to criminal activity.

Specifically, before these new provisions were passed, individuals could expect that if their data was gathered by Europol, it would be processed within six months in order to establish whether they were involved in illicit activities, and that, if not, the data relating to them would be deleted. With these modifications, Europol is allowed to store and process this data even if the individual was found not to be part of any wrongdoing.

In an effort to stop these changes from effectively coming into force, the EDPS issued an order on January 3rd 2022 demanding that the new provisions be amended to include a precisely determined deletion period for data related to individuals not connected to unlawful activities. As the order was ignored by Europol, on September 16th the EDPS requested that the European Court of Justice (ECJ) annul these two provisions. The authority stated that this approach by Europol is a clear violation of individuals’ fundamental rights.

Furthermore, by overriding a direct order of the European data protection watchdog and by introducing such amendments, the independent supervisory power of the authority is undermined. This could set a dangerous precedent in which authorities in the European Union have to reckon with the legislative power overriding their supervisory activities depending on political will. This would result in a clear violation of the European Charter of Fundamental Rights, since there would be a concrete risk of undermining the independence of a supervisory authority by making it subject to undue political pressure or interference.

Individual brings action against the European Commission before the Court of Justice of the European Union

27. July 2022

A German citizen brought an action against the European Commission (the Commission) before the Court of Justice of the European Union claiming that the Commission is involved in illegal international data transfers to the US.

The subject matter of the action, which was recently admitted by the Court, relates to data processing carried out in the context of the web page “future.europa.eu”, a platform intended to increase citizens’ engagement with the EU.

In his complaint, which was drafted by EuGD, a German data protection organisation, he alleges, amongst other things, that upon accessing said website and enabling a Facebook login, personal data such as users’ IP addresses is transferred to US cloud and web hosting providers. According to the organisation’s press release, the action’s allegations of illegal transfers are also grounded in the Schrems II judgment.

It should be noted that personal data processing by EU institutions and bodies does not fall under the scope of the GDPR; instead, it is governed by another regulation, namely Regulation (EU) 2018/1725 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data.

Even though the GDPR does not apply to the Commission, Regulation 2018/1725 does refer to the GDPR in the context of international data transfers to third countries (e.g. recital 65), and it is not too far-fetched to hold the view that the ruling in Schrems II will indeed extend to this regulation.

One should also remember Recital 5 of Regulation 2018/1725 that reads the following:

Whenever the provisions of this Regulation follow the same principles as the provisions of Regulation (EU) 2016/679, those two sets of provisions should, under the case law of the Court of Justice of the European Union (the ‘Court of Justice’), be interpreted homogeneously, in particular because the scheme of this Regulation should be understood as equivalent to the scheme of Regulation (EU) 2016/679.

The claimant also alleges that the Commission did not duly respond to his access request in which he requested information on the data processed and about the safeguards in place. He specifically alleges that one request was not answered properly and that the other one was left unanswered at first.

The action, which questions the legality of European web pages that use US web hosts and enable Facebook log-ins, comes at an interesting moment in time. Not too long ago, the compatibility of Facebook/Meta data transfers with the GDPR was challenged by the DPC when it recommended halting EU-US transfers for Meta products for failing to comply with the GDPR.

The founder of the organization that is assisting the legal action told EURACTIV “that if a restaurant or a bakery has to figure out a way to comply with the ban on data transfers to the United States, so does the European Commission, as there cannot be double standards.”

European Parliament adopts Digital Services Act and Digital Markets Act

7. July 2022

On July 5, 2022, the EU Parliament voted in favor of the long-awaited Digital Markets Act (DMA) and Digital Services Act (DSA) following trilogue talks and agreements held between Parliament, Council, and European Commission earlier this year.

While the DSA, which amends the e-Commerce Directive, strictly prohibits specific forms of targeted advertising and misleading practices, the DMA can be viewed as the competition law component of the Commission’s Digital Services Package, setting out stricter obligations for large online platforms.

Upon entry into force, advertisements targeting children, advertisements based on sensitive data, and dark patterns will no longer be permitted. Further, online platforms need to provide their users with the option not to receive recommendations based on profiling. The DSA also seeks to strengthen platforms’ accountability and transparency. This means that these platforms have to provide authorities and vetted researchers with access to information on the content moderation rules the respective platform uses as well as information on the algorithms used by recommender systems.

The spread of illegal content, such as hate speech, is also addressed by this legislation, which obliges large platforms to respond quickly, with due regard to the other fundamental rights implicated.

Online platforms and other service providers that do not respect the new obligations may be fined up to 10% of their total annual turnover for violations of the DMA, and up to 6% for violations of the DSA.

Artificial Intelligence and Personal Data: a difficult coexistence. A new perspective for the EU

In recent decades, AI has developed impressively in various fields. At the same time, with each step forward, the new systems and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is: how can the rise of AI be reconciled with the principle of data minimization contained in Art. 5 para. 1 lit. c) GDPR? At first glance, it seems contradictory that there may be a way: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the latitude conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures like pseudonymization. This means that the data collected by the AI is stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
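The following minimal Python sketch illustrates, under our own assumptions (the record layout, the keyed-hash approach and the helper names are invented for the example, not taken from the study), how pseudonymization can be combined with data minimization before records are fed into an AI pipeline.

```python
# Illustrative sketch: minimize and pseudonymize a record before it enters an AI pipeline.
import hmac
import hashlib

# The key is the "additional information": it is stored separately (e.g. in a key
# management system), so the prepared data alone cannot be traced back to an individual.
PSEUDONYM_KEY = b"store-this-key-somewhere-else"

# Only the attributes the model actually needs are kept (data minimization).
FIELDS_NEEDED_FOR_TRAINING = {"age_band", "region", "purchase_category"}

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash; re-identification requires the key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def prepare(record: dict) -> dict:
    """Drop unneeded fields and swap the direct identifier for a pseudonym."""
    minimized = {k: v for k, v in record.items() if k in FIELDS_NEEDED_FOR_TRAINING}
    minimized["subject_id"] = pseudonymize(record["email"])
    return minimized

raw = {"email": "jane.doe@example.com", "age_band": "30-39", "region": "NRW",
       "purchase_category": "books", "street_address": "Example Str. 1"}
print(prepare(raw))
```

The prepared record still allows the controller, who holds the key, to link entries belonging to the same person, but it no longer contains any field that identifies that person on its own.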

The main issue with the European Union’s current legal framework for personal data protection is that certain parts have been left vague, which also causes uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.

One of the main features of this Act is that it divides applications of artificial intelligence into three main risk categories:

  1. Applications creating an unacceptable risk, which are prohibited (e.g. systems that violate fundamental rights).
  2. Applications creating a high risk, which are subject to specific regulation.
  3. Applications creating a low or minimal risk, with no further regulation.

Regarding high-risk AIs, the AIA foresees the creation of post-market monitoring obligations. If the AI in question violates any part of the AIA, it can then be forcibly withdrawn from the market by the regulator.

This approach has been welcomed in the Joint Opinion of the EDPB and the EDPS, although the two bodies stated that the draft still needs to be more closely aligned with the GDPR.

Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the course of the next years as the proposal is undergoing the legislative processes of the EU.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Currently, some amendments have been formulated and the draft is still under review by the Parliament. After the Act has passed this scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act’s implementation? It is foreseen that national supervisory authorities will be established in each EU Member State. Furthermore, the AIA aims at establishing a special European AI Board made up of representatives of both the Member States and the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.
