Category: Data Protection

KINAST is ranked among the Top 5 of Data Protection Law Firms in Germany

28. October 2022

We are very pleased about our renewed top placement in this year’s ranking of the Kanzleimonitor* study 2022-23 and would like to thank all clients who recommended us!

In the field of Data Protection Law, we achieved 5th place with numerous direct recommendations. Our firm can thus once again hold its own in a strong field of competitors alongside various large law firms (including Taylor Wessing, Osborne Clarke) in the absolute top group in Data Protection Law.

Three of our Attorneys are also mentioned by name in the current ranking of personal recommendations: Kristin Bauer, Dr. Karsten Kinast and Benjamin Schuh.

We are particularly pleased with this study result, as it is a transparent, direct evaluation from our clients and is carried out by our own professional group of lawyers.

Many thanks again to all clients who have recommended us (again)!

*The German Kanzleimonitor study (law firm monitor) (“kanzleimonitor.de – recommendation is the best reference”) provides an annual comprehensive ranking of the 100 most recommended lawyers and law firms in each legal field in Germany. This overview is intended to serve corporate lawyers in all industries as a selection criterion for mandating commercial law firms.

Another 20 million Euro fine for Clearview AI

The French data protection authority CNIL imposed a fine of 20 million Euros on Clearview AI, the latest in a line of authorities to deem the processing activities of the biometrics company unlawful under data protection law.

Clearview AI is a US company that extracts photographs and videos that are directly accessible online, including on social media, in order to feed its biometric image database, which it claims to be the biggest in the world. Access to the search engine based on this database is offered to law enforcement authorities.

The case

The decision followed several complaints from data subjects in 2020, which led to the CNIL’s investigations and a formal notice to Clearview AI in November 2021 to “cease the collection and use of data of persons on French territory in the absence of a legal basis” and “facilitate the exercise of individuals’ rights and to comply with requests for erasure.” However, the company did not react to this notice within the two-month deadline imposed by the CNIL. Therefore, the authority imposed not only the fine but also an order to Clearview AI “to stop collecting and processing data of individuals residing in France without a legal basis and to delete the data of these persons that it had already collected, within a period of two months.” In addition, it set a “penalty of 100,000 euros per day of delay beyond these two months.”

CNIL based its decision on three breaches. First, Clearview AI had processed the data without a legal basis. Given the “intrusive and massive nature of the process which makes it possible to retrieve the images present on Internet of the millions of internet users in France”, Clearview AI had no legitimate interest in the data processing. Second, the CNIL sanctioned Clearview AI’s inadequate handling of data subjects’ requests. Lastly, it penalized the company’s failure to cooperate with the CNIL.

The impact of the decision

For over two years, Clearview AI has been under the scrutiny of data protection authorities (“DPA”s) all over the world. So far, it has been fined more than 68 million Euros in total. Apart from CNIL’s fine, there have been fines of 20 million Euros by Greece’s Hellenic DPA in July 2022, over 7.5 million pounds by the UK Information Commissioner’s Office in May 2022 and 20 million Euros by the Italian Garante in March 2022.

CNIL’s decision is unlikely to be the last, considering the all-encompassing nature of Clearview AI’s collection of personal data, which – given the company’s business model – inevitably concerns EU data subjects. Whether the company will comply within the two-month period remains to be seen.

UN Report on privacy and data protection as an increasingly precious asset in the digital era

UN Special Rapporteur on the right to privacy Ana Brian Nougrères published a report in which she laid out ten guiding principles “as a key structural part of every national legal system that regulate the actions of controllers and processors in the processing of personal data”.

According to the Special Rapporteur, “privacy is a human right that enables the free development of personality and the exercise of rights in accordance with the dignity of the human being […]. But today, we live in a world where participating in public and private activity at the national and international level requires more and more personal data to be processed”. Her goal is to achieve “cooperation and regulatory harmonization at the international level”. While many States regulate data protection and privacy issues nationally, international law enshrines the right to privacy in Article 12 of the Universal Declaration of Human Rights. The Special Rapporteur indicated that national legislation already has much in common regarding the principles of privacy and data protection which can “serve as a basis for progressing towards a global consensus that will make it possible to address various challenges that arise in the processing and international transfer of data concerning individuals to ensure that their right to privacy is safeguarded in both virtual and face-to-face environments”.

The ten key principles analyzed are legality, consent, transparency, purpose, loyalty, proportionality, minimization, quality, responsibility, and security – hardly news from an EU perspective. This is no coincidence, as the Special Rapporteur used several supranational legal frameworks, including the GDPR, as a basis for her analysis. This shows once more that a solely Eurocentric view on privacy and data protection is ill-advised, as other parts of the world may not find these principles quite as self-evident. With her report, the Special Rapporteur wishes to encourage and guide States “to strike a balance between the different conflicting interests in the processing of personal data and the right to privacy in the global and digital era”.

Microsoft data leak allegedly affected over 65,000 entities worldwide

Sensitive customer data was openly accessible on the internet via an incorrectly configured Microsoft server. After security researchers from the threat intelligence firm SOCRadar informed the company about the data leak on September 24, 2022, the server was secured, Microsoft announced on October 19, 2022. 

According to Microsoft, an “unintentional misconfiguration on an endpoint that is not in use across the Microsoft ecosystem” “resulted in the potential for unauthenticated access to some business transaction data corresponding to interactions between Microsoft and prospective customers, such as the planning or potential implementation and provisioning of Microsoft services.” The business transaction data that was leaked included “names, email addresses, email content, company name, and phone numbers, and may have included attached files relating to business between a customer and Microsoft or an authorized Microsoft partner.” 

While SOCRadar claims that the breach affected data of over 65,000 entities in 111 countries and covers data from 2017 to 2022, Microsoft stated that the scope of the issue had been “greatly exaggerated”. Furthermore, Microsoft criticized SOCRadar’s release of a public search tool, suggesting that the tool does not meet basic data protection and privacy standards.

Whether those numbers were indeed exaggerated, or whether Microsoft is trying to downplay the breach, is difficult to judge from the outside.

Italian DPA launches investigation into cookie walls and paywalls

27. October 2022

On October 21st, 2022, the Italian Data Protection Authority launched an investigation into the use of cookie walls by several online newspapers. Although the GDPR allows the implementation of cookie walls and paywalls (which withhold the content of a website unless cookies have been accepted or a certain amount of money has been paid), the Italian watchdog will take a closer look at whether these have been implemented correctly and do not violate the European regulation.

Further information is yet to be released by the authorities.

TikTok faces huge fine from Britain’s ICO

12. October 2022

Lately, the Chinese social media giant has been the subject of an investigation by the British data protection watchdog, the Information Commissioner’s Office (ICO). The investigation has so far concluded that the social media network clearly breached the United Kingdom’s data protection laws, in particular the rules concerning children’s personal data. The Authority therefore issued a notice of intent, a potential precursor to a fine of up to a staggering 27 million pounds.

In particular, the Authority found that the platform may have processed personal data of children under the age of 13 without obtaining parental consent. These data allegedly also include special category data, which enjoy particular protection under Art. 9 GDPR.

Furthermore, in the ICO’s opinion, the platform also failed to respect the principle of transparency by not providing complete or transparent information on the processing and collection of these data.

The ICO’s investigation is still ongoing, as the Commissioner’s Office is still deciding whether there has been a breach of data protection law and whether to impose the fine.

The protection of teenagers and children is the top priority of the ICO according to current Information Commissioner John Edwards. Under his guidance, the ICO has several ongoing investigations targeting various tech companies who could be breaking the UK’s data protection laws.

This is not the first time TikTok has been under observation by data protection watchdogs. In July, a US-Australian cybersecurity firm found that TikTok gathers excessive amounts of information from its users and voiced concern over its findings. Based on these precedents, it is possible that local data protection authorities will increase their efforts to monitor TikTok’s compliance with local laws and, in Europe, with the GDPR.

G7 Data Protection Authorities discuss flow of data across borders

27. September 2022

From September 6th to September 9th, 2022, a meeting between representatives of the G7’s Data Protection Authorities was held in Bonn, Germany, to discuss current regulatory and technological issues concerning the concept of Data Free Flow with Trust (DFFT), a proposed guiding principle for international cooperation on data flows.

It aims at answering several questions in order to create a safe global digital environment in which trusted data flows are guaranteed. The most important question is how to overcome existing barriers to data flows. Harmonization may seem difficult to achieve between countries with completely different approaches and regulations regarding personal data protection. To address this, a bottom-up approach was adopted for the implementation of the DFFT: high-level intergovernmental discussions resulting in pragmatic rule-making are foreseen, complemented by public-private cooperation on the resolution of individual issues.

Scholars and experts seem to think that RegTech could prove very useful for the implementation of the DFFT. To tackle some of the issues that emerged from the various discussions and from research, the World Economic Forum issued a white paper identifying seven common success factors for the effective deployment of RegTech.

The concept, first proposed by Japan’s late Prime Minister Shinzo Abe in 2019, is now moving into the implementation phase, mainly concerning trade agreements covering e-commerce. A milestone on this topic will likely be the next G7 Conference, to be held in Japan in 2023. Kishida Fumio, the current Japanese Prime Minister, reaffirmed his country’s initiative in the project and pledged his commitment to the continued development of the DFFT.

EDPS takes legal action against Europol’s new regulation

On June 28th, 2022, two new provisions of the amended Europol regulation came into force. These changes are considered worrying by the European Data Protection Supervisor (EDPS), as they have a direct impact on the data processing of individuals in the European Union: based on these provisions, the new regulation allows Europol to retroactively process large volumes of data, even of individuals with no links to criminal activity.

Specifically, before these new provisions were passed, individuals could expect that if their data was gathered by Europol, it would be processed within six months to establish whether they were involved in illicit activities, and that, if not, the data relating to them would be deleted. With these modifications, Europol is allowed to store and process these data even if the individual was found not to be involved in any wrongdoing.

In an effort to stop these changes from effectively coming into force, the EDPS issued an order on January 3rd, 2022 to amend the new provisions to include a precisely defined deletion period for data relating to individuals not connected to unlawful activities. As the order was ignored by Europol, on September 16th the EDPS requested that the European Court of Justice (ECJ) annul the two provisions. The authority stated that Europol’s conduct is a clear violation of individuals’ fundamental rights.

Furthermore, it is clear that by overriding a direct order of the European data protection watchdog and by introducing such amendments, the independent controlling power of the supervisory authority is undermined. This could set a dangerous precedent in which authorities in the European Union must anticipate that the legislature may, depending on political will, counteract their supervisory activities. This would result in a clear violation of the European Charter of Fundamental Rights, since there would be a concrete risk of undermining the independence of a controlling authority by making it subject to undue political pressure or interference.

Danish watchdogs ban Google Chromebooks and Google Workspace in municipality

26. August 2022

In July 2022, after an investigation related to a data breach carried out by the Danish Data Protection Authority (Datatilsynet), Google Chromebooks and Google Workspace were banned in schools in the municipality of Helsingør. The DPA ruled that the risk assessment carried out by city officials shows that the processing of personal data by Google does not meet GDPR requirements. In particular, data transfers were targeted by the Authority: the Data Processing Agreement allows data transfers to third countries for analytical and statistical support, even though the data are primarily stored in Google’s European facilities.

This decision comes at a moment of tension between Europe and the United States of America over personal data: other well-known cases (some still ongoing) include the Irish Data Protection Authority vs. Facebook (now part of Meta Inc.) and the German Federal Cartel Office vs. Facebook. European watchdogs have found that in many cases the policies of American tech giants do not meet the requirements established by the GDPR. This can be traced back to the lack of a legal framework for privacy and personal data protection in the United States, where these companies are based.

This decision was taken in the aftermath of the Schrems II ruling by the European Court of Justice, which held that the pre-existing agreement on data transfers between Europe and the US (the so-called Privacy Shield) was not compatible with the GDPR. A new agreement is on the table, but it has been neither approved nor taken effect.

Google has become the target of various investigations by European data watchdogs, above all because of its tool Google Analytics. In January, the Austrian Data Protection Authority published an opinion stating that companies using Google Analytics inadvertently transferred customers’ personal data, such as IP addresses, to the United States in breach of the GDPR. Italy’s Garante per la Protezione dei Dati Personali published a similar opinion a few weeks later, stating that “the current methods adopted by Google do not guarantee an adequate level of protection of personal data”.

Personal data risks in the aftermath of the overturning of Roe vs. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe vs. Wade, effectively ending federal abortion rights. The decision caused worldwide outrage, and a concerning situation now presents itself: the population’s massive use of social media and the Internet could result in serious privacy violations by the authorities. For example, tech giants such as Apple, Google and Meta Inc. could share users’ data if law enforcement authorities suspect a felony is being committed. This could especially be the case in those States that chose to make abortion illegal after the Supreme Court’s ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, such personal data could be made the object of a subpoena, forcing the recipient to produce it in court. In such a scenario, tech companies would have no choice but to provide the consumer’s data. It is clear that this poses a high risk to consumers’ privacy.

In particular, location data could show whether a person visited an abortion clinic. Many women use specific apps to track periods, fertility and pregnancy. All these data could be put under surveillance and seized by law enforcement in order to investigate and prosecute abortion-related cases.

In some States this has already happened. In 2018, a woman in Mississippi was charged with second-degree murder after seeking health care for a pregnancy loss that occurred at home. Prosecutors produced her Internet browser history as evidence. Two years later, she was acquitted of the charges.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse or analyze it, and sell it to the highest bidder. These companies could also be used by law enforcement agencies to arbitrarily investigate people who might be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States. For example, there is no principle of data minimization as found in the GDPR. The Supreme Court’s ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. In addition, new laws and bills could be introduced to limit the access law enforcement agencies have to personal data.
