Category: Data Protection

ECJ against data retention without any reason or limit

6. April 2022

In its judgment of 5 April 2022, announced in a press release, the ECJ has once again ruled that the retention of private communications data without cause or limit is unlawful. This reinforces its rulings of 2014, 2016 and 2020, according to which changes are necessary at EU and national level.

In this judgment, the ECJ states that it is for the national court in Ireland to decide whether data retained in this way may be admitted as evidence in a long-standing murder case.

Questions on this issue were referred to the court in 2020 by Germany, France and Ireland. The EU Advocate General had confirmed, in a legally non-binding opinion, that the national laws are incompatible with EU fundamental rights.

However, a first exception already resulted from the 2020 judgment: in the event of a serious threat to national security, storage for a limited period and subject to judicial review was recognized as permissible.

Subsequently, a 2021 judgment stated that national law must provide clear and precise rules with minimum conditions in order to prevent abuse.

According to the ECJ, storage without cause may be permitted, subject to restrictions, in the following cases:

  • When limited to specific individuals or locations;
  • No concrete evidence of a crime is necessary; an elevated local crime rate suffices;
  • Frequently visited locations such as airports and train stations;
  • When national laws require the identity of prepaid cardholders to be stored;
  • Quick freeze, i.e. the immediate preservation and temporary storage of data where a crime is suspected.

All of these are to be used only to combat serious crime or prevent threats to national security.

In Germany, Justice Minister Marco Buschmann is in favor of a quick freeze solution as an alternative that preserves fundamental rights. However, the EU member states intend to keep working on a legally compliant option for data retention despite the ECJ’s criticism of the principle.

Italian DPA imposes a 20 Million Euro Fine on Clearview AI

29. March 2022

The Italian data protection authority “Garante” has fined Clearview AI 20 million Euros for data protection violations regarding its facial recognition technology. Clearview AI’s facial recognition system draws on over 10 billion images from the internet, and the company prides itself on having the largest biometric image database in the world. The data protection authority found Clearview AI to be in breach of numerous GDPR requirements: for example, processing was not carried out fairly and lawfully within the data protection framework, there was no lawful basis for the collection of the information, and there were no appropriate transparency and data retention policies.

Last November, the UK ICO warned of a potential 17 million pound fine against Clearview and, in this context, also ordered the company to stop processing data.

Then, in December, the French CNIL ordered Clearview to stop processing citizens’ data and gave it two months to delete all the data it had stored, but did not mention any explicit financial sanction.

In Italy, in addition to paying the 20 million Euro fine, Clearview AI must now not only delete all images of Italian citizens from its database, but also delete the biometric information needed to search for a specific face. Furthermore, the company must designate an EU representative as a point of contact for EU data subjects and the supervisory authority.

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to replace the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been long in the making, reflecting more than a year of detailed negotiations between the US and the EU led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will be the responsible body under the new two-tier redress system, which will allow EU citizens to raise complaints about access to their data by US intelligence authorities and which aims to investigate and resolve those complaints.

The US’ commitments will be formalized in an Executive Order, which will form the basis of the adequacy decision by the European Commission to put the new framework in place. While this represents a quicker route to the goal, it also means that the framework rests on an instrument that can easily be repealed by a future US administration. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

Belgian DPA declares technical standard used in cookie banners for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of lawfulness and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with those of other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF system, must now delete the personal data collected in this way and pay a fine of 250,000 euros. In addition, conditions have been imposed on the advertising industry under which the TCF may continue to be used at all.

Almost all companies, including advertising companies such as Google or Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. This decision will have a major impact on the protection of users’ personal data, as Hielke Hijmans of the APD also confirms.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on, among other things, the prices offered and the user’s data profile, a decision is made in milliseconds as to which advertisements the user will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.

If users accept cookies or do not object to the use of their data on the basis of the provider’s legitimate interest, the TCF generates a so-called TC string, which contains information about these consent decisions. This identifier forms the basis for the creation of individual profiles and for the auctions in which advertising space, and with it the attention of the desired target group, is auctioned off; it is forwarded to partners in the OpenRTB system.
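
To make the mechanism more concrete, the following minimal Python sketch shows how a consent signal of this kind could pack per-purpose decisions into a bit field and serialize them into a compact, URL-safe token that is passed along to bidding partners. The purpose names and the encoding below are hypothetical simplifications chosen for illustration; this is not the actual TC string format defined by the TCF specification.

  # Simplified illustration only: encode per-purpose consent decisions as a
  # bit field and serialize them into a URL-safe token, then decode them back.
  # This is NOT the real IAB TCF TC string format; purpose names are made up.
  import base64

  PURPOSES = ["store_access_device", "basic_ads", "personalised_ads_profile",
              "personalised_ads", "measure_ad_performance"]  # hypothetical labels

  def encode_consent(decisions: dict) -> str:
      """Pack per-purpose consent decisions into bits and base64url-encode them."""
      bits = 0
      for i, purpose in enumerate(PURPOSES):
          if decisions.get(purpose, False):
              bits |= 1 << i
      raw = bits.to_bytes((len(PURPOSES) + 7) // 8, "big")
      return base64.urlsafe_b64encode(raw).decode().rstrip("=")

  def decode_consent(token: str) -> dict:
      """Reverse of encode_consent: recover the per-purpose decisions."""
      padded = token + "=" * (-len(token) % 4)
      bits = int.from_bytes(base64.urlsafe_b64decode(padded), "big")
      return {p: bool((bits >> i) & 1) for i, p in enumerate(PURPOSES)}

  # Example: the user consents to basic ads but not to personalised profiles.
  token = encode_consent({"store_access_device": True, "basic_ads": True})
  print(token, decode_consent(token))

On the receiving side, decoding such a token is what would let each RTB partner check which processing purposes the user has accepted before bidding on an ad slot.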

According to the authority, TC strings already constitute personal data because, in combination with the IP address and the cookies set by the TCF, they enable users to be identified. In addition, IAB Europe is considered jointly responsible for data processing via the framework, even though it has positioned itself only as the provider of a standard and not as a data processor.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the ubiquitous cookie banners rather than asking for consent; according to the authority, this practice would have to be prohibited for the framework to be lawful. The principles of privacy by design and by default are also violated, since consent is obtained through deceptive design tricks, the data flows are not manageable, and revocation of consent is hardly possible.

Google to launch Google Analytics 4 with aim to address EU Data Protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement of this new launch comes following 101 complaints filed by the non-governmental organization None of Your Business (NOYB) with data protection authorities (DPAs) in 30 EEA countries. Assessing data transfers from the EU to the US for the use of Google Analytics after the CJEU’s Schrems II decision, the French and Austrian DPAs ruled that the transfer of EU personal data to the US through the use of Google Analytics cookies is unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

However, the most important change that the launch of “Google Analytics 4” will bring to the processing of personal data is that it will no longer store users’ IP addresses. This will limit the data processing and resulting transfers that Google Analytics has been under scrutiny for in the EU; however, it is unclear at this point whether the EU DPAs will change their opinion on the use of Google Analytics with this new version.

According to the press release, the current Google Analytics will be suspended starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

Land register number allows access to personal data, Polish authorities confirm

23. March 2022

In a legal dispute that has been ongoing since 2020, the Polish Commissioner for Human Rights recently stated that the disclosure of land register numbers can lead to obtaining a large amount of personal data contained in the registers. In his opinion, general access to such detailed data harms and significantly restricts the informational autonomy of individuals.

The Commissioner’s view confirms the position of the Polish Data Protection Authority, which, in an administrative decision dated August 24th, 2020, ordered the Polish General Surveyor to cease making land register numbers available on the website “GEOPORTAL2”. The authority also imposed a fine of PLN 100,000 for violating the principle of lawfulness under Art. 5 para. 1 lit. a and Art. 6 para. 1 GDPR, as there was no legal basis for the processing.

The decision was justified by the fact that land register numbers allow the indirect identification of property owners and are therefore considered personal data. Moreover, their publication enables access to further data such as the national ID number or property address. This may lead to a variety of dangers associated with the use of such data, in particular identity theft or impersonation for criminal purposes.

This opinion was also held by the Polish Voivodeship Administrative Court in Warsaw, which on May 5th, 2021, dismissed the Surveyor’s complaint against the decision of the Polish Data Protection Authority.

Irish DPC fines Meta 17 Million Euros over 2018 data breaches

16. March 2022

On March 15th, 2022, the Irish Data Protection Commission (DPC) imposed a fine of 17 million euros on Meta Platforms over a series of twelve data breaches that occurred between June and December 2018.

The DPC’s inquiry, which led to this decision, examined the extent to which Meta Platforms complied with the requirements of Art. 5(1)(f), Art. 5(2), Art. 24(1) and Art. 32(1) GDPR in relation to the processing of personal data relevant to the twelve breach notifications.

As a result of this inquiry, the DPC found that Meta Platforms infringed Art. 5(2) and Art. 24(1) GDPR. In particular, the DPC assessed that Meta Platforms had failed to have in place appropriate technical and organisational measures which would enable it to readily demonstrate the security measures it implemented in practice to protect the data of its European users in the case of those twelve data breaches.

The processing under examination constituted “cross-border” processing, and as such the DPC’s decision was subject to the co-decision-making process outlined in Art. 60 GDPR. This resulted in all of the other European supervisory authorities being engaged in the decision as co-decision-makers. While objections to the DPC’s draft decision were raised by two of the European supervisory authorities, consensus was achieved through further engagement between the DPC and the supervisory authorities concerned.

“Accordingly, the DPC’s decision represents the collective views of both the DPC and its counterpart supervisory authorities throughout the EU,” the DPC stated in their press release.

A Meta spokesperson has commented on the decision, stating, “This fine is about record keeping practices from 2018 that we have since updated, not a failure to protect people’s information. We take our obligations under the GDPR seriously and will carefully consider this decision as our processes continue to evolve.”

ICO releases Guidance on Video Surveillance

7. March 2022

At the end of February 2022, the UK Information Commissioner’s Office (ICO) published guidance for organizations that capture CCTV footage, providing advice for when they operate video surveillance systems that view or record individuals.

The recommendations focus on best practices for data activities related to “emerging capabilities that can assist human decision making, such as the use of Facial Recognition Technology and machine learning algorithms.” As per the Guidance, surveillance systems specifically include traditional CCTV, Automatic Number Plate Recognition, Body Worn Video, Drones, Facial Recognition Technology, dashcams and smart doorbell cameras.

In its Guidance, the ICO offers checklists that controllers can use to monitor their use of video surveillance and keep track of their compliance with the applicable law. It further touches on the principles of data protection and how they specifically apply to video surveillance. In addition, it helps companies with the documentation of a Data Protection Impact Assessment.

The Guidance gives in-depth advice on video surveillance in the workplace as well as on whether video feeds should also record audio.

Overall, the Guidance aims to sensitize controllers to the various issues they face when using video surveillance, and gives them in-depth guidance on how to comply with data protection regulations in the UK.

French CNIL highlights its data protection enforcement priorities for 2022

25. February 2022

The French data protection supervisory authority Commission Nationale de l’Informatique et des Libertés (hereinafter “CNIL”) carries out checks following complaints received and reports of data protection violations, but also on its own initiative. CNIL has published three topics for 2022 on which it will focus in particular: commercial prospecting, surveillance tools in the context of teleworking, and cloud services.

With regard to commercial prospecting, CNIL draws particular attention to unsolicited advertising calls, which are the subject of recurring complaints to CNIL in France.

In February 2022, CNIL published a guideline for “commercial management”, which is particularly relevant for commercial canvassing.

Based on this guideline, CNIL will monitor GDPR compliance, focusing in particular on professionals who resell data.

Regarding the monitoring tools for teleworking, identified as CNIL’s second priority, CNIL aims to help balance the interest in protecting the privacy of workers who are able to work from home due to COVID-19 against the legitimate monitoring of their activities, by providing information on the rules to be followed for this purpose. CNIL believes that employers need to be monitored more strictly in this regard.

Last but not least, CNIL draws particular attention to potential data protection breaches in the use of cloud computing technologies. Since massive data transfers outside the European Union are a particular concern here, activities in this area must be monitored more closely. For this purpose, CNIL reserves the right to focus in particular on the frameworks governing the contractual relationships between data controllers and cloud technology providers.

Norwegian DPA aims to strengthen cookie regulations

22. February 2022

The Norwegian Data Protection Authority (DPA), Datatilsynet, has reached out to the Ministry of Local Government and District Affairs in a letter emphasizing the need to tighten cookie regulations in Norway.

The letter comes amid calls from consultation committees to delay the proposed tightened cookie regulations, which have been under open consultation in Norway since the end of last year.

In the letter, the Datatilsynet points out the importance of strengthened cookie laws, specifically regarding the manner of obtaining consent and the design of the consent banners, which “are designed in ways that influence users to consent by making it more cumbersome and time consuming to not consent”.

The letter also references the French data protection authority’s decisions of 31 December 2021 to fine Google €150 million and Facebook €60 million for inadequately facilitating the refusal of cookies, and points out that the practices for which Google and Facebook were fined in France would hardly have been considered problematic under the current Norwegian cookie regulations, where illusory consent through pre-set browser settings is allowed.

Senior Legal Advisor Anders Obrestad stated that “these cases illustrate how unsustainable the current regulation of cookies and similar tracking technologies in Norway is for the privacy of internet users”.

The Norwegian DPA hopes to be able to stop any delay in the strengthening of cookie regulations, as well as emphasize the importance of valid consent of internet users.
