Category: European Data Protection

Google launches “Reject All” button on cookie banners

22. April 2022

After being hit with a €150 million fine by France’s data protection agency CNIL earlier in the year for making the process of rejecting cookies unnecessarily confusing and convoluted for users, Google has added a new “Reject All” button to the cookie consent banners that have become ubiquitous on websites in Europe. Users visiting Search and YouTube in Europe while signed out or in incognito mode will soon see an updated cookie dialog with “Reject All” and “Accept All” buttons.

Previously, users only had two options: “I accept” and “personalize.” While this allowed users to accept all cookies with a single click, they had to navigate through various menus and options if they wanted to reject all cookies. “This update, which began rolling out earlier this month on YouTube, will provide you with equal ‘Reject All’ and ‘Accept All’ buttons on the first screen in your preferred language,” wrote Google product manager Sammit Adhya in a blog post.

According to Google, the rollout of the new cookie banner has started in France, and the change will soon be extended to all Google users in Europe, the U.K., and Switzerland.

Google’s plan to add a “Reject All” button to its cookie banners, after its existing practice was found to violate EU law, was also welcomed by Hamburg’s Commissioner for Data Protection and Freedom of Information, Thomas Fuchs, during the presentation of his 2021 activity report.

But the introduction of the “Reject All” button is likely to be only an interim solution, because the US giant had already presented far-reaching plans at the end of January to phase out third-party cookies altogether by 2023.

Instead of cookies, the internet giant wants to rely on the in-house tracking technology of its Google Privacy Sandbox project.

ECJ rules against data retention without cause or limit

6. April 2022

In its judgment of April 5th, 2022, announced in a press release, the ECJ once again ruled that the retention of private communications data without cause or limit is unlawful. This reinforces its rulings of 2014, 2016 and 2020, according to which changes are necessary at both EU and national level.

In this judgment, the ECJ also states that it is for the national court in Ireland to decide whether data retained in this way may be admitted as evidence in a long-running murder case.

Questions regarding this issue were referred in 2020 by Germany, France and Ireland. The EU Advocate General had already confirmed, in a legally non-binding opinion, that the national laws are incompatible with EU fundamental rights.

However, a first exception to data retention resulted from the 2020 judgment, according to which, in the event of a serious threat to national security, storage for a limited period and subject to judicial review was recognized as permissible.

Subsequently, a judgment in 2021 stated that national law must provide clear and precise rules with minimum conditions for the purpose of preventing abuse.

According to the ECJ, retention without cause, subject to restrictions, is permissible in the following cases:

  • When retention is limited to specific individuals or locations;
  • Where no concrete evidence of a crime is required and a high local crime rate is sufficient;
  • At frequently visited locations such as airports and train stations;
  • Where national laws require the identity of prepaid cardholders to be stored;
  • “Quick freeze”, i.e. the immediate preservation and temporary storage of data where a crime is suspected.

All of these are to be used only to combat serious crime or prevent threats to national security.

In Germany, Justice Minister Marco Buschmann is in favor of a quick freeze solution as an alternative that preserves fundamental rights. However, the EU states are to work on a legally compliant option for data retention despite the ECJ’s criticism of this principle.

Italian DPA imposes a 20 million Euro fine on Clearview AI

29. March 2022

The Italian data protection authority “Garante” has fined Clearview AI 20 million euros for data protection violations relating to its facial recognition technology. Clearview AI’s facial recognition system draws on over 10 billion images from the internet, and the company prides itself on having the largest biometric image database in the world. The authority found Clearview AI in breach of numerous GDPR requirements: among other things, the processing was neither fair nor lawful, there was no legal basis for the collection of the information, and the company lacked appropriate transparency and data retention policies.

Last November, the UK ICO warned of a potential 17 million pound fine against Clearview and, in the same context, also ordered the company to stop processing data.

Then, in December, the French CNIL ordered Clearview to stop processing citizens’ data and gave it two months to delete all the data it had stored, but did not mention any explicit financial sanction.

In Italy, in addition to paying the 20 million Euro fine, Clearview AI must now delete all images of Italian citizens from its database, along with the biometric information needed to search for a specific face. Furthermore, the company must designate an EU representative as a point of contact for EU data subjects and the supervisory authority.

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been long in the making and reflects more than a year of detailed negotiations between the US and the EU, led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will be based on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will be the responsible body within the new two-tier redress system, which will allow EU citizens to lodge complaints about access to their data by US intelligence authorities and is intended to investigate and resolve those complaints.

The US commitments will be laid down in an Executive Order, which will form the basis of the adequacy decision by the European Commission to put the new framework in place. While this is the quicker route to the goal, it also means that the Executive Order can easily be repealed by a future US administration. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

Belgian DPA declares technical standard used in cookie banners for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of legality and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with those of other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF system, must now delete the personal data collected in this way and pay a fine of 250,000 euros. In addition, the authority has set out conditions under which the advertising industry may continue to use the TCF at all.

Almost all companies, including advertising companies such as Google or Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. The decision will therefore have a major impact on the protection of users’ personal data, as Hielke Hijmans of the APD also confirms.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on the prices offered and the user’s data profile, among other things, a decision is made in milliseconds as to which advertisements they will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.

If users accept cookies or do not object to the claim that the use of their data is in the provider’s legitimate interest, the TCF generates a so-called TC string containing information about these consent decisions. This identifier is forwarded to partners in the OpenRTB system and forms the basis both for the creation of individual profiles and for the auctions in which advertising space, and with it the attention of the desired target group, is sold.

According to the authority, TC strings already constitute personal data because, in combination with the user’s IP address and the cookies set by the TCF, they enable users to be identified. In addition, the authority regards IAB Europe as jointly responsible for data processing via the framework, although IAB Europe has positioned itself not as a data processor but only as the provider of a standard.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the ubiquitous cookie banners instead of asking for consent; according to the authority, this practice would have to be prohibited for the processing to be lawful. The principles of privacy by design and by default are also violated, since consent is obtained through design tricks, the data flows are unmanageable, and revoking consent is hardly possible.
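
To make the flow described above more concrete, the following minimal Python sketch shows, in a purely illustrative way, how a consent signal and a real-time bidding auction fit together. It is not the actual IAB specification: the real TC string is a compact base64url-encoded bit field and OpenRTB is a full JSON-over-HTTP protocol, so all class, field and function names below are hypothetical assumptions.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Simplified, hypothetical stand-in for the information a TC string carries:
    # which purposes and vendors the user accepted, and for which vendors a
    # "legitimate interest" was asserted instead of consent.
    purposes_consented: set = field(default_factory=set)
    vendors_consented: set = field(default_factory=set)
    legitimate_interest_vendors: set = field(default_factory=set)

@dataclass
class Bid:
    vendor_id: int
    price_cpm: float  # price offered per thousand impressions
    creative_id: str

def run_auction(consent, bids):
    # Only vendors covered by the consent signal may compete;
    # the highest bid wins and decides which ad the user will see.
    eligible = [
        b for b in bids
        if b.vendor_id in consent.vendors_consented
        or b.vendor_id in consent.legitimate_interest_vendors
    ]
    return max(eligible, key=lambda b: b.price_cpm, default=None)

consent = ConsentRecord(purposes_consented={1, 2}, vendors_consented={10, 42})
bids = [Bid(10, 1.20, "ad-a"), Bid(42, 2.75, "ad-b"), Bid(99, 9.99, "ad-c")]
print(run_auction(consent, bids))  # vendor 99 has no consent, so vendor 42 wins

In the real OpenRTB ecosystem this consent signal travels with every bid request, which is precisely why the authority treats the TC string itself as personal data once it can be combined with an IP address.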

Google to launch Google Analytics 4 with the aim of addressing EU data protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement of this new launch comes after the non-governmental organization None of Your Business (NOYB) filed 101 complaints with the data protection authorities (DPAs) of 30 EEA countries. Assessing the data transfers from the EU to the US that result from the use of Google Analytics in light of the CJEU’s Schrems II decision, the French and Austrian DPAs ruled that the transfer of EU personal data to the U.S. through the use of the Google Analytics cookies is unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

However, the most important change that “Google Analytics 4” brings to the processing of personal data is that it will no longer store users’ IP addresses. This limits the data processing and the resulting transfers for which Google Analytics has come under scrutiny in the EU; however, it is unclear at this point whether the EU DPAs will change their assessment of the use of Google Analytics with this new version.

According to the press release, the current Google Analytics will be suspended starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

CNIL judges use of Google Analytics illegal

14. February 2022

On February 10th, 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), declared that the use of Google Analytics on European websites does not meet the requirements of the General Data Protection Regulation (GDPR) and ordered the website operator concerned to comply with the GDPR within one month.

The CNIL reached this decision in response to several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. In all, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA, covering all 27 EU Member States and the three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently not sufficiently regulated. In the absence of an adequacy decision concerning transfers to the USA, the transfer of data can only take place if appropriate guarantees are provided for this data flow. However, while Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, the CNIL deemed that those measures are not sufficient to exclude the accessibility of the personal data for US intelligence services. This would result in “a risk for French website users who use this service and whose data is exported”.

The CNIL stated therefore that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR. The CNIL therefore ordered the website manager to bring this processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.”

The CNIL has also given advice regarding website audience measurement and analysis services. For these purposes, the CNIL recommends that such tools be used only to produce anonymous statistical data. Provided the data controller ensures that no unlawful transfers take place, this allows for an exemption from the consent requirement, as the aggregated data are not considered “personal” data and therefore do not fall within the scope of the GDPR.
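
As a rough illustration of what “anonymous statistical data” in the CNIL’s sense could look like in practice, the short Python sketch below aggregates raw page hits into per-page counts and discards anything that could identify a visitor. The data layout and names are assumptions made for this example, not a description of any particular analytics product.

from collections import Counter

def aggregate_page_views(hits):
    # Keep only the page path of each hit; IP addresses, cookie IDs and
    # other identifiers are never stored or exported.
    counts = Counter()
    for hit in hits:
        counts[hit["page"]] += 1
    return counts

raw_hits = [
    {"page": "/home", "ip": "203.0.113.7", "cookie_id": "abc"},
    {"page": "/home", "ip": "203.0.113.8", "cookie_id": "def"},
    {"page": "/pricing", "ip": "203.0.113.7", "cookie_id": "abc"},
]
print(aggregate_page_views(raw_hits))  # Counter({'/home': 2, '/pricing': 1})

Only the aggregated counts would be kept; because they can no longer be linked to an individual visitor, they would not be considered personal data.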

EDPS sanctions European Parliament over unlawful data transfers to the US

25. January 2022

The European Data Protection Supervisor (EDPS) ruled that the European Parliament (EP) violated a judgment of the European Court of Justice (ECJ) by transferring data to the US via US-origin tech tools used on its website for COVID-19 tests. The ruling is based on a complaint involving misleading cookie banners, unclear data privacy statements and unlawful data transfers from the EU to the US.

The ECJ has made clear that transfers of personal data from the EU to the US are subject to strict conditions: websites may only transfer data to the US if an adequate level of protection is guaranteed. In this case, no such level of protection was in place.

The cookie banners were misleading: not all cookies were listed, and differences between the language versions became apparent. As a result, website users could not give valid consent.

Furthermore, the data privacy notices were neither clear nor transparent, as they referred to an incorrect legal basis for the processing. They thereby also violated the transparency and information requirements.

The EP attempted to remedy the technical deficiencies even while the proceedings were ongoing.

The EDPS issued a reprimand because, in contrast to national data protection authorities, it may only impose fines under certain conditions, which were not met in this case. As a result, it issued a cease-and-desist order giving the EP one month to bring its processing into compliance.

Dutch Minister of Finance fined 2.75 million Euro for discriminatory and unlawful data processing

4. January 2022

On December 8th, 2021, the Autoriteit Persoonsgegevens (the Dutch Data Protection Authority (DPA)) announced that it had fined the Belastingdienst (the Dutch Tax Administration) €2.75 million. The fine was imposed because, as part of the so-called Toeslagenaffaire (Childcare Benefits Affair), the Belastingdienst had for many years processed data on the (dual) nationality of childcare benefit applicants in a discriminatory and therefore unlawful manner, in serious breach of the principles of the General Data Protection Regulation (GDPR).

In the 2010s, the Belastingdienst wrongly reclaimed childcare benefits from tens of thousands of parents. Even minor formal errors in filling out the forms led to enormous repayment claims, and a supposedly false citizenship could lead to years of stigmatizing fraud investigations. As a result, many families who relied on government assistance were driven into bankruptcy.

The Belastingdienst should have deleted the data on the dual nationality of Dutch nationals in January 2014, as from that date dual nationality no longer played any legal role in the assessment of applications for childcare benefits. Nevertheless, the Belastingdienst retained and used these data: in May 2018, about 1.4 million people with dual nationality were still registered in its systems. What initially appeared to be a simple administrative failure has evolved over the years into a major scandal. The final report of the investigative commission, presented in December, concludes that the tax authorities systematically wronged innocent citizens. The Belastingdienst also used applicants’ nationality as an indicator in a system that automatically classified certain applications as risky. Again, the data were not necessary for this purpose.

Under the GDPR, it is unlawful to process personal data on nationality in a discriminatory manner, as data processing must not violate fundamental rights, including the right to equality and non-discrimination. In addition, personal data may only be processed and stored for a specific, predetermined purpose. Processing without a purpose is inadmissible, and here there was no purpose, as nationality is legally irrelevant to the assessment of applications for childcare benefits.

In the statement, DPA chair Aleid Wolfsen is quoted:

The government has exclusive responsibility for lots of things. Members of the public don’t have a choice; they are forced to allow the government to process their personal data.
That’s why it’s crucial that everyone can have absolute confidence that this processing is done properly. That the government doesn’t keep and process unnecessary data about individuals. And that there is never any element of discrimination involved in an individual’s contact with the government.
That went horribly wrong at the Benefits Office, with all the associated consequences. Obviously this fine cannot undo any of the harm done. But it is an important step within a broader recovery process.

In the wake of the DPA investigation, the Belastingdienst began to clean up its internal systems. In the summer of 2020, the dual nationalities of Dutch nationals were completely deleted from the systems. According to the DPA, since October 2018, the Belastingdienst no longer uses the nationality of applicants to assess risk. And since February 2019, it no longer uses the data to fight organized fraud. The fine was imposed on the Minister of Finance because he is responsible for the processing of personal data within the Belastingdienst.

European Commission adopts South Korea Adequacy Decision

30. December 2021

On December 17th, 2021, the European Commission (Commission) announced in a statement it had adopted an adequacy decision for the transfer of personal data from the European Union (EU) to the Republic of Korea (South Korea) under the General Data Protection Regulation (GDPR).

An adequacy decision is one of the instruments available under the GDPR for transferring personal data from the EU to third countries that ensure a level of protection for personal data comparable to that of the EU. It is a Commission decision under which personal data can flow freely and securely from the EU to the third country in question without any further conditions or authorizations being required. In other words, transfers of data to the third country in question can be handled in the same way as transfers of data within the EU.

This adequacy decision allows for the free flow of personal data between the EU and South Korea without the need for any further authorization or transfer instrument, and it also applies to the transfer of personal data between public sector bodies. It complements the Free Trade Agreement (FTA) between the EU and South Korea, which entered into force in July 2011. The trade agreement has led to a significant increase in bilateral trade in goods and services and, inevitably, in the exchange of personal data.

Unlike the adequacy decision regarding the United Kingdom, this adequacy decision is not time-limited.

The Commission’s statement reads:

The adequacy decision will complement the EU – Republic of Korea Free Trade Agreement with respect to personal data flows. As such, it shows that, in the digital era, promoting high privacy and personal data protection standards and facilitating international trade can go hand in hand.

In South Korea, the processing of personal data is governed by the Personal Information Protection Act (PIPA), which provides similar principles, safeguards, individual rights and obligations as the ones under EU law.

An important step in the adequacy talks was the reform of PIPA, which took effect in August 2020 and strengthened the investigative and enforcement powers of the Personal Information Protection Commission (PIPC), the independent data protection authority of South Korea. As part of the adequacy talks, both sides also agreed on several additional safeguards that will improve the protection of personal data processed in South Korea, such as transparency and onward transfers.

These safeguards provide stronger protections: for example, South Korean data importers will be required to inform Europeans about the processing of their data, and onward transfers to third countries must ensure that the data continue to enjoy the same level of protection. These obligations are binding and can be enforced by the PIPC and South Korean courts.

The Commission has also published a Q&A on the adequacy decision.
