Tag: GDPR

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report in which it presented a case where the Authority imposed the highest fine to date of ca. €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recorded audio of customer calls using artificial intelligence-based speech signal processing software, which evaluated each call against a list of keywords and assessed the emotional state of the caller. The software then ranked the calls, and this ranking served as a recommendation as to which customers should be called back as a priority.

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank, this procedure was aimed at quality control and, in particular, at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities only in general terms and that no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and assess them, the impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that legitimate interest cannot serve as a “last resort” legal basis when all other legal bases are inapplicable, and that data controllers therefore cannot rely on it at any time and for any reason. Consequently, the Authority not only imposed a record fine but also required the bank to stop analyzing emotions in the context of speech analysis.

 

Google launches “Reject All” button on cookie banners

After being hit with a €150 million fine by France’s data protection agency CNIL earlier in the year for making the process of rejecting cookies unnecessarily confusing and convoluted for users, Google has added a new “Reject All” button to the cookie consent banners that have become ubiquitous on websites in Europe. Users visiting Search and YouTube in Europe while signed out or in incognito mode will soon see an updated cookie dialogue with reject all and accept all buttons.

Previously, users only had two options: “I accept” and “personalize.” While this allowed users to accept all cookies with a single click, they had to navigate through various menus and options if they wanted to reject all cookies. “This update, which began rolling out earlier this month on YouTube, will provide you with equal ‘Reject All’ and ‘Accept All’ buttons on the first screen in your preferred language,” wrote Google product manager Sammit Adhya in a blog post.

According to Google they have kicked off the rollout of the new cookie banner in France and will be extending the change to all Google users in Europe, the U.K., and Switzerland soon.

Google’s plan to include a “Reject All” button on cookie banners after its existing policy violated EU law was also welcomed by Hamburg’s Commissioner for Data Protection and Freedom of Information Thomas Fuchs during a presentation of his 2021 activity report.

But the introduction of the “Reject All” button is likely to be only an interim solution, because at the end of January the US giant already presented far-reaching plans to do away with third-party cookies altogether by 2023.

Instead of cookies, the internet giant wants to rely on in-house tracking technology developed as part of the Google Privacy Sandbox project.

Dutch DPA issues highest fine for GDPR violations

14. April 2022

On April 7th, 2022, the Dutch Data Protection Authority, Autoriteit Persoonsgegevens, imposed its highest-ever fine for data protection violations, amounting to €3.7 million. The fine is directed against the Minister of Finance, who was the data controller for the Tax and Customs Administration’s processing operations, and was imposed for years of unlawful processing of personal data in the Fraud Notification Facility application, a blacklist in which reports and suspected fraud cases were registered.

The investigation revealed several violations of GDPR principles and other requirements. Firstly, there was no legal basis for the processing of the personal data included in the list, making it unlawful under Art. 5 (1) (a) and Art. 6 (1) GDPR. Secondly, the pre-formulated purposes for collecting the personal data were not clearly defined and thus did not comply with the principle of purpose limitation stipulated in Art. 5 (1) (b) GDPR. Moreover, the personal data were often incorrect and not kept up to date, which constituted a violation of the principle of accuracy under Art. 5 (1) (d) GDPR. Since the personal data were also kept longer than the applicable retention period allowed, they were not processed in accordance with the principle of storage limitation as laid down in Art. 5 (1) (e) GDPR. Furthermore, the security of the processing was not ensured by appropriate technical and organizational measures as required by Art. 32 (1) GDPR. In addition, the internal Data Protection Officer was not involved properly and in a timely manner in the conduct of the Data Protection Impact Assessment pursuant to Art. 38 (1) and Art. 35 (2) GDPR.

The amount of the fine results from the severity, consequences and duration of the violations. With the Fraud Notification Facility, the rights of 270,000 people were violated over more than six years. They were often falsely registered as (possible) fraudsters and suffered serious consequences as a result: many were unable to obtain a payment agreement or qualify for debt rescheduling and were therefore left in financial insecurity. The Tax and Customs Administration also used discriminatory practices: employees were instructed to assess the risk of fraud based on, among other factors, people’s nationality and appearance.

The DPA also considered previous serious infringements in determining the amount of the fine. The Minister of Finance was penalized in 2018 for inadequate security of personal data, in 2020 for illegal use of the citizen service number in the VAT identification number of self-employed persons, and in 2021 for the discriminatory and illegal action in the childcare benefits scandal. Following the latter affair, the Fraud Notification Facility was shut down in February 2020.

The Minister of Finance can appeal the decision within six weeks.

Belgian DPA declares technical standard used in cookie banners for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of lawfulness and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with those of other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF system, must now delete the personal data collected in this way and pay a fine of 250,000 euros. In addition, the authority set out conditions under which the advertising industry may continue to use the TCF at all.

Almost all companies, including advertising companies such as Google or Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. The decision will therefore have a major impact on the protection of users’ personal data, as Hielke Hijmans of the APD also confirmed.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on, among other things, the prices offered and the user’s data profile, a decision is made in milliseconds as to which advertisements the user will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.

If users accept cookies, or do not object to the use of their data on the basis of the provider’s legitimate interest, the TCF generates a so-called TC string, which contains information about these consent decisions. This identifier is forwarded to partners in the OpenRTB system and forms the basis for the creation of individual profiles and for the auctions in which advertising space, and with it the attention of the desired target group, is auctioned off.
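
To make this mechanism more concrete, the following is a deliberately simplified sketch in Python of how a consent record (a stand-in for the TC string) might gate which vendors may bid on an ad slot, with the highest eligible bid winning. The data structures, field names and vendor names are illustrative assumptions only; they do not reflect the actual TC string format, the TCF policies or the OpenRTB protocol.

```python
# Minimal, hypothetical sketch of the flow described above.
# A consent record (a simplified stand-in for a TC string) accompanies a bid
# request; only vendors covered by consent or a claimed legitimate interest
# may bid, and the highest eligible bid wins the ad slot.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class ConsentRecord:
    user_id: str                   # pseudonymous identifier (illustrative)
    consented_vendors: set[str]    # vendors the user actively accepted
    legitimate_interest: set[str]  # vendors relying on "legitimate interest"


@dataclass
class Bid:
    vendor: str
    price_cpm: float  # offered price per thousand impressions


def run_auction(consent: ConsentRecord, bids: list[Bid]) -> Bid | None:
    """Return the winning bid among vendors allowed to process the user's data."""
    allowed = consent.consented_vendors | consent.legitimate_interest
    eligible = [bid for bid in bids if bid.vendor in allowed]
    return max(eligible, key=lambda bid: bid.price_cpm, default=None)


consent = ConsentRecord("abc123", {"vendor_a"}, {"vendor_b"})
bids = [Bid("vendor_a", 2.10), Bid("vendor_b", 2.45), Bid("vendor_c", 3.00)]
print(run_auction(consent, bids))  # vendor_b wins; vendor_c is excluded
```

In the real system, the TC string is a compact encoded structure forwarded to a large number of OpenRTB partners, which is precisely why the APD considered it personal data and the resulting data flows unmanageable.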

According to the authority, the TC strings already constitute personal data because, in combination with the IP address and the cookies set by the TCF, they enable users to be identified. In addition, IAB Europe is considered jointly legally responsible for any data processing via the framework, although IAB Europe has not positioned itself as a data processor but only as the provider of a standard.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the omnipresent cookie banners instead of asking for consent, a practice which, according to the authority, would have to be prohibited for the processing to be lawful. The principles of privacy by design and by default are also violated, since consent is obtained through design tricks, the data flows are not manageable, and revocation of consent is hardly possible.

Google to launch Google Analytics 4 with the aim of addressing EU data protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement of this new launch follows the 101 complaints that the non-governmental organization None of Your Business (NOYB) filed with the data protection authorities (DPAs) of 30 EEA countries. Assessing the transfer of data from the EU to the US through the use of Google Analytics in light of the CJEU’s Schrems II decision, the French and Austrian DPAs ruled that the transfer of EU personal data to the U.S. via Google Analytics cookies is unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

However, the most important change that the launch of “Google Analytics 4” brings to the processing of personal data is that it will no longer store users’ IP addresses. This will limit the data processing and the resulting transfers for which Google Analytics has come under scrutiny in the EU; however, it is unclear at this point whether the EU DPAs will change their opinion on the use of Google Analytics with this new version.

According to the press release, the current Google Analytics will be suspended starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

Irish DPC fines Meta 17 Million Euros over 2018 data breaches

16. March 2022

On March 15th, 2022, the Irish Data Protection Commission (DPC) imposed a fine of 17 million euros on Meta Platforms over a series of twelve data breaches that occurred between June and December 2018.

The DPC’s inquiry that led to this decision examined the extent to which Meta Platforms complied with the requirements of Art. 5(1)(f), Art. 5(2), Art. 24(1) and Art. 32(1) GDPR in relation to the processing of personal data relevant to the twelve breach notifications.

As a result of this inquiry, the DPC found that Meta Platforms infringed Art. 5(2) and Art. 24(1) GDPR. In particular, the DPC assessed that Meta Platforms had failed to put in place appropriate technical and organisational measures that would enable it to readily demonstrate the security measures it implemented in practice to protect the data of its European users in the case of those twelve data breaches.

The processing under examination constituted “cross-border” processing, and as such the DPC’s decision was subject to the co-decision-making process outlined in Art. 60 GDPR, meaning that all of the other European supervisory authorities were engaged in the decision as co-decision-makers. While two of the European supervisory authorities raised objections to the DPC’s draft decision, consensus was achieved through further engagement between the DPC and the supervisory authorities concerned.

“Accordingly, the DPC’s decision represents the collective views of both the DPC and its counterpart supervisory authorities throughout the EU,” the DPC stated in their press release.

A Meta spokesperson has commented on the decision, stating, “This fine is about record keeping practices from 2018 that we have since updated, not a failure to protect people’s information. We take our obligations under the GDPR seriously and will carefully consider this decision as our processes continue to evolve.”

Norwegian DPA aims to strengthen cookie regulations

22. February 2022

The Norwegian Data Protection Authority (DPA), Datatilsynet, has reached out to the Ministry of Local Government and District Affairs in a letter emphasizing the need to tighten cookie regulations in Norway.

The letter comes amid calls from consultation bodies to delay the proposed tightening of the cookie regulations, which has been under open consultation in Norway since the end of last year.

In the letter, the Datatilsynet points out the importance of strengthened cookie laws, specifically regarding the manner of obtaining consent and the design of the consent banners, which “are designed in ways that influence users to consent by making it more cumbersome and time consuming to not consent”.

The letter also references the French data protection authority’s decisions of 31 December 2021 to fine Google €150 million and Facebook €60 million for inadequately facilitating the refusal of cookies, and points out that the practices for which Google and Facebook were fined in France would hardly have been considered problematic under the current Norwegian cookie regulations, which allow illusory consent through pre-set browser settings.

Senior Legal Advisor Anders Obrestad stated that “these cases illustrate how unsustainable the current regulation of cookies and similar tracking technologies in Norway is for the privacy of internet users”.

The Norwegian DPA hopes to be able to stop any delay in the strengthening of cookie regulations, as well as emphasize the importance of valid consent of internet users.

Apps are tracking personal data despite contrary information

15. February 2022

Tracking in apps enables app providers to offer users personalized advertising. On the one hand, this generates higher revenues for app providers; on the other hand, it leads to data processing practices that are not compliant with the GDPR.

For a year now, data privacy labels have been mandatory; they are designed to show which personal data the app providers access (article in German) and provide to third parties. Although these labels on iPhones state that no data access takes place, 80% of the analyzed applications carrying such labels did access data by tracking personal information. This is the conclusion of an analysis by an IT specialist at the University of Oxford.

For example, the “RT News” app, which supposedly does not collect data, actually provides various sets of data to tracking services such as Facebook, Google, ComScore and Taboola. Such data transfer activities, which may actually contain sensitive information about the content viewed, would have to be shown in the apps’ privacy labels.

In particular, data from apps that access GPS location information is sold by data companies. This constitutes an abuse of data protection, because personal data is handled in a manner that is not compliant with data protection law and is provided illegally to third parties.

In an analysis published in the journal Internet Policy Review, tests of two million Android apps showed that nearly 90 percent of the apps in Google’s Play Store share data with third parties directly after the app is launched. Google, however, points out that labels containing false information about not tracking personal data come from the app providers themselves, and thereby evades responsibility for the implementation of these labels. Apple, by contrast, asserts that it checks the labels for correctness.

Putting this into perspective, the issue raises the question of whether these privacy labels actually make the use of apps safer in terms of data protection. One can argue that if app developers can simply assign themselves these labels on Google’s platform, the Apple approach seems more legitimate. It remains to be seen whether any action will be taken in this regard.

CNIL judges use of Google Analytics illegal

14. February 2022

On 10th February 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), pronounced the use of Google Analytics on European websites not to be in line with the requirements of the General Data Protection Regulation (GDPR) and ordered the website owner concerned to comply with the requirements of the GDPR within one month.

The CNIL reached this decision in response to several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. In total, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA, covering all 27 EU Member States and the three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently not sufficiently regulated. In the absence of an adequacy decision covering transfers to the USA, data may only be transferred if appropriate safeguards are provided for this data flow. However, while Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, the CNIL deemed these measures insufficient to exclude access to the personal data by US intelligence services. This would result in “a risk for French website users who use this service and whose data is exported”.

The CNIL stated therefore that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR. The CNIL therefore ordered the website manager to bring this processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.”

The CNIL has also given advice regarding website audience measurement and analysis services, recommending that such tools be used only to produce anonymous statistical data. This would allow for an exemption from the consent requirements, as the aggregated data would not be considered “personal” data and would therefore not fall under the scope of the GDPR, provided the data controller ensures that no illegal transfers take place.

European Commission adopts South Korea Adequacy Decision

30. December 2021

On December 17th, 2021, the European Commission (Commission) announced in a statement it had adopted an adequacy decision for the transfer of personal data from the European Union (EU) to the Republic of Korea (South Korea) under the General Data Protection Regulation (GDPR).

An adequacy decision is one of the instruments available under the GDPR to transfer personal data from the EU to third countries that ensure a comparable level of protection for personal data as the EU. It is a Commission decision under which personal data can flow freely and securely from the EU to the third country in question without any further conditions or authorizations being required. In other words, the transfer of data to the third country in question can be handled in the same way as the transfer of data within the EU.

This adequacy decision allows for the free flow of personal data between the EU and South Korea without the need for any further authorization or transfer instrument, and it also applies to the transfer of personal data between public sector bodies. It complements the Free Trade Agreement (FTA) between the EU and South Korea, which entered into force in July 2011. The trade agreement has led to a significant increase in bilateral trade in goods and services and, inevitably, in the exchange of personal data.

Unlike the adequacy decision regarding the United Kingdom, this adequacy decision is not time-limited.

The Commission’s statement reads:

The adequacy decision will complement the EU – Republic of Korea Free Trade Agreement with respect to personal data flows. As such, it shows that, in the digital era, promoting high privacy and personal data protection standards and facilitating international trade can go hand in hand.

In South Korea, the processing of personal data is governed by the Personal Information Protection Act (PIPA), which provides similar principles, safeguards, individual rights and obligations as the ones under EU law.

An important step in the adequacy talks was the reform of PIPA, which took effect in August 2020 and strengthened the investigative and enforcement powers of the Personal Information Protection Commission (PIPC), the independent data protection authority of South Korea. As part of the adequacy talks, both sides also agreed on several additional safeguards that will improve the protection of personal data processed in South Korea, for example with regard to transparency and onward transfers.

These safeguards provide stronger protections. For example, South Korean data importers will be required to inform Europeans about the processing of their data, and onward transfers to third countries must ensure that the data continue to enjoy the same level of protection. These rules are binding and can be enforced by the PIPC and South Korean courts.

The Commission has also published a Q&A on the adequacy decision.
