Category: Personal Data

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date of approximately €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recorded audio of customer calls using artificial intelligence-based speech signal processing software that evaluated each call against a list of keywords and the emotional state of the caller. Based on this analysis, the software ranked the calls as a recommendation of which callers should be called back as a priority.
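The ranking mechanism described above can be illustrated with a minimal sketch; the keyword list, weights and emotion scores below are hypothetical illustrations, not details from the NAIH decision:

```python
# Minimal sketch of keyword-and-emotion-based call ranking, loosely following
# the mechanism described in the NAIH decision. The keyword list, the weights
# and the emotion scoring are hypothetical.
from dataclasses import dataclass

KEYWORDS = {"cancel": 3.0, "complaint": 2.5, "unhappy": 2.0}  # hypothetical

@dataclass
class AnalyzedCall:
    caller_id: str
    transcript: str
    emotion_score: float  # e.g. 0.0 (calm) .. 1.0 (upset), from speech analysis

def priority(call: AnalyzedCall) -> float:
    """Combine keyword hits and emotional state into a callback score."""
    keyword_score = sum(
        weight for kw, weight in KEYWORDS.items() if kw in call.transcript.lower()
    )
    return keyword_score + 5.0 * call.emotion_score

def rank_calls(calls: list[AnalyzedCall]) -> list[AnalyzedCall]:
    """Return calls ordered by descending callback priority."""
    return sorted(calls, key=priority, reverse=True)
```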

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank, this procedure aimed at quality control, in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities only in general terms, and that no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and perform assessments, the assessment did not provide substantive solutions to address these risks. The Authority also emphasized that legitimate interest cannot serve as a “last resort” legal basis when all other legal bases are inapplicable; data controllers therefore cannot rely on it at any time and for any reason. Consequently, the Authority not only imposed a record fine, but also required the bank to stop analyzing emotions in the context of speech analysis.

 

Dutch DPA issues highest fine for GDPR violations

14. April 2022

On April 7th, 2022, the Dutch Data Protection Authority, Autoriteit Persoonsgegevens, imposed its highest-ever fine for data protection violations, amounting to €3.7 million. It is directed against the Minister of Finance, who was the data controller for the Tax and Customs Administration’s processing operations. The reason was years of unlawful processing of personal data in the Fraud Notification Facility application, a blacklist in which reports and suspected fraud cases were registered.

The investigation revealed several violations of principles and other requirements of the GDPR. Firstly, there was no legal basis for the processing of the personal data included in the list, making it unlawful under Art. 5 (1) (a), Art. 6 (1) GDPR. Secondly, the pre-formulated purposes of collecting the personal data were not clearly defined and thus did not comply with the principle of purpose limitation stipulated in Art. 5 (1) (b) GDPR. Moreover, the personal data were often incorrect and outdated, which violated the principle of accuracy under Art. 5 (1) (d) GDPR. Since the personal data were also kept longer than the applicable retention period allowed, they were not processed in accordance with the principle of storage limitation laid down in Art. 5 (1) (e) GDPR. Furthermore, the security of processing required by Art. 32 (1) GDPR was not ensured by appropriate technical and organizational measures. In addition, the internal Data Protection Officer was not involved properly and in a timely manner in the conduct of the Data Protection Impact Assessment pursuant to Art. 38 (1), Art. 35 (2) GDPR.

The amount of the fine results from the severity, consequences and duration of the violations. Over more than six years, the Fraud Notification Facility violated the rights of 270,000 people. They were often falsely registered as (possible) fraudsters and suffered serious consequences as a result: many were left unable to obtain a payment agreement or to qualify for debt rescheduling, and therefore in financial insecurity. The Tax and Customs Administration also used discriminatory practices: employees were instructed to assess the risk of fraud based on, among other factors, people’s nationality and appearance.

The DPA also considered previous serious infringements in determining the amount of the fine. The Minister of Finance was penalized in 2018 for inadequate security of personal data, in 2020 for illegal use of the citizen service number in the VAT identification number of self-employed persons, and in 2021 for the discriminatory and illegal action in the childcare benefits scandal. Following the latter affair, the Fraud Notification Facility was shut down in February 2020.

The Minister of Finance can appeal the decision within six weeks.

ECJ against data retention without any reason or limit

6. April 2022

In its judgment of April 5th, 2022, announced by press release, the ECJ once again ruled that the retention of private communications data without any reason or limit is unlawful. This reinforces its rulings of 2014, 2016 and 2020, according to which changes are necessary at EU and national level.

In this judgment, the ECJ states that it is for the national court in Ireland to decide whether data retained in this way may be admitted as evidence in a long-standing murder case.

Questions on this issue were submitted in 2020 by Germany, France and Ireland. The EU Advocate General confirmed, in a legally non-binding opinion, that the national laws were incompatible with EU fundamental rights.

However, a first exception to data retention resulted from the 2020 judgment, according to which, in the event of a serious threat to national security, storage for a limited period and subject to judicial review was recognized as permissible.

Subsequently, a judgment in 2021 stated that national law must provide clear and precise rules with minimum conditions for the purpose of preventing abuse.

According to the ECJ, storage without specific cause, subject to restrictions, should be allowed in the following cases:

  • When limited to specific individuals or locations;
  • When there is no concrete evidence of a crime but a high local crime rate suffices;
  • At frequently visited locations such as airports and train stations;
  • When national laws require the identity of prepaid cardholders to be stored;
  • As a “quick freeze”: an immediate backup and temporary storage of data where there is a suspicion of crime.

All of these are to be used only to combat serious crime or prevent threats to national security.

In Germany, Justice Minister Marco Buschmann favors a quick freeze solution as an alternative that preserves fundamental rights. However, the EU member states intend to work on a legally compliant option for data retention despite the ECJ’s criticism of this principle.

Italian DPA imposes a €20 million fine on Clearview AI

29. March 2022

The Italian data protection authority “Garante” has fined Clearview AI €20 million for data protection violations regarding its facial recognition technology. Clearview AI’s facial recognition system draws on over 10 billion images from the internet, and the company prides itself on having the largest biometric image database in the world. The data protection authority found Clearview AI to be in breach of numerous GDPR requirements: among other things, processing was neither fair nor lawful, there was no lawful basis for the collection of information, and there were no appropriate transparency and data retention policies.

Last November, the UK ICO warned of a potential £17 million fine against Clearview and, in this context, also ordered it to stop processing data.

Then, in December, the French CNIL ordered Clearview to stop processing citizens’ data and gave it two months to delete all the data it had stored, but did not mention any explicit financial sanction.

In Italy, in addition to paying the €20 million fine, Clearview AI must now delete all images of Italian citizens from its database, as well as the biometric information needed to search for a specific face. Furthermore, the company must designate an EU representative as a point of contact for EU data subjects and the supervisory authority.

Belgian DPA declares technical standard used by cookie banners for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of legality and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with those of other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF, must now delete the personal data collected in this way and pay a fine of €250,000. In addition, the decision sets out conditions under which the advertising industry may continue to use the TCF at all.

Almost all companies, including advertising companies such as Google and Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. The decision will therefore have a major impact on the protection of users’ personal data, as Hielke Hijmans of the APD confirms.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on, among other things, the prices offered and the user’s data profile, a decision is made in milliseconds as to which advertisements the user will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.
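As a rough illustration of this mechanism, here is a heavily simplified auction sketch; the bidder names and profile fields are hypothetical, and real OpenRTB exchanges involve far more machinery:

```python
# Heavily simplified real-time bidding (RTB) auction: each bidder returns
# a bid based on the user's data profile, and the highest bid wins the
# ad slot. All names and profile fields are hypothetical.
from typing import Callable

UserProfile = dict  # e.g. {"interests": ["travel"], "age_group": "25-34"}
Bidder = Callable[[UserProfile], float]  # returns a bid in euros

def run_auction(profile: UserProfile, bidders: dict[str, Bidder]) -> str:
    """Return the name of the advertiser whose bid for this user is highest."""
    bids = {name: bid(profile) for name, bid in bidders.items()}
    return max(bids, key=bids.get)

# Example: an advertiser bids more when the profile matches its target group.
bidders = {
    "travel_ads": lambda p: 1.2 if "travel" in p.get("interests", []) else 0.1,
    "finance_ads": lambda p: 0.9,
}
winner = run_auction({"interests": ["travel"]}, bidders)  # -> "travel_ads"
```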

If users accept cookies or do not object to the use of their data on the basis of the provider’s legitimate interest, the TCF generates a so-called TC string, which contains information about these consent decisions. This identifier is forwarded to partners in the OpenRTB system and forms the basis both for the creation of individual profiles and for the auctions in which advertising space, and with it the attention of the desired target group, is sold.
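Conceptually, a TC string is a compact encoding of consent decisions. The sketch below illustrates only the idea of packing per-purpose consent flags into a short token; the actual TC string format defined by IAB Europe is a versioned, base64url-encoded bitfield with many more fields:

```python
# Minimal sketch of encoding consent decisions into a compact string.
# This illustrates the concept only; the real TC string specified by
# IAB Europe is a versioned bitfield with many additional fields.
import base64

def encode_consents(consents: list[bool]) -> str:
    """Pack a list of per-purpose consent flags into a short token."""
    bits = "".join("1" if c else "0" for c in consents)
    padded = bits.ljust((len(bits) + 7) // 8 * 8, "0")
    raw = bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

def decode_consents(token: str, n: int) -> list[bool]:
    """Recover the first n consent flags from the encoded token."""
    raw = base64.urlsafe_b64decode(token + "=" * (-len(token) % 4))
    bits = "".join(f"{byte:08b}" for byte in raw)
    return [bit == "1" for bit in bits[:n]]

token = encode_consents([True, False, True])   # consent for purposes 1 and 3
assert decode_consents(token, 3) == [True, False, True]
```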

According to the authority, TC strings already constitute personal data because, combined with the IP address and the cookies set by the TCF, they enable users to be identified. In addition, IAB Europe is held jointly responsible under data protection law for data processing via the framework, even though it has positioned itself not as a data processor but merely as the provider of a standard.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the omnipresent cookie banners rather than asking for consent; according to the decision, this practice would have to be prohibited for the processing to be lawful. The principles of privacy by design and by default are also violated, since consent is obtained through design tricks, the data flows are not manageable, and revocation of consent is hardly possible.

Google to launch Google Analytics 4 with aim to address EU Data Protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement follows 101 complaints filed by the non-governmental organization None of Your Business (NOYB) with the data protection authorities (DPAs) of 30 EEA countries. Assessing transfers of personal data from the EU to the US through the use of Google Analytics in light of the CJEU’s Schrems II decision, the French and Austrian DPAs ruled that such transfers are unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

However, the most important change that “Google Analytics 4” brings to the processing of personal data is that it will no longer store users’ IP addresses. This limits the data processing and resulting transfers for which Google Analytics came under scrutiny in the EU; however, it is unclear at this point whether the EU DPAs will change their opinion on the use of Google Analytics with this new version.

According to the press release, the current Google Analytics will be discontinued starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

Land register number allows access to personal data, Polish authorities confirm

23. March 2022

In a legal dispute that has been ongoing since 2020, the Polish Commissioner for Human Rights recently stated that the disclosure of land register numbers can lead to obtaining a large amount of personal data contained in the registers. In his opinion, general access to such detailed data harms and significantly restricts the informational autonomy of individuals.

The Commissioner’s view confirms the position of the Polish Data Protection Authority, which, in an administrative decision dated August 24th, 2020, ordered the Polish General Surveyor to cease making land register numbers available on the website “GEOPORTAL2”. The Authority also imposed a fine of PLN 100,000 for violating the principle of lawfulness under Art. 5 (1) (a), Art. 6 (1) GDPR, as there was no legal basis for the processing.

The decision was justified by the fact that land register numbers allow indirect identification of property owners and are therefore considered personal data. Moreover, their publication enables access to further data such as the national ID number or the property address. This may lead to a variety of dangers associated with the misuse of such data, in particular identity theft or impersonation for criminal purposes.

This opinion was also held by the Polish Voivodeship Administrative Court in Warsaw, which on May 5th, 2021, dismissed the Surveyor’s complaint against the decision of the Polish Data Protection Authority.

Dutch data protection authority imposes fine of €525,000

The Dutch Data Protection Authority, Autoriteit Persoonsgegevens (hereinafter “AP”), imposed a fine of €525,000 on DPG Media at the beginning of March.

The background to the fine was access and deletion requests from various data subjects who had a newspaper subscription or received increased advertising. If a data subject wanted to know what personal data the company had collected about them, they had to send an ID document to DPG Media to prove their identity. The same applied to anyone who asked the company to delete their data. The customer was supposed to either upload a scan of the ID document or send it to the company by post.

DPG Media’s procedure for proof of identity was criticized for several reasons. From the AP’s point of view, too much data was requested, and it was made too difficult for data subjects to assert their rights to access and deletion. Even if DPG Media had requested redacted ID documents, this method of proving identity would still have been questionable: the AP emphasizes that requesting even redacted ID documents is often disproportionate.

It also notes that ID documents are particularly worthy of protection. Especially in view of possible identity theft, they must be handled very carefully.

Thus, the AP clarifies that, even if an identification document is in principle suitable for identifying the data subject, less intrusive identifiers should be used in preference. Milder but, in this specific case, equally suitable identifiers are, for example, requesting the postal address in the case of a telephone inquiry or – as recital 57 states – the use of an “authentication mechanism such as the same credentials, used by the data subject to log-in to the online service offered by the data controller.“
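As a minimal sketch of such a credential-based mechanism (the session store and function names below are hypothetical, not DPG Media’s or the AP’s implementation), an access request could be matched against the account the requester is already logged into:

```python
# Minimal sketch: verify a data subject access request via the existing
# login session instead of requesting an ID document (cf. recital 57 GDPR).
# The session store and all names are hypothetical.

sessions = {"token-abc": "customer-42"}  # session token -> authenticated user id

def handle_access_request(session_token: str, subject_id: str) -> str:
    """Grant the request only if the requester is logged in as the data subject."""
    authenticated_user = sessions.get(session_token)
    if authenticated_user is None:
        return "rejected: not authenticated"
    if authenticated_user != subject_id:
        return "rejected: requester is not the data subject"
    return f"access granted: exporting personal data of {subject_id}"

print(handle_access_request("token-abc", "customer-42"))  # access granted
```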

Artificial intelligence in business operations poses problems in terms of GDPR compliance

25. February 2022

With the introduction of the General Data Protection Regulation, the intention was to protect personal data and to minimize the processing of such data to what is absolutely necessary. Processing should only be possible for specific, well-defined purposes.

In the age of technology, it is particularly practical to access artificial intelligence, especially in everyday business, and use it to optimize business processes. More and more companies are looking for solutions based on artificial intelligence. This generally involves processing significant amounts of personal data.

For artificial intelligence to be implementable at all, the system must first be fed large amounts of data from which it can learn and, on that basis, make its own decisions.

When using so-called “machine learning”, a subset of artificial intelligence, care must be taken as to whether and which data are processed, so that the processing complies with the General Data Protection Regulation.

If a company receives data for further processing and analysis, or if it shares data for this purpose, there must be mutual clarity regarding this processing.

The use of artificial intelligence faces significant challenges in terms of compliance with the General Data Protection Regulation, primarily with respect to the principles of transparency, purpose limitation and data minimization.

In addition, the data protection impact assessment required by the General Data Protection Regulation also poses problems with regard to artificial intelligence, as artificial intelligence is a self-learning system that can make its own decisions. Thus, some of these decisions may not be understandable or predictable.

In summary, there is a strong tension between artificial intelligence and data privacy.

Many companies try to get around this problem with a so-called “crowd sourcing” solution. This involves generating anonymized data to which additional fuzziness (noise) is applied, so that the data can no longer be traced back to a person.
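One common way to introduce such fuzziness is to perturb numeric values with random noise, for example Laplace noise as used in differential privacy. The following is a minimal sketch; the sensitivity and epsilon values are illustrative, not a vetted anonymization scheme:

```python
# Minimal sketch: perturb a numeric attribute with Laplace noise so that
# the published value no longer reveals the exact individual value.
# Sensitivity and epsilon below are illustrative, not a vetted privacy budget.
import numpy as np

def add_laplace_noise(value: float, sensitivity: float, epsilon: float) -> float:
    """Return the value perturbed with Laplace(0, sensitivity / epsilon) noise."""
    return value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

ages = [34, 45, 29]
noisy_ages = [add_laplace_noise(a, sensitivity=1.0, epsilon=0.5) for a in ages]
```

Smaller epsilon values add more noise and thus stronger protection, at the cost of less accurate data.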

Apps are tracking personal data despite contrary information

15. February 2022

Tracking in apps enables app providers to offer users personalized advertising. On the one hand, this generates higher financial revenues for app providers. On the other hand, it leads to data processing practices that do not comply with the GDPR.

For a year now, data privacy labels have been mandatory; they are designed to show which personal data app providers access (article in German) and pass on to third parties. Although these labels on iPhones state that no data access takes place, 80% of the analyzed applications carrying such labels do access data by tracking personal information. This is the conclusion of an analysis by an IT specialist at the University of Oxford.

For example, the “RT News” app, which supposedly does not collect data, actually provides various sets of data to tracking services such as Facebook, Google, ComScore and Taboola. Such data transfers, which may reveal sensitive information about the content viewed, would in fact have to be disclosed in the apps’ privacy labels.

In particular, GPS location information accessed by apps is sold by data companies. This constitutes an abuse of data protection, because personal data is handled in a manner that does not comply with data protection law and is provided illegally to third parties.

In an analysis published in the journal Internet Policy Review, tests of two million Android apps showed that nearly 90 percent of the apps in Google’s Play Store share data with third parties directly after the app is launched. Google, however, points out that labels falsely claiming that no personal data is tracked come from the app providers themselves, thereby evading responsibility for the implementation of these labels. Apple, by contrast, asserts that it checks the labels for correctness.

Putting it into perspective, this issue raises the question of whether privacy labels make the use of apps safer in terms of data protection. One can argue that if app developers on Google’s platform can simply assign themselves these labels, Apple’s approach seems more legitimate. It remains to be seen whether any action will be taken in this regard.
