Google to launch Google Analytics 4 with the aim of addressing EU data protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement of this launch follows 101 complaints filed by the non-governmental organization None of Your Business (NOYB) with the data protection authorities (DPAs) of 30 EEA countries. Assessing the transfers of personal data from the EU to the US resulting from the use of Google Analytics in light of the CJEU’s Schrems II decision, the French and Austrian DPAs ruled that transferring EU personal data to the U.S. through the use of Google Analytics cookies is unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

The most important change that “Google Analytics 4” brings to the processing of personal data, however, is that it will no longer store users’ IP addresses. This limits the data processing and resulting transfers for which Google Analytics came under scrutiny in the EU; it is unclear at this point, however, whether the EU DPAs will change their assessment of Google Analytics with this new version.

According to the press release, the current Google Analytics will be discontinued starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

Land register number allows access to personal data, Polish authorities confirm

23. March 2022

In a legal dispute that has been ongoing since 2020, the Polish Commissioner for Human Rights recently stated that the disclosure of land register numbers can lead to obtaining a large amount of personal data contained in the registers. In his opinion, general access to such detailed data harms and significantly restricts the informational autonomy of individuals.

The Commissioner’s view confirms the position of the Polish Data Protection Authority, which, in an administrative decision dated August 24th, 2020, ordered the Polish General Surveyor to cease making land register numbers available on the website “GEOPORTAL2”. It also imposed a fine of PLN 100,000 for violating the principle of lawfulness under Articles 5 para. 1 lit. a and 6 para. 1 GDPR, as there was no legal basis for the processing.

The decision was justified by the fact that land register numbers allow indirect identification of property owners and are therefore considered personal data. Moreover, publishing these numbers enables access to further data such as national ID numbers or property addresses. This may lead to a variety of dangers associated with the misuse of such data, in particular identity theft or impersonation for criminal purposes.

This opinion was also held by the Polish Voivodeship Administrative Court in Warsaw, which on May 5th, 2021, dismissed the Surveyor’s complaint against the decision of the Polish Data Protection Authority.

Dutch data protection authority imposes fine of €525,000

The Dutch Data Protection Authority, the Autoriteit Persoonsgegevens (hereinafter “AP”), imposed a fine of €525,000 on DPG Media at the beginning of March.

The background to the fine were access and deletion requests from various data subjects who had a newspaper subscription or received increasing amounts of advertising. If data subjects wanted to know what personal data the company had collected about them, they had to send an ID document to DPG Media to prove their identity. The same applied to anyone who asked the company to delete their data. Customers were supposed to either upload a scan of their ID document or send a copy to the company by post.

DPG Media’s procedure for proving identity was criticized for several reasons. From the AP’s point of view, too much data was requested, and it was made too difficult for data subjects to assert their rights to access and deletion. Even if DPG Media had, for example, requested redacted ID documents, this method of proving identity would still have been questionable: the AP emphasizes that requesting redacted ID documents is often disproportionate.

The AP also notes that ID documents are particularly sensitive and worthy of protection. Especially with regard to the risk of identity theft, they must be handled very carefully.

Thus, the AP clarifies that, even if an identification document is in principle suitable for identifying the data subject, less intrusive identifiers should be preferred. Milder but equally suitable identifiers in this specific case would be, for example, requesting the postal address in the case of a telephone inquiry or – as Recital 57 states – using an “authentication mechanism such as the same credentials, used by the data subject to log-in to the online service offered by the data controller.”

Irish DPC fines Meta 17 Million Euros over 2018 data breaches

16. March 2022

On March 15th, 2022, the Irish Data Protection Commission (DPC) imposed a fine of 17 million euros on Meta Platforms over a series of twelve data breaches that occurred between June and December 2018.

The DPC’s inquiry which led to this decision examined the extent to which Meta Platforms complied with the requirements of Art. 5(1)(f), Art. 5(2), Art. 24(1) and Art. 32(1) GDPR in relation to the processing of personal data relevant to the twelve breach notifications.

As a result of this inquiry, the DPC found that Meta Platforms infringed Art. 5(2) and Art. 24(1) GDPR. In particular, the DPC assessed that Meta Platforms failed to have in place appropriate technical and organisational measures which would enable it to readily demonstrate the security measures that it implemented in practice to protect the data of its European users in the case of those twelve data breaches.

The processing under examination constituted “cross-border” processing, and as such the DPC’s decision was subject to the co-decision-making process outlined in Art. 60 GDPR. As a result, all of the other European supervisory authorities were engaged in this decision as co-decision-makers. While two of the European supervisory authorities raised objections to the DPC’s draft decision, consensus was achieved through further engagement between the DPC and the supervisory authorities concerned.

“Accordingly, the DPC’s decision represents the collective views of both the DPC and its counterpart supervisory authorities throughout the EU,” the DPC stated in their press release.

A Meta spokesperson has commented on the decision, stating, “This fine is about record keeping practices from 2018 that we have since updated, not a failure to protect people’s information. We take our obligations under the GDPR seriously and will carefully consider this decision as our processes continue to evolve.”

ICO releases Guidance on Video Surveillance

7. March 2022

At the end of February 2022, the UK Information Commissioner’s Office (ICO) published guidance for organizations that capture CCTV footage, providing advice for operating video surveillance systems that view or record individuals.

The recommendations aim to focus on best practices for data activities related to “emerging capabilities that can assist human decision making, such as the use of Facial Recognition Technology and machine learning algorithms.” As per the Guidance, surveillance systems specifically include traditional CCTV, Automatic Number Plate Recognition, Body Worn Video, Drones, Facial Recognition Technology, dashcams and smart doorbell cameras.

In its Guidance, the ICO offers checklists that controllers can use to monitor their use of video surveillance and keep track of their compliance with the applicable law. It further touches on the principles of data protection and how they specifically apply to video surveillance. In addition, it helps companies with the documentation of a Data Protection Impact Assessment.

The Guidance gives in-depth advice on video surveillance at the workplace, as well as on whether video feeds should also record audio.

Overall, the Guidance aims to make controllers aware of the various issues they face when using video surveillance, and gives them in-depth help on how to comply with data protection regulations in the UK.

Artificial intelligence in business operations poses problems in terms of GDPR compliance

25. February 2022

With the introduction of the General Data Protection Regulation, the intention was to protect personal data and to minimize the processing of such data to what is strictly necessary. Processing should be possible only for a specific, well-defined purpose.

In the age of technology, it is particularly convenient to draw on artificial intelligence, especially in everyday business, and to use it to optimize business processes. More and more companies are looking for solutions based on artificial intelligence. This generally involves processing significant amounts of personal data.

For artificial intelligence to be implementable at all, the system must first be supplied with large amounts of data so that it can learn from them and thus make its own decisions.

When using so-called “machine learning”, a subset of artificial intelligence, care must be taken as to whether and which data are processed, so that the processing complies with the General Data Protection Regulation.

If a company receives data for further processing and analysis, or if it shares data for this purpose, there must be mutual clarity regarding this processing.

The use of artificial intelligence faces significant challenges in terms of compliance with the General Data Protection Regulation. These are primarily compliance with the principles of transparency, purpose limitation and data minimization.

In addition, the data protection impact assessment required by the General Data Protection Regulation also poses problems with regard to artificial intelligence, as artificial intelligence is a self-learning system that can make its own decisions. Thus, some of these decisions may not be understandable or predictable.

In summary, there is a strong tension between artificial intelligence and data privacy.

Many companies are trying to get around this problem with a so-called “crowd sourcing” solution. This involves working with anonymized data to which an additional fuzziness (noise) is applied, so that it can no longer be traced back to a person.
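As a rough illustration of what such a “fuzziness” step could look like in code – assuming a simple in-memory dataset with hypothetical field names, and using Laplace noise purely as an example mechanism – the following sketch drops direct identifiers and blurs a numeric value:

```python
import random

def add_fuzziness(records, keys_to_drop, numeric_key, scale=5.0):
    """Drop direct identifiers and blur a numeric field with Laplace noise.

    Illustrative only: real anonymization requires a careful analysis of
    re-identification risk across the whole dataset.
    """
    anonymized = []
    for record in records:
        # Remove direct identifiers such as names or customer IDs
        blurred = {k: v for k, v in record.items() if k not in keys_to_drop}
        # Laplace noise generated as the difference of two exponential draws
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        blurred[numeric_key] = record[numeric_key] + noise
        anonymized.append(blurred)
    return anonymized

# Hypothetical usage: strip names, keep only a blurred age for analysis
customers = [{"name": "Alice", "age": 34}, {"name": "Bob", "age": 41}]
print(add_fuzziness(customers, keys_to_drop={"name"}, numeric_key="age"))
```

Whether such a step actually results in anonymous data within the meaning of the GDPR depends on the remaining re-identification risk, not on the noise alone.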

French CNIL highlights its data protection enforcement priorities for 2022

Following complaints received, but also on its own initiative and on the basis of reports of data protection violations, the French data protection supervisory authority, the Commission Nationale de l’Informatique et des Libertés (hereinafter “CNIL”), carries out checks. The CNIL has published three topics on which it will particularly focus in 2022: commercial prospecting, surveillance tools in the context of teleworking, and cloud services.

With regard to commercial prospecting, the CNIL draws particular attention to unsolicited advertising calls, which are a recurring subject of complaints to the CNIL in France.

In February 2022, CNIL published a guideline for “commercial management”, which is particularly relevant for commercial canvassing.

Based on this guideline, the CNIL will carry out GDPR compliance checks. The focus will be on professionals who resell data.

Regarding monitoring tools for teleworking, identified as the CNIL’s second priority, the CNIL aims to help balance the privacy interests of workers who are able to work from home due to COVID-19 against employers’ legitimate monitoring of activities, by communicating the rules to be followed for this purpose. The CNIL believes that employers need to be supervised more strictly in this regard.

Last but not least, the CNIL draws particular attention to potential data protection breaches in the use of cloud computing technologies. Since cloud computing can in particular involve massive data transfers outside the European Union, activities in this area must be monitored more closely. To this end, the CNIL reserves the right to focus in particular on the frameworks governing the contractual relationships between data controllers and cloud technology providers.

Norwegian DPA aims to strengthen cookie regulations

22. February 2022

The Norwegian Data Protection Authority (DPA), Datatilsynet, has reached out to the Ministry of Local Government and District Affairs in a letter emphasizing the need to tighten cookie regulations in Norway.

This letter comes amid calls from consultation bodies to delay the proposed tightened cookie regulations, which have been under open consultation in Norway since the end of last year.

In the letter, the Datatilsynet points out the importance of strengthened cookie laws, specifically regarding the manner of obtaining consent and the design of the consent banners, which “are designed in ways that influence users to consent by making it more cumbersome and time consuming to not consent”.

The letter also references the French data protection authority’s decisions of 31 December 2021 fining Google €150 million and Facebook €60 million for inadequately facilitating the refusal of cookies. It points out that, in contrast to France, the practices for which Google and Facebook were fined would hardly have been considered problematic under the current Norwegian cookie regulations, which allow illusory consent through pre-set browser settings.

Senior Legal Advisor Anders Obrestad stated that “these cases illustrate how unsustainable the current regulation of cookies and similar tracking technologies in Norway is for the privacy of internet users”.

The Norwegian DPA hopes to prevent any delay in the strengthening of cookie regulations, and to emphasize the importance of valid consent from internet users.

Apps are tracking personal data despite contrary information

15. February 2022

Tracking in apps enables app providers to offer users personalized advertising. On the one hand, this generates higher revenues for app providers. On the other hand, it leads to data processing practices that are not compliant with the GDPR.

For a year now, data privacy labels have been mandatory; they are designed to show which personal data app providers access (article in German) and pass on to third parties. Although these labels on iPhones state that no data access takes place, 80% of the analyzed applications carrying such labels do access data by tracking personal information. This is the conclusion of an analysis by an IT specialist at the University of Oxford.

For example, the “RT News” app, which supposedly does not collect data, actually provides different sets of data to tracking services like Facebook, Google, ComScore and Taboola. However, such data transfers, which may include sensitive information about viewed content, have to be disclosed in an app’s privacy label.

In particular, data from apps that access GPS location information is sold by data companies. This constitutes an abuse, because personal data is handled in a way that is not compliant with data protection law and is provided illegally to third parties.

In an analysis published in the journal Internet Policy Review, tests of two million Android apps showed that nearly 90 percent of the apps in Google’s Play Store share data with third parties directly after the app is launched. Google, however, points out that labels containing false information about not tracking personal data come from the app providers themselves, and thereby evades responsibility for the implementation of these labels. Apple, by contrast, asserts that it checks the labels for correctness.

Putting this into perspective, the issue raises the question of whether these privacy labels actually make the use of apps safer in terms of data protection. One can argue that, if app developers on Google’s platform can simply assign themselves these labels, Apple’s approach seems more legitimate. It remains to be seen whether any action will be taken in this regard.

CNIL judges use of Google Analytics illegal

14. February 2022

On 10 February 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), pronounced the use of Google Analytics on European websites not to be in line with the requirements of the General Data Protection Regulation (GDPR) and ordered the website owner concerned to comply with the requirements of the GDPR within one month.

The CNIL issued this decision in response to several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. In total, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA, covering all 27 EU Member States and three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently not sufficiently regulated. In the absence of an adequacy decision concerning transfers to the USA, the transfer of data can only take place if appropriate guarantees are provided for this data flow. However, while Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, the CNIL deemed those measures insufficient to exclude access to the personal data by US intelligence services. This would result in “a risk for French website users who use this service and whose data is exported”.

The CNIL stated therefore that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR. The CNIL therefore ordered the website manager to bring this processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.”

The CNIL has also given advice regarding website audience measurement and analysis services. For these purposes, the CNIL recommends that such tools be used only to produce anonymous statistical data. This would allow for an exemption, as aggregated data would not be considered “personal” data and would therefore not fall within the scope of the GDPR and its consent requirements, provided the data controller ensures that there are no illegal transfers.
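As a rough illustration of what producing only anonymous statistical data can mean in practice, the following sketch aggregates hypothetical raw page-view events into per-page counts and discards all user-level fields before anything is stored or exported; the event fields are invented for illustration and the example is not based on any specific analytics tool:

```python
from collections import Counter

def aggregate_page_views(events):
    """Reduce raw page-view events to anonymous per-page counts.

    Only the page path survives aggregation; the IP address, user ID and
    any other user-level fields are discarded, so the output contains no
    directly identifying data. Illustrative only -- not legal advice.
    """
    return Counter(event["page"] for event in events)

# Hypothetical raw events as an analytics tool might collect them
raw_events = [
    {"page": "/home", "ip": "203.0.113.7", "user_id": "abc123"},
    {"page": "/home", "ip": "198.51.100.2", "user_id": "def456"},
    {"page": "/pricing", "ip": "203.0.113.7", "user_id": "abc123"},
]
print(aggregate_page_views(raw_events))  # Counter({'/home': 2, '/pricing': 1})
```

Whether such aggregated statistics are truly anonymous still depends on factors such as the size of the audience and the granularity of the reported dimensions.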
