Garante statement: use of Google Analytics violates GDPR

29. June 2022

On June 23, 2022, the Italian Data Protection Authority (Garante) released a statement on the use of Google Analytics (GA), holding that the use of GA by Italian websites without additional safeguards violates the GDPR.

Garante is thus the third data protection authority within the EU to declare the transfer of personal data through GA unlawful. Earlier this year, the French CNIL and the Austrian data protection authority each delivered a decision coming to the same conclusion, namely that the use of GA violates the GDPR.

The statement was prompted by a number of complaints Garante had received. It is, however, also the product of coordination with other European privacy authorities.

In its reasoning, Garante assigns a special role to the cookies that help GA collect personal data, such as the IP address, visited pages, browser type, and operating system. Garante considers it proven that personal data is transferred to the US when GA is used. It reiterates that IP addresses qualify as personal data and that the pseudonymisation undertaken by GA is not sufficient to protect personal data from being accessed by US governmental agencies.
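To illustrate Garante's point about insufficient pseudonymisation: GA's optional IP-anonymisation feature truncates the last octet of an IPv4 address before storage. The sketch below is a hypothetical illustration (not GA's actual code) of why such truncation is a weak safeguard: the remaining /24 network prefix, combined with browser and operating-system details, can still narrow a visitor down considerably.

```python
def truncate_ipv4(ip: str) -> str:
    """Zero out the last octet of an IPv4 address,
    mimicking GA-style IP 'anonymisation'."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

# The truncated address still identifies a network of at most 256 hosts.
print(truncate_ipv4("203.0.113.77"))  # -> 203.0.113.0
```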

Garante called on all controllers and processors involved in operating Italian websites to ensure compliance and granted a period of 90 days to meet their obligations under the GDPR. The statement further states: “The Italian SA calls upon all controllers to verify that the use of cookies and other tracking tools on their websites is compliant with data protection law; this applies in particular to Google Analytics and similar services.”

Canada’s new privacy bill: Bill C-27

On June 16, 2022, the Canadian federal government introduced a new privacy bill, named Bill C-27 (a reworking of Bill C-11). Among its main goals are strengthening the role of the Privacy Commissioner and establishing a special Data Protection Tribunal. Furthermore, it proposes new regulations regarding artificial intelligence. If passed, the act would replace Part 1 of the current PIPEDA (Personal Information Protection and Electronic Documents Act) with the new CPPA (Consumer Privacy Protection Act). Bill C-27 still needs to undergo review by various committees and is not expected to come into force until after the summer.

The Office of the Privacy Commissioner enforces Canada’s federal privacy laws and provides counsel to individuals regarding the protection of their personal data and their rights. Under the new bill, the Commissioner will be able to make recommendations about penalties to the Tribunal, alongside other authorities.

If the Bill comes into force, the Data Protection Tribunal’s powers will be expanded. Its decisions will be binding and final, and they may be enforced as if they were orders of a superior court. The Tribunal may also review the recommendations made by the Privacy Commissioner, but is not bound to follow them in any way.

Another important innovation brought by Bill C-27 is the clarification of the concept of legitimate interest: it has been added as an exception to consent in cases where it outweighs potential adverse effects on the data subject.

All data regarding children is now considered sensitive and must be treated as such by organizations and corporations. This means higher standards for handling such data and limits on the right to collect it.

The concepts of de-identification and anonymization have been adapted to global standards.

Finally, along with Bill C-27 the Government aims to introduce the new Artificial Intelligence and Data Act, creating a framework for high-impact AI systems. Its goals are to regulate international and interprovincial trade and commerce in AI systems by introducing common requirements across Canada, and to prohibit conduct in relation to AI systems that may result in harm to individuals or their interests. The Act also provides a new working definition of an “AI system”.

Lastly, the Act provides for the creation of a new Artificial Intelligence and Data Commissioner within a ministry, who will help enforce the Act across Canada.

EDPB adopts new guidelines on certification as a tool for transfers

23. June 2022

On June 16, 2022, the European Data Protection Board (EDPB) announced on its website that it had adopted guidelines on certification as a tool for transfers of personal data (publication is yet to take place following linguistic checks). Once published these guidelines will undergo public consultation until September 2022.

As a first point, these guidelines can be placed within the broader context of international data transfers, as envisioned by art. 46 (2) (f) GDPR. Further, the certification mechanism comes into play only when an adequacy decision is absent. As is probably well known, art. 46 (2) GDPR outlines several safeguards that may be resorted to when personal data is transferred to third countries.

One of these is the voluntary certification mechanism, as laid down by art. 42/43 GDPR, that allows accredited certification bodies or supervisory authorities to issue certifications, provided, of course, that controllers or processors have made binding and enforceable commitments. What the EU legislators hoped was to assist data subjects in quickly assessing “the level of data protection of relevant products and services” (Recital 100 GDPR) by way of certifications, seals, and marks.

In accordance with art. 42 (5) GDPR and guidelines 1/2018 on certification, which the new guidelines are intended to complement, accredited certification bodies or supervisory authorities are competent to issue such certification. It is important to note that these accredited certification bodies may very well be private bodies, subject to certain requirements and prior approval by the Board or the supervisory authorities. The criteria on the basis of which certifications are issued are to be determined and approved by the Board or by the competent supervisory authorities (art. 42 (5) GDPR).

According to EDPB Deputy Chair Ventsislav Karadjov, these yet-to-be-published guidelines are “ground-breaking”. He also provided an outlook on their content: among the most important aspects to be addressed are the accreditation requirements that certification bodies have to comply with, as well as the certification criteria attesting that appropriate safeguards for transfers are in place. It remains to be seen whether the guidelines will indeed provide more guidance on these aspects.

Steps towards data protection law in India

17. June 2022

At present, there is no comprehensive data protection law in India. The relevant provisions are governed by several laws, regulations and court decisions, including the Information Technology Act 2000 and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011.

Following the Indian Supreme Court’s recognition of privacy as a fundamental right under Article 21 of the Indian Constitution on August 24th, 2017, a Personal Data Protection Bill (PDPB) was formulated and introduced in the Lower House of the Parliament on December 11th, 2019. The PDPB was intended to constitute the first comprehensive data protection law in India.

The PDPB was pending before Parliament for a long time. On November 22nd, 2021, the Indian Joint Parliamentary Committee (JPC) responsible for reviewing the PDPB issued its report on the proposed law. At the time, the Indian Parliament was expected to table the JPC’s final report and consider the bill on December 21st, 2021, ahead of the end of its legislative session on December 23rd, 2021. Once passed by both houses of the Parliament and approved by the President, the PDPB was then to be enacted as legislation.

However, as has recently become known, new regulations may soon be introduced to replace the proposed PDPB, which was scrapped in favor of a total overhaul after data localization and data mirroring requirements raised concerns among business stakeholders. In addition, the Indian Government is expected to commence work on a new law to replace the Information Technology Act 2000, which would entail new guidelines for data governance and cybersecurity as part of a ‘Digital India Act’.

This would be a major, and long overdue, step towards a modern data protection law that takes into account both economic interests and individual rights, and that keeps pace with progressive legal developments worldwide.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont signed the Connecticut Data Privacy Act (“CTDPA”), concerning personal data privacy and online monitoring. The passage of the CTDPA continues the trend in the U.S. for states to individually address consumer rights and business obligations relating to consumer data, in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Privacy Rights Act (“CPRA”), Colorado Privacy Act (“CPA”), Virginia Consumer Data Protection Act (“VCDPA”) and Utah Consumer Privacy Act (“UCPA”). The Connecticut law applies to “personal data”, defined as “any information that is linked or reasonably linkable to an identified or identifiable individual,” excluding de-identified data and publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that

  • conduct business in Connecticut.
  • produce products or services that are targeted to Connecticut residents.
  • during the preceding calendar year, either controlled or processed the personal data of at least 100,000 consumers (excluding for the purpose of completing a payment transaction) or controlled or processed the personal data of at least 25,000 consumers and derived more than 25% of their gross revenue from the sale of personal data.
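Read as decision logic, the two alternative volume thresholds amount to a simple check. The function below is an illustrative sketch (the names and parameters are ours, not the statute’s), assuming the entity already conducts business in Connecticut or targets its residents:

```python
def ctdpa_thresholds_met(consumers_processed: int,
                         revenue_share_from_sales: float) -> bool:
    """Illustrative check of the CTDPA volume thresholds.

    consumers_processed: consumers whose personal data was controlled or
        processed in the preceding calendar year (excluding processing
        done solely to complete a payment transaction).
    revenue_share_from_sales: fraction of gross revenue derived from
        the sale of personal data.
    """
    return (consumers_processed >= 100_000
            or (consumers_processed >= 25_000
                and revenue_share_from_sales > 0.25))

print(ctdpa_thresholds_met(120_000, 0.0))   # -> True  (first prong)
print(ctdpa_thresholds_met(30_000, 0.30))   # -> True  (second prong)
print(ctdpa_thresholds_met(30_000, 0.10))   # -> False
```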

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of the personal data to only the purpose of the collection (“what is adequate, relevant and reasonably necessary”) or as the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process a consumer’s personal data for purposes of targeted advertising where the consumer has opted out
  • obtain consent before processing sensitive data, including data of any individual under the age of 13, and follow the provisions of the Children’s Online Privacy Protection Act

The CTDPA is set to become effective on July 1, 2023. Violation of the CTDPA may result in an enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, there is a grace period for enforcement actions until December 31, 2024, during which the AG is to provide organizations an opportunity to cure any alleged violations.

Like other US data privacy laws, the Connecticut law is not as comprehensive as the EU’s GDPR, but it aligns more closely with some of the GDPR’s definitions and, in particular, its consent mechanisms.

UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country could still be in compliance with the General Data Protection Regulation (GDPR).

Other than the statement itself, not much information was provided regarding the specific details. The accompanying briefing notes provided more information. They set out the main purposes of the Bill, namely to:

  • establish a new pro-growth and trusted data protection framework
  • reduce the burdens on business
  • create a world-class data rights regime
  • support innovation
  • drive industry participation in schemes which give citizens and small businesses more control of their data, particularly in relation to health and social care
  • modernize the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the stated goals are rather superficial. Another concern is that the new bill could deviate too far from the GDPR: the new regime might not retain the adequacy status with the EU that currently allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers (‘Q&A’) on the two sets of Standard Contractual Clauses (‘SCCs’) adopted by the European Commission on 4 June 2021, covering the relationship between controllers and processors (‘the Controller-Processor SCCs’) and third-country data transfers (‘the Data Transfer SCCs’) respectively. The Q&A are intended to provide practical guidance on the use of the SCCs. They are based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption.

Specifically, 44 questions are addressed, including those related to contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause.  In addition, the Q&A contains a specific section dedicated to each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. Further to this point, the Q&A highlights that the Commission is in the process of developing an additional set of SCCs for this scenario, which will consider the requirements that already apply directly to those controllers and processors under the GDPR. 

In addition, the Q&A includes a section with questions on the obligations of data importers and exporters, specifically addressing the SCC liability scheme. Specifically, the Q&A states that other provisions in the broader (commercial) contract (e.g., specific rules for allocation of liability, caps on liability between the parties) may not contradict or undermine liability schemes of the SCCs. 

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

Twitter fined $150m for handing users’ contact details to advertisers

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was ostensibly collected for purposes such as two-factor authentication. However, in addition to the above-mentioned purposes, Twitter used that data to allow advertisers to target specific groups of users by matching phone numbers and email addresses with advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 

CJEU considers representative actions admissible

29. April 2022

According to a press release of the Court of Justice of the European Union (CJEU), consumer protection associations can bring legal proceedings against companies over data protection infringements.

This is the conclusion reached by the Court in a decision on proceedings brought by the Federation of German Consumer Organisations (vzbv), which challenged Facebook’s data protection policies. Accordingly, a consumer protection association may bring legal proceedings against the person allegedly responsible for an infringement of the laws protecting personal data, in the absence of a mandate conferred on it for that purpose and independently of the infringement of specific rights of the data subjects. The vzbv is an institution entitled to bring legal proceedings under the GDPR because it pursues an objective in the public interest.

Specifically, the case concerns third-party games on Facebook, in which users must agree to the use of their data in order to be able to play. According to the association, Facebook did not inform the data subjects in a precise, transparent and understandable form about the use of the data, as is prescribed by the General Data Protection Regulation (GDPR). The Federal Court of Justice in Germany (BGH) had already come to this conclusion in May 2020; however, it was not considered sufficiently clarified whether the association could bring legal proceedings in this case.

The EU Advocate General had also previously concluded, in a legally non-binding opinion, that the association can bring legal proceedings.

The CJEU has now confirmed this view, so the BGH must issue a final decision in the case of vzbv v. Facebook. Importantly, this decision also opens the door for similar collective actions against other companies.

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date, approximately €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recordings of customer calls using artificial intelligence-based speech signal processing software, which evaluated each call against a list of keywords and assessed the caller’s emotional state. The software then ranked the calls, producing a recommendation as to which customers should be called back as a priority.

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank this procedure aimed at quality control, in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities in general terms only, and no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice only indicated quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the Bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and perform assessments, the data protection impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that the legal basis of legitimate interest cannot serve as a “last resort” when all other legal bases are inapplicable, and therefore data controllers cannot rely on this legal basis at any time and for any reason. Consequently, the Authority not only imposed a record fine, but also required the bank to stop analyzing emotions in the context of speech analysis.

 
