Category: Data Protection

Canada’s new privacy bill: Bill C-27

29. June 2022

On June 16th, 2022, the Canadian Federal Government introduced a new privacy bill, Bill C-27 (a re-working of the former Bill C-11). Its main goals include strengthening the role of the Privacy Commissioner and establishing a special Data Protection Tribunal. Furthermore, it proposes new regulations regarding artificial intelligence. If passed, the act would replace Part 1 of the current PIPEDA (Personal Information Protection and Electronic Documents Act) with the new CPPA (Consumer Privacy Protection Act). Bill C-27 still needs to undergo reviews by various committees and is not expected to come into force until after the summer.

The Office of the Privacy Commissioner enforces the Canadian federal privacy laws and counsels individuals on the protection of their personal data and their rights. Under the new bill, the Commissioner will be able to make recommendations about penalties to the Tribunal, along with other authorities.

If the Bill comes into force, the Data Protection Tribunal’s powers will be expanded. Its decisions will be binding and final, and may be enforced as if they were orders of a superior court. The Tribunal may also review the recommendations made by the Privacy Commissioner, but is not bound to follow them.

One other important innovation brought by Bill C-27 is the clarification of the concept of legitimate interest: it has been added as an exception to consent in cases where it outweighs potential adverse effects on the data subject.

Under the bill, all data concerning children are considered sensitive and must be treated as such by organizations and corporations. This means higher standards for handling such data and limits on the right to collect it.

The concepts of de-identification and anonymization have been aligned with global standards.

Finally, along with Bill C-27 the Government aims to introduce the new Artificial Intelligence and Data Act, creating a framework for high-impact AI systems. Its goals are to regulate international and interprovincial commerce in AI systems by introducing common requirements across Canada, and to prohibit conduct in relation to AI systems that may result in harm to individuals or their interests. The Act also provides a new working definition of an “AI system”.

Lastly, the Act provides for the creation of a new Artificial Intelligence and Data Commissioner within a ministry, who will support the enforcement of the Act across Canada.

Steps towards data protection law in India

17. June 2022

At present, there is no comprehensive data protection law in India. The relevant provisions are governed by several laws, regulations and court decisions, including the Information Technology Act 2000 and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011.

Following the recognition of privacy as a fundamental right under Article 21 of the Indian Constitution on August 24th, 2017, a Personal Data Protection Bill (PDPB) was formulated and introduced in the Lower House of the Parliament on December 11th, 2019. The PDPB was intended to constitute the first comprehensive data protection law in India.

The PDPB remained pending before the Parliament for a long time. On November 22nd, 2021, the Indian Joint Parliamentary Committee (JPC) responsible for reviewing the PDPB issued its report on the proposed law. At that point, the Indian Parliament was expected to table the JPC’s final report and consider the bill on December 21st, 2021, ahead of the end of its legislative session on December 23rd, 2021. Once passed by both houses of the Parliament and approved by the President, the PDPB was then to be enacted as legislation.

However, as recently became known, new regulations may soon be introduced to replace the proposed PDPB, which was scrapped in favor of a complete overhaul after its data localization and data mirroring requirements raised concerns among business stakeholders. In addition, the Indian Government is expected to commence work on a new law to replace the Information Technology Act 2000, which would entail new guidelines for data governance and cybersecurity as part of a ‘Digital India Act’.

This would be a major, and long overdue, step towards a modern data protection law that takes into account both economic interests and individual rights, as well as integrates into the progressive legal development worldwide.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont approved the Connecticut Privacy Act (“CTDPA”) concerning Personal Data Privacy and Online Monitoring. The passage of the CTDPA continues the trend in the U.S. for states to individually address consumer rights and business obligations relating to consumer data, in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Consumer Privacy Act (“CCPA”), Colorado Privacy Act (“CPA”), Virginia Consumer Data Protection Act (“VCDPA”) and Utah Consumer Privacy Act (“UCPA”). The Connecticut Privacy Act applies to “personal data”, which is defined as “any information that is linked or reasonably linkable to an identified or identifiable individual,” not including de-identified data or publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that conduct business in Connecticut, or that produce products or services targeted to Connecticut residents, and that during the preceding calendar year either

  • controlled or processed the personal data of at least 100,000 consumers (excluding personal data controlled or processed solely for the purpose of completing a payment transaction), or
  • controlled or processed the personal data of at least 25,000 consumers and derived more than 25% of their gross revenue from the sale of personal data.
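The combined nexus-and-threshold test above can be sketched as a small decision function. This is an illustrative sketch of the criteria as described here, not legal advice; the function and parameter names are hypothetical.

```python
def ctdpa_applies(
    does_business_in_ct: bool,
    targets_ct_residents: bool,
    consumers_processed: int,            # excluding payment-only processing
    revenue_share_from_data_sales: float,  # fraction of gross revenue, 0.0-1.0
) -> bool:
    """Return True if an entity falls within the CTDPA's scope
    under the criteria listed above (illustrative only)."""
    # The entity must have a Connecticut nexus ...
    nexus = does_business_in_ct or targets_ct_residents
    # ... and meet one of the two volume thresholds for the preceding year.
    meets_threshold = (
        consumers_processed >= 100_000
        or (consumers_processed >= 25_000
            and revenue_share_from_data_sales > 0.25)
    )
    return nexus and meets_threshold
```

For example, a business operating in Connecticut that processed data of 30,000 consumers and earned 30% of its gross revenue from data sales would be covered, while the same business with only 10% of revenue from data sales would not.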

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of the personal data to only the purpose of the collection (“what is adequate, relevant and reasonably necessary”) or as the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process personal data for purposes of targeted advertising, or sell personal data, without the consumer’s consent where the controller has actual knowledge that the consumer is between 13 and 16 years of age
  • obtain consent before processing sensitive data, including data of any individual under the age of 13, and follow the provisions of the Children’s Online Privacy Protection Act

The Connecticut Privacy Act is set to become effective on July 1, 2023. Violation of the CTDPA may result in an enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, there is a grace period for enforcement actions through December 31, 2024, during which the AG must provide organizations an opportunity to cure alleged violations.

Like other US data privacy laws, the CTDPA is not as comprehensive as the EU’s GDPR, but it aligns more closely with some of the GDPR’s definitions and, in particular, its consent mechanisms.

UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country could still be in compliance with the General Data Protection Regulation (GDPR).

Other than the statement itself, not much information was provided regarding the specific details. The accompanying briefing notes were more informative, setting out the main purposes of the Bill, namely to:

  • establish a new pro-growth and trusted data protection framework
  • reduce the burdens on business
  • create a world class data rights regime
  • support innovation
  • drive industry participation in schemes which give citizens and small businesses more control of their data, particularly in relation to health and social care
  • modernize the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the defined goals are rather superficial. Another concern is that the new bill could deviate too far from the GDPR. In that case, the new regime might not retain the adequacy status granted by the EU, which currently allows personal data to be exchanged freely between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers (‘Q&A’) on the two sets of Standard Contractual Clauses (‘SCCs’), on controllers and processors (‘the Controller-Processor SCCs’) and third-country data transfers (‘the Data Transfer SCCs’) respectively, as adopted by the European Commission on 4 June 2021. The Q&A are intended to provide practical guidance on the use of the SCCs. They are based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption. 

Specifically, 44 questions are addressed, including those related to contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause.  In addition, the Q&A contains a specific section dedicated to each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. Further to this point, the Q&A highlights that the Commission is in the process of developing an additional set of SCCs for this scenario, which will consider the requirements that already apply directly to those controllers and processors under the GDPR. 

In addition, the Q&A includes a section with questions on the obligations of data importers and exporters, specifically addressing the SCC liability scheme. Specifically, the Q&A states that other provisions in the broader (commercial) contract (e.g., specific rules for allocation of liability, caps on liability between the parties) may not contradict or undermine liability schemes of the SCCs. 

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

Twitter fined $150m for handing users’ contact details to advertisers

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was collected for security purposes such as two-factor authentication. However, beyond those purposes, Twitter also used the data to allow advertisers to target specific groups of users by matching phone numbers and email addresses with the advertisers’ own lists. 

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency. 

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU–U.S. and Swiss–U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of. 

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 

CJEU considers representative actions admissible

29. April 2022

According to a press release of the Court of Justice of the European Union (CJEU), consumer protection associations can bring legal proceedings against companies for data protection infringements.

This is the conclusion the Court reached in a decision on proceedings brought by the Federation of German Consumer Organisations (vzbv), which challenged Facebook’s data protection practices. Accordingly, a consumer protection association may bring legal proceedings against the person allegedly responsible for an infringement of the laws protecting personal data, in the absence of a mandate conferred on it for that purpose and independently of the infringement of specific rights of individual data subjects. The vzbv is entitled to bring such proceedings under the GDPR because it pursues an objective in the public interest.

Specifically, the case concerns third-party games on Facebook, in which users must agree to the use of their data in order to play. According to the association, Facebook did not inform the data subjects in a precise, transparent and understandable form about the use of the data, as prescribed by the General Data Protection Regulation (GDPR). The Federal Court of Justice of Germany (BGH) had already come to this conclusion in May 2020; however, it was not considered sufficiently clear whether the association was entitled to bring legal proceedings in this case.

The EU Advocate General had previously concluded, in a legally non-binding opinion, that the association can bring such proceedings.

The CJEU has now confirmed this view, so the BGH must issue a final decision in the case of vzbv v. Facebook. Importantly, the decision also opens the door for similar collective actions against other companies.

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date of approximately €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recorded audio of customer calls using artificial intelligence-based speech signal processing software, which evaluated each call against a list of keywords and assessed the caller’s emotional state. The software then ranked the calls, serving as a recommendation as to which customers should be called back as a priority.

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank, this procedure aimed at quality control, in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities in general terms only, and no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the Bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and perform assessments, the data protection impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that the legal basis of legitimate interest cannot serve as a “last resort” when all other legal bases are inapplicable, and therefore data controllers cannot rely on this legal basis at any time and for any reason. Consequently, the Authority not only imposed a record fine, but also required the bank to stop analyzing emotions in the context of speech analysis.

 

Google launches “Reject All” button on cookie banners

After being hit with a €150 million fine by France’s data protection authority CNIL earlier in the year for making the process of rejecting cookies unnecessarily confusing and convoluted for users, Google has added a new “Reject All” button to the cookie consent banners that have become ubiquitous on websites in Europe. Users visiting Search and YouTube in Europe while signed out or in incognito mode will soon see an updated cookie dialogue with “Reject All” and “Accept All” buttons.

Previously, users only had two options: “I accept” and “personalize.” While this allowed users to accept all cookies with a single click, they had to navigate through various menus and options if they wanted to reject all cookies. “This update, which began rolling out earlier this month on YouTube, will provide you with equal “Reject All” and “Accept All” buttons on the first screen in your preferred language,” wrote Google product manager Sammit Adhya in a blog post.

According to Google, it has kicked off the rollout of the new cookie banner in France and will soon extend the change to all Google users in Europe, the U.K., and Switzerland.

Google’s plan to include a “Reject All” button on cookie banners after its existing policy violated EU law was also welcomed by Hamburg’s Commissioner for Data Protection and Freedom of Information Thomas Fuchs during a presentation of his 2021 activity report.

But the introduction of the “Reject All” button is likely to be only an interim solution, because the US giant had already presented far-reaching plans at the end of January to phase out third-party cookies altogether by 2023.

Instead of cookies, the internet giant wants to rely on its in-house tracking technology, the Google Privacy Sandbox project.

Dutch DPA issues highest fine for GDPR violations

14. April 2022

On April 7th, 2022, the Dutch Data Protection Authority, Autoriteit Persoonsgegevens, imposed its highest-ever fine for data protection violations, amounting to €3.7 million. It is directed against the Minister of Finance, who was the data controller for the Tax and Customs Administration’s processing operations. The reason is years of unlawful processing of personal data in the Fraud Notification Facility application, a blacklist in which reports and suspected fraud cases were registered.

The investigation revealed several violations of principles and other requirements of the GDPR. Firstly, there was no legal basis for the processing of the personal data included in the list, making it unlawful under Art. 5 (1) (a) and Art. 6 (1) GDPR. Secondly, the pre-formulated purposes of collecting the personal data were not clearly defined and thus did not comply with the principle of purpose limitation stipulated in Art. 5 (1) (b) GDPR. Moreover, the personal data were often incorrect and not kept up to date, which constituted a violation of the principle of accuracy according to Art. 5 (1) (d) GDPR. Since the personal data were also kept longer than the applicable retention period allowed, they were not processed in accordance with the principle of storage limitation as laid down in Art. 5 (1) (e) GDPR. Furthermore, the security of the processing according to Art. 32 (1) GDPR was not ensured by appropriate technical and organizational measures. In addition, the internal Data Protection Officer was not involved properly and in a timely manner in the conduct of the Data Protection Impact Assessment pursuant to Art. 38 (1) and Art. 35 (2) GDPR.

The amount of the fine reflects the severity, consequences and duration of the violations. Through the Fraud Notification Facility, the rights of 270,000 people were violated over more than six years. They were often falsely registered as (possible) fraudsters, with serious consequences: many were left unable to obtain a payment arrangement or to qualify for debt restructuring, and therefore in financial insecurity. The Tax and Customs Administration also used discriminatory practices: employees were instructed to assess the risk of fraud based on, among other factors, people’s nationality and appearance.

The DPA also considered previous serious infringements in determining the amount of the fine. The Minister of Finance was penalized in 2018 for inadequate security of personal data, in 2020 for illegal use of the citizen service number in the VAT identification number of self-employed persons, and in 2021 for the discriminatory and illegal action in the childcare benefits scandal. Following the latter affair, the Fraud Notification Facility was shut down in February 2020.

The Minister of Finance can appeal the decision within six weeks.
