
Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to address privacy within antitrust and competition law. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data acquired through third-party installations (above all cookies). The proceedings in that matter are still ongoing. More recently, the Irish Data Protection Commission also took a position against Facebook, moving to block the American tech giant from transferring user data to the United States over data protection concerns. That dispute, too, remains unresolved.

In 2014, Facebook notoriously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe gave little thought to the potential consequences of the merger. The operation was the subject of a European Commission opinion: in the Commission's view, the two companies' privacy policies differed considerably, and the fact that Facebook now controlled all of the data collected by WhatsApp did not sit well with European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the mistake the European Commission had made in underestimating how much personal data mean to these tech companies: because personal data are considered a form of so-called "non-price competition", they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data it collects. This argument was also advanced by the U.K.'s Competition and Markets Authority, which observed that, given the enormous importance personal data have gained in the digital market, companies such as Facebook face little strong competition in their business.

These arguments, and the growing unrest among DPAs around the globe, led in 2020 to the well-known investigation of Facebook by the U.S. Federal Trade Commission (FTC). The FTC accused Meta Inc. (and Facebook in particular) of stifling competition in order to retain its monopoly over the digital market. An American court dismissed the claims, but at the same time the high risks connected with data collection on this scale were highlighted. Under Section 2 of the Sherman Act, the government must prove:

  • that the company is in fact a monopoly, and
  • that its conduct harms consumers.

Neither requirement maps directly onto the case, but the FTC argued that the harm to consumers lies in Meta Inc.'s lowered privacy standards. The case is still pending as of July 2022.

The merger showed how strongly privacy and antitrust issues overlap in the digitalized market.

In the months that followed, policymakers and enforcers in both the United States and the European Union have struggled to establish new rules to better regulate mergers between companies whose business models rely on the collection of personal data; above all, they have called for closer cooperation between privacy and antitrust agencies.

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

7. July 2022

Over the last decades, AI has developed impressively across various fields. At the same time, with each step forward, the new systems and the processes they are programmed to perform need to collect ever more data in order to function properly.

One of the first questions that comes to mind is how the rise of AI can be reconciled with the principle of data minimization contained in Art. 5 para. 1 lit. c) GDPR. At first glance the two seem irreconcilable: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Parliament suggests that, given the latitude the provision concedes (in particular through its exceptions), the issue could be addressed by measures such as pseudonymization: the data collected by the AI system is stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
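
To make the concept concrete, here is a minimal sketch of one common pseudonymization technique, keyed hashing of direct identifiers. It is an illustration only: the field names and key handling are hypothetical, and neither the study nor the GDPR prescribes this particular mechanism.

    import hmac
    import hashlib

    # The key is the "additional information": it must be stored separately
    # from the dataset and under separate access controls (hypothetical value).
    SECRET_KEY = b"kept-apart-from-the-training-data"

    def pseudonymize(record: dict, identifiers=("name", "email")) -> dict:
        """Replace direct identifiers with keyed HMAC-SHA256 tokens.

        Without the separately stored key, the tokens cannot be traced back
        to a specific individual, while the rest of the record remains
        usable for analysis or model training.
        """
        out = dict(record)
        for field in identifiers:
            if field in out:
                digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
                out[field] = digest.hexdigest()
        return out

    # The AI pipeline only ever sees the pseudonymized record:
    print(pseudonymize({"name": "Jane Doe", "email": "jane@example.com", "age_group": "30-39"}))

Note that data pseudonymized this way still count as personal data under the GDPR, since whoever holds the key can re-identify individuals; only full anonymization takes data out of the GDPR's scope.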

The main issue with the European Union's current legal framework for personal data protection is that certain parts have been left vague, which also creates uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act ("AIA"), aiming to create a common and more "approachable" legal framework.

One of the main features of the Act is that it divides applications of artificial intelligence into three main risk categories:

  1. Unacceptable risk: such AI systems are prohibited (e.g. systems that violate fundamental rights).
  2. High risk: subject to specific regulation.
  3. Low or minimal risk: no further regulation.

Regarding high-risk AI systems, the AIA foresees post-market monitoring obligations. If a system violates the AIA, the regulator can force its withdrawal from the market.

This approach was welcomed in the Joint Opinion of the EDPB and the EDPS, although the two bodies stated that the draft still needs to be aligned more closely with the GDPR.

Although the Commission's draft contains a precise description of the first two categories, these will likely change over the coming years as the proposal moves through the EU legislative process.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Some amendments have already been formulated, and the draft remains under review by the Parliament. Once the Act passes scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act's implementation? The draft foresees that national supervisory authorities shall be established in each EU member state. Furthermore, the AIA aims to establish a special European AI Board made up of representatives of the member states and of the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.

U.S. lawmakers unveil bipartisan Data Privacy and Protection Act

30. June 2022

In early June, three of the four chairmen of the U.S. congressional committees responsible for data privacy submitted a draft American Data Privacy and Protection Act (ADPPA) for consideration. If passed, it would preempt certain recently enacted privacy laws in some U.S. states.

The draft includes elements of the California Consumer Privacy Act and the European General Data Protection Regulation.

States led the way

Until now, data protection in the United States has primarily been driven at the state level. California, Colorado, Connecticut, Virginia and Utah have recently enacted comprehensive data privacy laws, and this year alone more than 100 privacy bills have been introduced in the states. Although not all of these were adopted, the proliferation of state laws and their varying regulatory requirements has led to increasing calls for a federal privacy law. A unified federal law, if passed, would provide much-needed clarity to entities and businesses and, ideally, would also stem the tide of class actions and other privacy lawsuits brought under various state laws.

Affected Entities

The ADPPA broadly applies (with exceptions) to organizations operating in the United States that collect, process, or transfer personal information and fall into one of the following categories:

  • Entities subject to the Federal Trade Commission Act
  • Nonprofit organizations
  • So-called common carriers subject to Title II of the Communications Act of 1934

Requirements of the ADPPA (not final)

  • Limiting data collection and processing to what is reasonably necessary
  • Complying with public and internal privacy policies
  • Granting consumer rights such as access, correction, and deletion
  • Providing consumers with appeal mechanisms
  • Obtaining consent before collecting or processing sensitive data, e.g. geolocation, genetic and biometric information, and browsing history
  • Appointing a data protection officer
  • Providing evidence that adequate safeguards are in place
  • Registering data brokers with the Federal Trade Commission (FTC), which will establish and maintain a searchable, centralized online public registry of all registered data brokers, as well as a "Do Not Collect" registry allowing individuals to request that all data brokers delete their data within 30 days
  • Not collecting, processing, or transferring covered data in a manner that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, or disability
  • Implementing appropriate administrative, technical, and physical data security practices and procedures to protect covered data from unauthorized access and disclosure

Outcome still uncertain

Shortly after a draft of the ADPPA was released, privacy organizations, civil liberties groups, and businesses spoke out, taking sides for and against the law.

As the legislative session draws to a close, the prospects for ADPPA’s adoption remain uncertain. Strong disagreement remains among key stakeholders on important aspects of the proposed legislation. However, there is consensus that the United States is in dire need of a federal privacy law. Thus, passage of such legislation is quite likely in the foreseeable future.

Thailand’s Personal Data Protection Act enters into force

29. June 2022

On June 1, 2022, Thailand's Personal Data Protection Act (PDPA) entered into force, three years after its enactment in May 2019. Due to the COVID-19 pandemic, the Thai government had issued royal decrees extending the compliance deadline to June 1, 2022.

The PDPA is largely based on the EU General Data Protection Regulation (GDPR). In particular, it also requires data controllers and processors to have a valid legal basis for processing personal data (i.e., data that can identify living natural persons directly or indirectly). Where such personal data is sensitive personal data (e.g. health data, biometric data, race, religion, sexual preference and criminal record), data controllers and processors must ensure that data subjects give explicit consent for any collection, use or disclosure of such data. Exemptions are granted for public interest, contractual obligations, vital interests or compliance with the law.

The PDPA also grants data subjects specific rights, very similar to those under the GDPR: the rights to be informed, to access, rectify and update data, to restrict and object to processing, and to data erasure and portability.

One major difference from the GDPR is that, while breaches of PDPA obligations carry fines, certain data breaches involving sensitive personal data and unlawful disclosure also carry criminal penalties, including imprisonment of up to one year.

Just like the GDPR, the PDPA applies both to entities in Thailand and to entities abroad that process personal data for the provision of products and/or services within Thai borders.

Just as we have seen with the GDPR, it will be important to observe how the PDPA evolves as it becomes incorporated into Thai companies' compliance practices.

Canada's new privacy bill: Bill C-27

On June 16, 2022, the Canadian federal government introduced a new privacy bill, Bill C-27 (a reworking of Bill C-11). Among its main goals are strengthening the role of the Privacy Commissioner and establishing a special Data Protection Tribunal. It also proposes new regulations for artificial intelligence. If passed, the act would replace Part 1 of the current PIPEDA (Personal Information Protection and Electronic Documents Act) with the new CPPA (Consumer Privacy Protection Act). Bill C-27 still needs to undergo review by various committees and is not expected to come into force until after the summer.

The Office of the Privacy Commissioner enforces Canada's federal privacy laws and provides guidance to individuals regarding the protection of their personal data and their rights. Under the new bill, the Commissioner will, among other powers, be able to recommend penalties to the Tribunal.

If the bill comes into force, the Data Protection Tribunal's powers will be expanded: its decisions will be binding and final, and may be enforced as if they were orders of a superior court. The Tribunal may also review recommendations made by the Privacy Commissioner, but is not bound to follow them.

Another important innovation brought by Bill C-27 is the clarification of the concept of legitimate interest: it has been added as an exception to consent, applicable where the legitimate interest outweighs any potential adverse effects on the data subject.

All data regarding children are now considered sensitive and must be treated as such by organizations and corporations. This means higher standards for handling such data and limits on the right to collect it.

The concepts of de-identification and anonymization have been adapted to global standards.

Finally, alongside Bill C-27 the government aims to introduce the new Artificial Intelligence and Data Act, creating a framework for high-impact AI systems. Its goals are to regulate international and interprovincial trade in AI systems by introducing common requirements across Canada, and to prohibit conduct in relation to AI systems that may result in harm to individuals or their interests. The act also provides a new working definition of "AI system".

Lastly, the act provides for the creation of a new Artificial Intelligence and Data Commissioner within a ministry, who will support enforcement of the act across Canada.

Steps towards data protection law in India

17. June 2022

At present, there is no comprehensive data protection law in India. Relevant provisions are scattered across several laws, regulations and court decisions, including the Information Technology Act 2000 and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011.

Following the Indian Supreme Court's recognition of privacy as a fundamental right under Article 21 of the Indian Constitution on August 24th, 2017, a Personal Data Protection Bill (PDPB) was formulated and introduced in the Lower House of Parliament on December 11th, 2019. The PDPB was intended to constitute the first comprehensive data protection law in India.

The PDPB remained pending before Parliament for a long time. On November 22nd, 2021, the Joint Parliamentary Committee (JPC) responsible for reviewing the PDPB issued its report on the proposed law. At the time, the Indian Parliament was expected to table the JPC's final report and consider the bill on December 21st, 2021, ahead of the end of its legislative session on December 23rd, 2021. Once passed by both houses of Parliament and approved by the President, the PDPB would then have been enacted as legislation.

However, as has recently become known, new regulations may soon be introduced to replace the proposed PDPB, which was scrapped in favor of a complete overhaul after its data localization and data mirroring requirements raised concerns among business stakeholders. In addition, the Indian government is expected to begin work on a new law to replace the Information Technology Act 2000, which would entail new guidelines for data governance and cybersecurity as part of a "Digital India Act".

This would be a major, and long overdue, step towards a modern data protection law that takes into account both economic interests and individual rights and aligns with progressive legal developments worldwide.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont signed the Connecticut Data Privacy Act ("CTDPA"), an act concerning personal data privacy and online monitoring. The passage of the CTDPA continues the trend of U.S. states individually addressing consumer rights and business obligations relating to consumer data in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Privacy Rights Act ("CPRA"), the Colorado Privacy Act ("CPA"), the Virginia Consumer Data Protection Act ("VCDPA") and the Utah Consumer Privacy Act ("UCPA"). The CTDPA applies to "personal data", defined as "any information that is linked or reasonably linkable to an identified or identifiable individual," not including de-identified data or publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that conduct business in Connecticut or produce products or services targeted to Connecticut residents, and that during the preceding calendar year met one of the following thresholds (a short sketch of how these criteria combine follows the list):

  • controlled or processed the personal data of at least 100,000 consumers (excluding personal data controlled or processed solely for completing a payment transaction), or
  • controlled or processed the personal data of at least 25,000 consumers and derived more than 25% of their gross revenue from the sale of personal data.
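
As a reading aid only (the function name, parameters and simplifications below are ours, not the statute's, and the statutory text controls), the applicability criteria combine like this:

    def ctdpa_applies(consumers_processed: int,
                      revenue_share_from_data_sales: float,
                      does_business_in_ct: bool,
                      targets_ct_residents: bool) -> bool:
        """Hypothetical sketch of the CTDPA applicability thresholds.

        Assumes consumers_processed already excludes records processed
        solely for completing payment transactions.
        """
        # Threshold tests only matter for entities with a Connecticut nexus.
        if not (does_business_in_ct or targets_ct_residents):
            return False
        meets_volume_test = consumers_processed >= 100_000
        meets_mixed_test = (consumers_processed >= 25_000
                            and revenue_share_from_data_sales > 0.25)
        return meets_volume_test or meets_mixed_test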

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of personal data to the purposes of the collection ("what is adequate, relevant and reasonably necessary") or to what the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process a consumer's personal data for purposes of targeted advertising, or sell it, without consent where the controller knows the consumer is at least 13 but under 16 years of age
  • obtain consent before processing sensitive data, including any data of individuals under the age of 13, and handle such children's data in accordance with the Children's Online Privacy Protection Act

The CTDPA is set to become effective on July 1, 2023. Violations may result in enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, there is a grace period until December 31, 2024, during which the AG must give organizations an opportunity to cure alleged violations before bringing an enforcement action.

Like other U.S. data privacy laws, the Connecticut law is not as comprehensive as the EU's GDPR, but it aligns more closely with some of the GDPR's definitions and, especially, its consent mechanisms.

UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Digital, Culture, Media and Sport (DCMS) published a consultation document entitled "Data: a new direction", requesting opinions on proposals that could bring changes to the UK's data protection regime. On May 10, 2022, as part of the Queen's Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country will remain aligned with the EU's General Data Protection Regulation (GDPR).

Other than the statement itself, not much detail was provided; the accompanying briefing notes offered more information. They set out the main purposes of the Bill, namely to:

  • establish a new pro-growth and trusted data protection framework
  • reduce burdens on businesses
  • create a world-class data rights regime
  • support innovation
  • drive industry participation in schemes that give citizens and small businesses more control over their data, particularly in relation to health and social care
  • modernize the Information Commissioner's Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the stated goals remain rather superficial. Another concern is that the new bill could deviate too far from the GDPR: the new regime might then fail to retain the UK's adequacy status with the EU, which allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would "improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies." So far, no timetable for the adoption of the new law has been published.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers ("Q&A") on the two sets of Standard Contractual Clauses ("SCCs") it adopted on 4 June 2021: one for controllers and processors ("the Controller-Processor SCCs") and one for third-country data transfers ("the Data Transfer SCCs"). The Q&A is intended to provide practical guidance on the use of the SCCs and is based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption.

Specifically, 44 questions are addressed, including those related to contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause.  In addition, the Q&A contains a specific section dedicated to each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. Further to this point, the Q&A highlights that the Commission is in the process of developing an additional set of SCCs for this scenario, which will consider the requirements that already apply directly to those controllers and processors under the GDPR. 

In addition, the Q&A includes a section on the obligations of data importers and exporters, addressing in particular the SCCs' liability scheme. It states that other provisions in the broader (commercial) contract (e.g., specific rules for allocation of liability, caps on liability between the parties) may not contradict or undermine the liability scheme of the SCCs.

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

Twitter fined $150m for handing users’ contact details to advertisers

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was ostensibly collected for purposes such as two-factor authentication. However, in addition to those purposes, Twitter used the data to let advertisers target specific groups of users by matching phone numbers and email addresses against advertisers' own lists.

In addition to the monetary penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false statements violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 
