
Personal data risks in the aftermath of the overturning of Roe vs. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe v. Wade, thereby ending the federal constitutional right to abortion. The decision caused worldwide outrage, but it also presents a concerning situation: given the population's massive use of social media and the internet, authorities could commit serious violations of personal privacy. For example, tech giants such as Apple, Google and Meta Inc. could be made to share users' data if law enforcement suspects a crime has been committed. This could especially be the case in those states that have chosen to make abortion illegal after the Supreme Court's ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, this kind of personal data can be subpoenaed, compelling its holder to produce it in court. In such a scenario, tech companies would have no choice but to hand over the consumer's data, a clear and serious risk to consumer privacy.

In particular, location data could show whether a person visited an abortion clinic. Many women use dedicated apps to track periods, fertility and pregnancy. All of these data could be placed under surveillance and seized by law enforcement in order to investigate and prosecute abortion-related cases.

In some states this has already happened. In 2018, a woman in Mississippi was charged with second-degree murder after seeking medical care for a pregnancy loss that occurred at home. Prosecutors introduced her internet browser history as evidence. Two years later, she was acquitted of the charges.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse and analyze it, and sell it to the highest bidder. Law enforcement agencies could also use these companies to arbitrarily investigate people who might be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States. For example, there is no principle of data minimization as found in the GDPR. The Supreme Court's ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. New laws and bills could also be introduced to limit the access law enforcement agencies have to personal data.

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to introduce privacy rules into the antitrust and competition law framework. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data acquired through third-party installations (primarily cookies). The proceedings on the matter are still ongoing. More recently, the Irish Data Protection Authority also took a position against Facebook, moving to prevent the American tech giant from transferring user data to the United States over data safety concerns. That matter, too, is still being contested.

In 2014, Facebook notoriously purchased the messaging company WhatsApp for almost $22 billion. At the time, Europe did not give much thought to the potential consequences of this merger. The operation was the subject of a European Commission opinion: in the Commission's view, the two companies' privacy policies differed considerably, and the prospect of Facebook controlling all of the data collected by WhatsApp did not sit well with European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the European Commission's mistake in not considering the enormous importance personal data hold for these tech companies: because personal data are considered a form of so-called "nonprice competition", they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data it collects. This argument was advanced by the U.K.'s competition authority, which stated that, given the enormous importance personal data have gained in the digital market, companies such as Facebook do not face strong competition in their business.

These arguments, along with growing unrest among DPAs around the globe, led in 2020 to the well-known lawsuit against Facebook brought by the United States Federal Trade Commission. The FTC accused Meta Inc. (in particular, Facebook) of stifling competition in order to retain its monopoly over the digital market. A U.S. court dismissed the claims, but the high risks connected with large-scale data collection were nonetheless highlighted. Under Section 2 of the Sherman Act, the government must:

  • prove that a company in fact holds a monopoly, and
  • show that this harms consumers

This framework does not apply directly to the case, but the FTC argued that the harm to consumers lies in Meta Inc.'s lowering of privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the following months, policymakers and enforcers both in the United States and in the European Union have been struggling to establish new sets of rules to better regulate mergers between companies whose business model relies on the collection of personal data, and above all they called for more cooperation between privacy and antitrust agencies.

DPC sends draft decision on Meta’s EU-US data transfers to other European DPAs

14. July 2022

On July 7, 2022, it became known that the Irish Data Protection Commission (DPC) had forwarded a draft decision concerning Meta's EU-US data transfers to the other European DPAs for consultation. Within a four-week period, the European DPAs may comment on the draft or raise objections to it. In that event, the DPC would be given an additional month to respond to the objections raised (Article 60 GDPR).

According to information available to Politico, the DPC intends to halt Meta's EU-US transfers. In its own-volition draft decision, the DPC is said to have concluded that Meta can no longer rely on the SCCs when transferring its users' personal data to US-based servers. In other words, even though Meta has implemented the EU's SCCs, it cannot be ruled out that US intelligence services may gain access to the personal data of data subjects using Facebook, Instagram and other Meta products.

Following the striking down of both the Safe Harbour Agreement in 2015 and the EU-US Privacy Shield in 2020 by the Court of Justice of the European Union, this draft decision seems to call into question the legality and GDPR compatibility of EU-US data transfers for a third time.

In this context it is worth considering a statement Meta made in its annual report to the United States Securities and Exchange Commission (SEC):

“If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe, which would materially and adversely affect our business, financial condition, and results of operations.”

Despite the possibility of a halt to Meta's EU-US data transfers, there is reason to believe that this DPC-initiated procedure will continue well beyond the four-week timeline mentioned above. "We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC's draft. This will lead to another draft and then a vote", says NOYB's Max Schrems, who filed the original complaint with the DPC. Hence, an immediate stop to EU-US transfers seems rather unlikely. Instead, we could expect Article 65 GDPR to be triggered, meaning that the EDPB would be required to issue a final decision, including a vote, on the matter.

With no concrete EU-US transfer agreement in sight and ongoing uncertainty over whether the DPC will eventually succeed with its draft decision, this matter continues to be of great interest.

U.S. lawmakers unveil bipartisan Data Privacy and Protection Act

30. June 2022

In early June, three of the four chairs of the U.S. congressional committees responsible for data privacy released a draft American Data Privacy and Protection Act (ADPPA) for consideration. If passed, it would override certain recently enacted privacy laws in some U.S. states.

The draft includes elements of the California Consumer Privacy Act and the European General Data Protection Regulation.

States led the way

Until now, data protection in the United States has primarily been at the top of the agenda at the state level. California, Colorado, Connecticut, Virginia and Utah have recently enacted comprehensive data privacy laws. This year alone, more than 100 privacy bills have already been introduced in the states. Although not all of these were adopted, the proliferation of state laws and their varying regulatory requirements has led to increasing calls for the adoption of a federal privacy law. A unified federal law, if passed, would provide much-needed clarity to entities and businesses and, ideally, would also stem the tide of class action and other privacy lawsuits brought under various state laws.

Affected Entities

The ADPPA broadly applies (with exceptions) to organizations operating in the United States that collect, process, or transfer personal information and fall into one of the following categories:

  • Entities subject to the Federal Trade Commission Act
  • Nonprofit organizations
  • So-called common carriers, subject to Title II of the Communications Act of 1934

Requirements of the ADPPA (not final)

  • Limit data collection and processing to that which is reasonably necessary
  • Compliance with public and internal privacy regulations
  • Granting consumer rights such as access, correction, and deletion
  • Appeal options
  • Obtaining consent before collecting or processing sensitive data, e.g. geolocation, genetic and biometric information, and browsing history
  • Appointment of a data protection officer
  • Providing evidence that adequate safeguards are in place
  • Registration of data brokers with the Federal Trade Commission (FTC)
  • FTC will establish and maintain a searchable, centralized online public registry of all registered data brokers, as well as a “Do Not Collect” registry that will allow individuals to request that all data brokers delete their data within 30 days
  • Entities shall not collect, process, or transfer collected data in a manner that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, or disability
  • Implement appropriate administrative, technical, and physical data security practices and procedures to protect covered data from unauthorized access and disclosure

Outcome still uncertain

Shortly after a draft of the ADPPA was released, privacy organizations, civil liberties groups, and businesses spoke out, taking sides for and against the law.

As the legislative session draws to a close, the prospects for ADPPA’s adoption remain uncertain. Strong disagreement remains among key stakeholders on important aspects of the proposed legislation. However, there is consensus that the United States is in dire need of a federal privacy law. Thus, passage of such legislation is quite likely in the foreseeable future.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont signed the Connecticut Data Privacy Act (“CTDPA”), an act concerning personal data privacy and online monitoring. The passage of the CTDPA continues the U.S. trend of states individually addressing consumer rights and business obligations relating to consumer data in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Privacy Rights Act (“CPRA”), the Colorado Privacy Act (“CPA”), the Virginia Consumer Data Protection Act (“VCDPA”) and the Utah Consumer Privacy Act (“UCPA”). The Connecticut Privacy Act applies to “personal data”, which is defined as “any information that is linked or reasonably linkable to an identified or identifiable individual,” not including de-identified data or publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that

  • conduct business in Connecticut.
  • produce products or services that are targeted to Connecticut residents.
  • during the preceding calendar year, either controlled or processed the personal data of at least 100,000 consumers (excluding data controlled or processed solely to complete a payment transaction), or controlled or processed the personal data of at least 25,000 consumers while deriving more than 25% of their gross revenue from the sale of personal data.
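The applicability thresholds above can be sketched as a simple check. This is an illustrative sketch only, not legal advice: the function and field names are hypothetical, and the threshold values are taken from the text above.

```python
# Illustrative sketch of the CTDPA applicability thresholds described above.
# Not legal advice; names and structure are hypothetical.
from dataclasses import dataclass


@dataclass
class BusinessProfile:
    operates_in_ct: bool          # conducts business in CT or targets CT residents
    consumers_processed: int      # CT consumers whose data was controlled/processed last year
    payment_only_consumers: int   # consumers processed solely to complete payment transactions
    revenue_share_from_data_sales: float  # fraction of gross revenue from selling personal data


def ctdpa_may_apply(p: BusinessProfile) -> bool:
    if not p.operates_in_ct:
        return False
    # Consumers processed solely for payment transactions are excluded from the count.
    countable = p.consumers_processed - p.payment_only_consumers
    large_scale = countable >= 100_000
    data_sale_model = countable >= 25_000 and p.revenue_share_from_data_sales > 0.25
    return large_scale or data_sale_model


# Example: 30,000 countable consumers with 40% of revenue from data sales
print(ctdpa_may_apply(BusinessProfile(True, 30_000, 0, 0.40)))  # True
```

A real applicability analysis would of course turn on the statute's definitions (e.g. who counts as a "consumer"), but the sketch shows how the two alternative thresholds combine.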

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of the personal data to only the purpose of the collection (“what is adequate, relevant and reasonably necessary”) or as the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process a consumer’s personal data for purposes of targeted advertising where the consumer has opted out
  • obtain consent before processing sensitive data, including data of any individual under the age of 13, and follow the provisions of the Children’s Online Privacy Protection Act

The Connecticut Privacy Act is set to take effect on July 1, 2023. Violations of the CTDPA may result in an enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, there is a grace period until December 31, 2024, during which the AG must give organizations an opportunity to cure alleged violations before bringing an enforcement action.

Like other US data privacy laws, the Connecticut law is not as comprehensive as the EU’s GDPR, but it aligns more closely with some of the GDPR’s definitions and, especially, its consent mechanisms.

UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country could still be in compliance with the General Data Protection Regulation (GDPR).

Other than the statement itself, not much detail was provided; the accompanying briefing notes offered more information. They set out the main purposes of the Bill, namely to:

  • establish a new pro-growth and trusted data protection framework
  • reduce the burdens on business
  • create a world-class data rights regime
  • support innovation
  • drive industry participation in schemes that give citizens and small businesses more control of their data, particularly in relation to health and social care
  • modernize the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the stated goals remain rather superficial. Another concern is that the new bill could deviate too far from the GDPR: the new regime might not retain its adequacy status with the EU, which currently allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

Twitter fined $150m for handing users’ contact details to advertisers

30. May 2022

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the violations occurred between May 2013 and September 2019, and the information was collected for security purposes such as two-factor authentication. However, beyond those purposes, Twitter used the data to let advertisers target specific groups of users by matching phone numbers and email addresses against advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 


Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date, approximately €670,000 (HUF 250 million).

The case involved the processing of personal data by a bank acting as a data controller. Using artificial intelligence-based speech signal processing software, the bank automatically analyzed recorded audio of customer calls. Based on a list of keywords and the caller’s detected emotional state, the software ranked the calls and recommended which customers should be called back as a priority.

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank this procedure aimed at quality control, in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities in general terms only, and no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice only indicated quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the Bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and perform assessments, the data protection impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that the legal basis of legitimate interest cannot serve as a “last resort” when all other legal bases are inapplicable, and therefore data controllers cannot rely on this legal basis at any time and for any reason. Consequently, the Authority not only imposed a record fine, but also required the bank to stop analyzing emotions in the context of speech analysis.


Google launches “Reject All” button on cookie banners

After being hit earlier this year with a €150 million fine by France’s data protection agency CNIL for making the process of rejecting cookies unnecessarily confusing and convoluted for users, Google has added a new “Reject All” button to the cookie consent banners that have become ubiquitous on websites in Europe. Users visiting Search and YouTube in Europe while signed out or in incognito mode will soon see an updated cookie dialogue with “Reject All” and “Accept All” buttons.

Previously, users only had two options: “I accept” and “personalize.” While this allowed users to accept all cookies with a single click, they had to navigate through various menus and options if they wanted to reject all cookies. “This update, which began rolling out earlier this month on YouTube, will provide you with equal “Reject All” and “Accept All” buttons on the first screen in your preferred language,” wrote Google product manager Sammit Adhya in a blog post.

According to Google, the rollout of the new cookie banner has begun in France and will soon be extended to all Google users in Europe, the U.K. and Switzerland.

Google’s plan to include a “Reject All” button on cookie banners, after its existing practice was found to violate EU law, was also welcomed by Hamburg’s Commissioner for Data Protection and Freedom of Information, Thomas Fuchs, during the presentation of his 2021 activity report.

But the introduction of the “Reject All” button is likely to be only an interim solution, because at the end of January the US giant presented far-reaching plans to phase out third-party cookies altogether by 2023.

Instead of cookies, the internet giant wants to rely on in-house tracking technology developed under its Privacy Sandbox project.

UK’s new data protection clauses now in force

31. March 2022

After the British government announced reforms to the UK’s data protection system last year, the Secretary of State submitted a framework to Parliament on February 2nd, 2022, to regulate international data transfers and replace the EU Standard Contractual Clauses (SCC). As no objections were raised and Parliament approved the documents, they entered into force on March 21st, 2022.

The set of rules consists of the International Data Transfer Agreement (IDTA), the International Data Transfer Addendum to the European Commission’s SCC for international data transfers (Addendum) and a Transitional Provisions document. The transfer rules are issued under Section 119A of the Data Protection Act 2018 and take into account the binding judgement of the European Court of Justice in the case commonly referred to as “Schrems II”.

The documents serve as a new tool for compliance with Art. 46 UK GDPR for data transfers to third countries and broadly mirror the rules of the EU GDPR. The UK government also retained the ability to issue its own adequacy decisions regarding data transfers to other third countries and international organizations.

The transfer rules are of immediate benefit to organizations transferring personal data outside the UK. In addition, the transitional provisions allow organizations to rely on the EU SCC until March 21st, 2024, for contracts entered into up to and including September 21st, 2022. However, this is subject to the condition that the data processing activities remain unchanged and that the clauses ensure adequate safeguards.
