Category: USA

American company ordered to pay 75,000 euros to wrongfully terminated employee

12. October 2022

A few days ago a Dutch court ordered a Florida-based company to pay 75,000 euros in compensation to an employee. The employee had been fired because he refused to keep his work computer’s camera on for the entire workday, as required by the company, out of concern that this was an invasion of his privacy.

After he was fired, he took his former employer to court, suing for wrongful termination. The judges sided with him, holding that the American company’s rule violated both the employee’s privacy and data protection laws. The worker had already raised the issue with his employer, pointing out that the company could already see his shared screen while he was working and that keeping the camera on was therefore unnecessary.

Rather than a matter of personal data protection, this was a matter of the employee’s right to privacy as laid down in Article 8 of the European Convention on Human Rights: the court held that the company’s demand was disproportionate and an intrusion into the worker’s private life.

According to Dutch law, an appeal is possible for the company within three months of the ruling. In the aftermath of the ruling, the company shut down its offices in Rijswijk, Netherlands, where the plaintiff worked.

Personal data risks in the aftermath of the overturning of Roe vs. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe v. Wade, thereby ending federal abortion rights. The decision caused worldwide outrage, but a further concerning situation now presents itself: the population’s massive use of social media and the Internet could result in serious privacy violations by the authorities. For example, tech giants such as Apple, Google and Meta Inc. could be made to share users’ data if law enforcement authorities suspect a felony is being committed. This could especially be the case in those states that chose to make abortion illegal after the Supreme Court’s ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, this kind of personal data can be made the object of a subpoena, forcing the recipient to produce it in court. In such a scenario, tech companies would have no choice but to hand over the consumer’s data. This clearly poses a high risk to consumers’ privacy.

In particular, location data could show whether a person visited an abortion clinic. Many women use specific apps to track periods, fertility and pregnancy. All of this data could be put under surveillance and seized by law enforcement to investigate and prosecute abortion-related cases.

In some states this has already happened. In 2018, a woman in Mississippi was charged with second-degree murder after seeking health care for a pregnancy loss that occurred at home. Prosecutors introduced her Internet browser history as evidence. Two years later she was acquitted of the charges.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse or analyze it, and sell it to the highest bidder. These companies could also be used by law enforcement agencies to arbitrarily investigate people who might be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States. For example, there is no principle of data minimization like the one found in the GDPR. The Supreme Court’s ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. In addition, new laws and bills could be introduced to limit the access law enforcement agencies have to personal data.

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to introduce privacy laws in the context of antitrust and competition law. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data acquired through third-party installations (above all via cookies). The proceedings on the matter are still ongoing. More recently, the Irish Data Protection Authority also took a position against Facebook, seeking to prevent the American tech giant from transferring user data to the United States over data safety concerns. This matter, too, is still being contested.

In 2014 Facebook notoriously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe gave little thought to the potential consequences of this merger. The operation was the subject of a European Commission opinion: in the Commission’s view, the two companies’ privacy policies differed markedly, and the fact that Facebook now had control over all the data collected by WhatsApp did not sit well with the European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the European Commission’s mistake in not considering the enormous significance personal data hold for these tech companies: because personal data are regarded as a form of so-called “nonprice competition”, they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it typically lowers its privacy protection standards and increases the amount of data it collects. This argument was brought forward by the U.K.’s competition authority, which stated that, given the enormous importance personal data have gained in the digital market, companies such as Facebook face little strong competition in their business.

These arguments and the growing unrest among various DPAs around the globe led in 2020 to the well-known investigation of Facebook by the United States Federal Trade Commission. In particular, the FTC accused Meta Inc. (specifically Facebook) of stifling its competition in order to retain its monopoly over the digital market. An American court dismissed the claims, but at the same time the high risks connected with such enormous data collection were highlighted. Under Section 2 of the Sherman Act, the State has:

  • to prove that a company in fact holds a monopoly, and
  • to prove that this monopoly harms consumers

This does not apply directly to the case, but the FTC argued that the harm to consumers lies in Meta Inc.’s lowered privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the following months, policymakers and enforcers in both the United States and the European Union have struggled to establish new rules to better regulate mergers between companies whose business models rely on the collection of personal data, above all calling for more cooperation between privacy and antitrust agencies.

DPC sends draft decision on Meta’s EU-US data transfers to other European DPAs

14. July 2022

On July 7, 2022, it became known that the Irish Data Protection Commission (DPC) had forwarded a draft decision concerning Meta’s EU-US data transfers to other European DPAs for consultation. Within a four-week period, European DPAs may comment on this draft or raise objections to it. In that event, the DPC would be given an additional month to respond to the objections raised (Article 60 GDPR).

According to information available to Politico, the DPC intends to halt Meta’s EU-US transfers. In its own-volition draft decision, the DPC is said to have concluded that Meta can no longer rely on the SCCs when it transfers its users’ personal data to US-based servers. In other words, even though Meta has implemented the EU’s SCCs, it cannot be ruled out that US intelligence services may gain access to the personal data of data subjects using Facebook, Instagram and other Meta products.

Following the striking down of both the Safe Harbour Agreement in 2015 and the EU-US Privacy Shield in 2020 by the Court of Justice of the European Union, this draft decision seems to question the legality and GDPR compatibility of EU-US data transfers for a third time.

In this context it is worth considering a statement Meta made in its annual report to the United States Securities and Exchange Commission (SEC):

“If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe, which would materially and adversely affect our business, financial condition, and results of operations.”

Despite the possibility of a halt to Meta’s EU-US data transfers, there is reason to believe that this DPC-initiated procedure will continue and extend beyond the aforementioned four-week timeline. “We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC’s draft. This will lead to another draft and then a vote”, says NOYB’s Max Schrems, who filed the original complaint with the DPC. Hence, an immediate stop of Meta’s EU-US transfers seems rather unlikely. Instead, we can expect Article 65 GDPR to be triggered, meaning that the EDPB would be required to issue a final decision, including a vote, on the matter.

With no concrete EU-US transfer agreement in sight and ongoing uncertainty as to whether the DPC will ultimately succeed with its draft decision, this matter remains one of great interest.

U.S. lawmakers unveil bipartisan Data Privacy and Protection Act

30. June 2022

In early June, three of the four leaders of the U.S. congressional committees responsible for data privacy released a draft American Data Privacy and Protection Act (ADPPA) for consideration. If passed, it would override certain recently enacted privacy laws in some U.S. states.

The draft includes elements of the California Consumer Privacy Act and the European General Data Protection Regulation.

States led the way

Until now, data protection in the United States has primarily been driven at the state level. California, Colorado, Connecticut, Virginia and Utah have recently enacted comprehensive data privacy laws. This year alone, more than 100 privacy bills have been introduced in the states. Although not all of them were adopted, the proliferation of state laws and their varying regulatory requirements has led to increasing calls for a federal privacy law. A unified federal law, if passed, would provide much-needed clarity to entities and businesses and, ideally, would also stem the tide of class actions and other privacy lawsuits brought under various state laws.

Affected Entities

The ADPPA broadly applies (with exceptions) to organizations operating in the United States that collect, process, or transfer personal information and fall into one of the following categories:

  • entities subject to the Federal Trade Commission Act
  • nonprofit organizations
  • so-called common carriers subject to Title II of the Communications Act of 1934

Requirements of the ADPPA (not final)

  • Limit data collection and processing to that which is reasonably necessary
  • Compliance with public and internal privacy regulations
  • Granting consumer rights such as access, correction, and deletion
  • Appeal options
  • Obtaining consent before collecting or processing sensitive data, e.g. geolocation, genetic and biometric information, and browsing history
  • Appointment of a data protection officer
  • Providing evidence that adequate safeguards are in place
  • Registration of data brokers with the Federal Trade Commission (FTC)
  • FTC will establish and maintain a searchable, centralized online public registry of all registered data brokers, as well as a “Do Not Collect” registry that will allow individuals to request that all data brokers delete their data within 30 days
  • Entities shall not collect, process, or transfer collected data in a manner that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, or disability
  • Implement appropriate administrative, technical, and physical data security practices and procedures to protect covered data from unauthorized access and disclosure

Outcome still uncertain

Shortly after a draft of the ADPPA was released, privacy organizations, civil liberties groups, and businesses spoke out, taking sides for and against the law.

As the legislative session draws to a close, the prospects for ADPPA’s adoption remain uncertain. Strong disagreement remains among key stakeholders on important aspects of the proposed legislation. However, there is consensus that the United States is in dire need of a federal privacy law. Thus, passage of such legislation is quite likely in the foreseeable future.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont signed the Connecticut Act Concerning Personal Data Privacy and Online Monitoring (“CTDPA”). The passage of the CTDPA continues the trend of U.S. states individually addressing consumer rights and business obligations relating to consumer data in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Privacy Rights Act (“CPRA”), the Colorado Privacy Act (“CPA”), the Virginia Consumer Data Protection Act (“VCDPA”) and the Utah Consumer Privacy Act (“UCPA”). The Connecticut Privacy Act applies to “personal data”, defined as “any information that is linked or reasonably linkable to an identified or identifiable individual,” excluding de-identified data and publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that

  • conduct business in Connecticut.
  • produce products or services that are targeted to Connecticut residents.
  • during the preceding calendar year, either controlled or processed the personal data of at least 100,000 consumers (excluding for the purpose of completing a payment transaction) or controlled or processed the personal data of at least 25,000 consumers and derived more than 25% of their gross revenue from the sale of personal data.

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of the personal data to only the purpose of the collection (“what is adequate, relevant and reasonably necessary”) or as the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process a consumer’s personal data for purposes of targeted advertising where the consumer has opted out
  • obtain consent before processing sensitive data, including data of any individual under the age of 13, and follow the provisions of the Children’s Online Privacy Protection Act

The Connecticut Privacy Act is set to take effect on July 1, 2023. Violations of the CTDPA may result in an enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, there is a grace period for enforcement actions until December 31, 2024, during which the AG must give organizations an opportunity to cure alleged violations.

Like other US data privacy laws, the Connecticut law is not as comprehensive as the EU’s GDPR, but it aligns more closely with some of the GDPR’s definitions and, especially, its consent mechanisms.

Twitter fined $150m for handing users’ contact details to advertisers

30. May 2022

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, during which time the information was ostensibly collected for purposes such as two-factor authentication. In addition to those purposes, however, Twitter used the data to allow advertisers to target specific groups of users by matching phone numbers and email addresses against advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

29. March 2022

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to replace the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been a long time in the making and reflects more than a year of detailed negotiations between the US and the EU, led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will operate as a self-certification scheme administered by the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will be responsible under the new two-tier redress system for investigating and resolving complaints raised by EU citizens concerning access to their data by US intelligence authorities.

The US commitments will be set out in an Executive Order, which will form the basis of the European Commission’s adequacy decision putting the new framework in place. While this is the quicker route to the goal, it also means the framework rests on an instrument that a future US administration can easily repeal. It therefore remains to be seen whether this framework, so far agreed only in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

Apps are tracking personal data despite contrary information

15. February 2022

Tracking in apps enables app providers to offer users personalized advertising. On the one hand, this generates higher revenues for app providers. On the other hand, it leads to data processing practices that are not compliant with the GDPR.

Data privacy labels, mandatory for a year now, are designed to show which personal data app providers access and provide to third parties. Although these labels on iPhones state that no data access takes place, 80% of the analyzed applications carrying such labels do access data by tracking personal information. This is the conclusion of an analysis by an IT specialist at the University of Oxford.

For example, the “RT News” app, which supposedly does not collect data, actually provides various sets of data to tracking services such as Facebook, Google, ComScore and Taboola. Such data transfer activities, however, must be disclosed in an app’s privacy label, as the data may contain sensitive information about viewed content.

In particular, GPS location information accessed by apps is sold on by data companies. This constitutes an abuse of data protection, because personal data is handled in a way that is not compliant with data protection law and is illegally provided to third parties.

In an analysis published in the journal Internet Policy Review, tests of two million Android apps showed that nearly 90 percent of Google Play Store apps share data with third parties directly after launch. Google, however, points out that labels containing false claims about not tracking personal data come from the app providers themselves, thereby evading responsibility for the implementation of these labels. Apple, by contrast, asserts that it checks the labels for correctness.

Put into perspective, this issue raises the question of whether privacy labels actually make the use of apps safer in terms of data protection. One can argue that, if app developers under Google’s approach can simply award themselves these labels, Apple’s approach seems more legitimate. It remains to be seen whether any action will be taken in this regard.

CNIL judges use of Google Analytics illegal

14. February 2022

On 10 February 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), pronounced the use of Google Analytics on European websites not to be in line with the requirements of the General Data Protection Regulation (GDPR) and ordered the website owner concerned to comply with the GDPR within one month.

The CNIL reached this decision on the basis of several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. In total, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA, across all 27 EU Member States and the three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently insufficiently regulated. In the absence of an adequacy decision covering transfers to the USA, data may only be transferred if appropriate safeguards are provided for the data flow. While Google has adopted additional measures to regulate data transfers in the context of Google Analytics, the CNIL deemed those measures insufficient to exclude the accessibility of the personal data to US intelligence services. This results in “a risk for French website users who use this service and whose data is exported”.

The CNIL therefore stated that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR” and ordered the website manager to bring the processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.

The CNIL has also given advice regarding website audience measurement and analysis services, recommending that such tools be used only to produce anonymous statistical data. This would allow for an exemption from consent requirements, as the aggregated data would not be considered “personal” data and would therefore fall outside the scope of the GDPR, provided the data controller ensures that no illegal transfers take place.
