
U.S. Commerce Department publishes FAQs on EU-US Privacy Shield

12. August 2020

The U.S. Commerce Department has released a frequently asked questions (FAQ) page regarding the EU-US Privacy Shield, following the recent decision of the Court of Justice of the European Union (CJEU) in the Schrems II case.

The FAQ consists of five questions which revolve around the situation after the invalidation of the Privacy Shield by the CJEU, especially the status of companies already certified under the Privacy Shield.

The Commerce Department states in its FAQ that although the Privacy Shield certification is no longer valid as a GDPR-compliant transfer mechanism, the decision of the CJEU does not relieve companies certified under the Privacy Shield from their obligations. On July 21, 2020, the Federal Trade Commission (FTC) stated that it expects controllers to continue to follow the obligations laid out under the Privacy Shield Framework for transfers.

Further, the Commerce Department will continue to administer certification and re-certification under the Privacy Shield despite this development. The Commerce Department emphasizes that continued dedication to the Privacy Shield demonstrates the commitment of the parties, and of the companies certified under it, to data protection.

However, the Commerce Department also notes that the costs associated with a Privacy Shield certification will remain, which could affect companies’ motivation to self-certify and re-certify.

Transatlantic Data Transfers in light of the Two Year Anniversary of GDPR Application

7. July 2020

In the two years since the General Data Protection Regulation (GDPR) came into effect on May 25, 2018, it has received overall positive feedback and has shaped data protection culture not only in the European Union but has also set an example for international privacy standards.

However, criticism, especially from the American side, has been constant. Differing principles lead to differing opinions and priorities, and the effort to bring European data protection standards and American personal data business together has been a challenge for both sides.

One of the main criticisms from the US government concerns the obstacles the GDPR poses to cybercrime investigations and law enforcement. Not only are the restrictive implications of the GDPR an issue; government officials also see the divergent interpretations resulting from national adaptations of the GDPR as a problem.

In cases of cybercrime, the main issue for US critics is the now less effective database of domain name owners, WHOIS. The online directory, which was created in the 1970s, is an important tool for law enforcement combating cybercrime. Before the GDPR came into effect in 2018, requesting information on domain owners was straightforward. Now, due to the restrictions of the GDPR, this process has become long and tedious.
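To illustrate what such a lookup involves on a technical level, the following minimal sketch performs a raw WHOIS query as defined in RFC 3912: it opens a TCP connection to a WHOIS server on port 43 and sends the domain name. The server and domain shown are only examples; since 2018, registrant contact fields in the reply are typically redacted, so investigators have to file a separate disclosure request with the registrar.

```python
import socket

def whois_lookup(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Raw WHOIS query per RFC 3912: connect to port 43, send the
    domain name followed by CRLF, and read the full reply."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode("ascii"))
        response = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            response += chunk
    return response.decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Registrant name, address, and email in this output are now usually
    # shown as "REDACTED FOR PRIVACY" or withheld entirely.
    print(whois_lookup("example.com"))
```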

But fighting cybercrime is not the only source of tension between the EU and the USA concerning data protection. In its judgement in the Schrems II case, expected on July 16, 2020, the European Court of Justice (ECJ) is expected to take a stance on transatlantic data transfers and the current Privacy Shield, which is the basis for EU-US data flows under adequate data protection standards. If the Privacy Shield is deemed to provide insufficient protection, this will have a major effect on EU-US business transactions.

However, the European Commission (EC) is well aware of these issues. In its communication on the two-year review of the GDPR, the Commission stated that it plans to address diverging and fragmented interpretations of the GDPR at the national level and foster a common data protection culture within Europe.

In addition, the restrictions the GDPR poses to law enforcement are another point the European Commission knows it needs to fix. The plan for the future is a bilateral and multilateral framework that can allow for simple requests to share data for law enforcement purposes and avoid conflicts of law, while keeping data protection safeguards intact.

The Commission is also watching the upcoming judgement of the ECJ closely; it will be incorporated into its upcoming adequacy decisions and re-evaluations, as well as its development of a modern international transfer toolbox, which includes a modernized version of the standard contractual clauses.

Overall, the two-year mark of the GDPR is seen largely as a success, despite the clear areas for future improvement. One of the big challenges ahead for transatlantic data transfers is without a doubt the outcome of the judgement in the Schrems case in mid-July, the implications of which cannot yet be assessed.

Zoom agrees on security and privacy measures with NY Attorney General

13. May 2020

Due to the COVID-19 pandemic, Zoom has seen an exponential surge in new users over the past two months. As we have mentioned in a previous blog post, this increase in activity highlighted a range of different issues and concerns both on the security and on the privacy side of the teleconference platform.

In light of these issues, which prompted many companies, schools, religious institutions and government departments to urge caution or to stop using the platform altogether, Zoom has agreed to enhance its security measures and privacy standards.

In the Agreement struck on May 7th with New York Attorney General Letitia James, Zoom committed to several new measures it will implement over the coming weeks. However, most of these enhancements were already planned in CEO Eric Yuan’s “90-day plan” published on April 1st, and have gradually been put into effect.

These measures include:

  • a new data security program,
  • the conduct of risk assessment reviews,
  • enhanced encryption protocols,
  • a default password for every meeting,
  • a halt to sharing user data with Facebook.

In response to the Agreement being struck, Attorney General James stated: “Our lives have inexorably changed over the past two months, and while Zoom has provided an invaluable service, it unacceptably did so without critical security protections. This agreement puts protections in place so that Zoom users have control over their privacy and security, and so that workplaces, schools, religious institutions, and consumers don’t have to worry while participating in a video call.”

A day prior, Zoom was also reinstated for use in online classes by the New York City Department of Education. In order to ensure the privacy of students and counteract “Zoombombing”, Zoom has agreed to enhanced privacy controls for free accounts, as well as for kindergarten through 12th grade education accounts. Hosts, even those with free accounts, will, by default, be able to control access to their video conferences by requiring a password or placing users in a digital waiting room before a meeting can be accessed.

This is not the only new addition to the controls that hosts will be able to access: they will also be able to control access to private messages in a Zoom chat, control access to email domains in a Zoom directory, decide who can share screens, and more.
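For readers who manage meetings programmatically, here is a minimal sketch of how such defaults could be set when creating a meeting through Zoom’s public REST API (v2). It assumes a valid OAuth access token in the ZOOM_ACCESS_TOKEN environment variable; the endpoint and field names follow Zoom’s published Meetings API but should be verified against the current documentation.

```python
import os
import requests

ZOOM_API = "https://api.zoom.us/v2"
TOKEN = os.environ["ZOOM_ACCESS_TOKEN"]  # assumed: an OAuth access token

def create_protected_meeting(user_id: str, topic: str) -> dict:
    """Create a scheduled meeting with a passcode and the waiting room
    enabled, mirroring the defaults described above."""
    payload = {
        "topic": topic,
        "type": 2,                      # 2 = scheduled meeting
        "password": "change-me-123",    # meeting passcode (illustrative value)
        "settings": {
            "waiting_room": True,       # hold participants until the host admits them
            "join_before_host": False,
        },
    }
    resp = requests.post(
        f"{ZOOM_API}/users/{user_id}/meetings",
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```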

Overall, Zoom stated that it was happy to have been able to reach a resolution with the Attorney General quickly. It remains to be seen how the measures it is implementing will hold up to the still growing user base, and how fast they can be rolled out for worldwide use.

US Lawmakers to introduce bill that restricts Government Surveillance

3. February 2020

On Thursday, January 23rd, a bipartisan group of US lawmakers revealed legislation that would reduce the scope of the National Security Agency’s (NSA) warrantless internet and telephone surveillance program.

The bill aims to reform Section 215 of the PATRIOT Act, which expires on March 15, and to prevent abuses of the Foreign Intelligence Surveillance Act. Under the PATRIOT Act, the NSA can run a secret mass surveillance program that taps into the internet data and telephone records of American residents. Further, the Foreign Intelligence Surveillance Act allows U.S. intelligence agencies to eavesdrop on and store vast amounts of digital communications from foreign suspects living outside the United States, with American citizens often caught in the crosshairs.

The newly introduced bill is meant to contain a number of reforms, such as prohibiting the warrantless collection of cell site location, GPS information, browsing history and internet search history; ending the authority for the NSA’s massive phone record program, which was disclosed by Edward Snowden; establishing a three-year limit on the retention of information that is not foreign intelligence or evidence of a crime; and more.

This new legislation is seen favorably by national civil rights groups and Democrats, who hope the bill will stop the continuous infringement of the Fourth Amendment of the American Constitution in the name of national security.

More US States are pushing on with new Privacy Legislation

3. January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, and is the first step in the United States towards regulating data privacy on the Internet. Currently, the US does not have a federal-level general consumer data privacy law comparable to the privacy laws in EU countries or the supranational European GDPR.

But now, several other US States have taken inspiration from the CCPA and are in the process of bringing forth their own state legislation on consumer privacy protections on the Internet, including

  • The Massachusetts Data Privacy Law “S-120“,
  • The New York Privacy Act “S5642“,
  • The Hawaii Consumer Privacy Protection Act “SB 418“,
  • The Maryland Online Consumer Protection Act “SB 613“, and
  • The North Dakota Bill “HB 1485“.

Like the CCPA, most of these new privacy laws have a broad definition of the term “Personal Information” and are aimed at protecting consumer data by strengthening consumer rights.

However, the various law proposals differ in the scope of the consumer rights. All of them grant consumers the ‘right to access’ their data held by businesses. There will also be a ‘right to delete’ in most of these states, but only some give consumers a private ‘right of action’ for violations.

There are other differences with regard to the businesses that will be covered by the privacy laws. In some states, the proposed laws will apply to all businesses, while in other states the laws will only apply to businesses with yearly revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, there is an increasing possibility of a federal US privacy law in the near future. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren’s proposal, Senators Cantwell, Schatz, Klobuchar and Markey’s proposal, and Senator Wicker’s proposal).

NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study that evaluated face recognition algorithms submitted by industry and academic developers for their ability to perform various tasks. The study evaluated 189 software algorithms submitted by 99 developers, focusing on how well each algorithm performs one of two tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database. This is used, for example, when unlocking a smartphone or checking a passport. The second task involves “one-to-many” matching, i.e. determining whether the person in the photo matches any photo in a database. This is used to identify a person of interest.
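The distinction between the two tasks can be made concrete with a small sketch. The snippet below assumes that face images have already been converted to fixed-length embedding vectors by some feature extractor (not shown); the cosine-similarity threshold and the random “embeddings” are purely illustrative and not part of the NIST study.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, reference: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one matching: does the probe photo match one specific reference?"""
    return cosine_similarity(probe, reference) >= threshold

def identify(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6):
    """One-to-many matching: return the best-matching gallery index and score,
    or None if no enrolled identity clears the threshold."""
    scores = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
    best = int(np.argmax(scores))
    return (best, float(scores[best])) if scores[best] >= threshold else None

# Toy usage with random vectors standing in for real model output.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(5, 128))                    # five enrolled identities
probe = gallery[2] + rng.normal(scale=0.05, size=128)  # noisy photo of identity 2
print(verify(probe, gallery[2]))                       # one-to-one check
print(identify(probe, gallery))                        # one-to-many search
```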

A special focus of this study was that it also looked at the performance of the individual algorithms when demographic factors are taken into account. For one-to-one matching, only a few previous studies had examined demographic effects; for one-to-many matching, there were none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, Department of Homeland Security and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata information that indicated the age, gender, and either race or country of birth of the person.

The study found that the results ultimately depend on the algorithm at the heart of the system, the application that uses it, and the data it is fed with. But the majority of face recognition algorithms exhibit demographic differences. In one-to-one matching, the algorithms more often falsely rated photos of two different people as the same person if they were Asian or African-American than if they were white. In algorithms developed by Americans, the same error occurred when the person was a Native American. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve correct face recognition results by using a wide range of training data.

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, December 19, the European Court of Justice’s (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) in cases of personal data transfers to processors situated in third countries.

The background of the case on which the opinion builds originates in the proceedings initiated by Mr. Maximilian Schrems, in which he challenged Facebook’s business practice of transferring the personal data of its European subscribers to servers located in the United States. The case (Schrems I) led the CJEU, on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook has chosen to rely on to legitimize its EU-U.S. data flows, on the basis of similar arguments to those raised in the Schrems I case. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling, the Schrems II case.

In the newly published opinion, the Advocate General validates the established SCCs in cases of commercial transfers, despite the possibility of public authorities in the third country processing the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of the high level of protection is guaranteed not only by an adequacy decision, but just as well by the contractual safeguards the exporter has in place, which need to match that level of protection. Therefore, the SCCs represent a general mechanism applicable to transfers regardless of the third country and its level of protection. In addition, and in light of the Charter, there is an obligation for the controller as well as the supervisory authority to suspend any third-country transfer if, because of a conflict between the SCCs and the laws in the third country, the SCCs cannot be complied with.

In the end, the Advocate General also clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, since those only cover the SCCs under Decision 2010/87, taking the question of the validity of the Privacy Shield off the table.

While the Advocate General’s opinion is not binding, it represents a suggested legal solution for the cases before the CJEU. However, the CJEU’s decision on the matter is not expected until early 2020, leaving the outcome of the case keenly anticipated.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four different cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the EU-US Privacy Shield Framework. According to the FTC, Global Data and TDARX continued to claim participation in the EU-U.S. Privacy Shield after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to complete the annual verification that statements about their Privacy Shield practices were accurate. Also, according to the complaints, they did not affirm that they would continue to apply Privacy Shield protections to personal information collected while participating in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by any government or self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protection to personal information collected during participation in the program. Otherwise, they are required to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the framework was established in 2016, the FTC has initiated a total of 21 enforcement measures in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and will be subject to public comment for 30 days. The FTC will then decide whether to make the proposed consent orders final.

USA and UK sign Cross Border Data Access Agreement for Criminal Electronic Data

10. October 2019

The United States and the United Kingdom have entered into a first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries’ law enforcement authorities to demand authorized access to electronic data relating to serious crime. In both cases, the respective authorities are permitted to ask tech companies based in the other country for electronic data directly and without legal barriers.

At the base of this bilateral Agreement stands the U.S.A.’s Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018. It aims to improve procedures for U.S. and foreign investigators to obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current process for access may take up to two years. The Data Access Agreement can reduce that time considerably by allowing more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.

The CLOUD Act focuses on updating legal frameworks to respond to evolving technology in electronic communications and service systems. It further enables the U.S. and other countries to enter into mutual executive Agreements in order to use their own legal authorities to access electronic evidence in the other respective country. An Agreement of this form can only be signed with rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws have robust substantive and procedural protections for privacy and civil liberties.

The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with data protection laws in both respective countries.

In addition to the Agreement with the United Kingdom, talks between the United States and Australia were reported on Monday, with negotiations for such an Agreement between the two countries underway. Negotiations have also been held between the U.S. and the European Commission, representing the European Union, with regard to a Data Access Agreement.


CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google will not have to apply the General Data Protection Regulation’s (GDPR) “Right to be Forgotten” to its search engines outside of the European Union. The ruling is a victory for Google in a case against a fine imposed by the French Commission nationale de l’informatique et des libertés (CNIL) in 2015 in an effort to force the company and other search engines to take down links globally.

Seeing as the internet has grown into a worldwide, borderless medium, this case is viewed as a test of whether people can demand a blanket removal of information about themselves from searches without overriding the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also expressed in the judgement that the protection of personal data is not an absolute right.

While this means that companies are not forced to delete sensitive information from their search engines outside of the EU upon request, they must take precautions to seriously discourage internet users from accessing non-EU versions of their pages. Furthermore, companies with search engines within the EU will have to closely weigh freedom of speech against the protection of privacy, keeping the currently common case-by-case basis for deletion requests.

Since the Right to be Forgotten was first established by the CJEU in 2014, Google has received over 3.3 million deletion requests and has complied with delisting links from its search engine in 45% of cases. As it stands, even where deletion requests are honored, the delisted links can still be accessed from within the EU by using a VPN to reach non-EU versions of the search engine, thereby circumventing the geoblocking. This is an issue to which a solution has not yet been found.
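As a rough illustration of why a VPN defeats this kind of geoblocking, the sketch below suppresses delisted links only when the requester’s IP address appears to resolve to an EU country. The country_of() lookup and the delisting list are hypothetical placeholders, not Google’s actual implementation; a VPN simply presents a non-EU exit IP, so the filter never triggers.

```python
# Hypothetical geo-based delisting filter (illustrative only).

EU_COUNTRIES = {"AT", "BE", "DE", "FR", "IE", "IT", "NL", "PL", "ES", "SE"}  # abridged
DELISTED_URLS = {"https://example.org/old-news-item"}  # placeholder delisting list

def country_of(ip_address: str) -> str:
    """Placeholder for a GeoIP lookup; a real system would query a
    geolocation database to map the IP address to a country code."""
    return "DE" if ip_address.startswith("88.") else "US"

def filter_results(results: list[str], client_ip: str) -> list[str]:
    """Drop delisted URLs only for requests that appear to come from the EU.
    A VPN exit node outside the EU makes the request look non-EU, so the
    delisted links remain visible."""
    if country_of(client_ip) in EU_COUNTRIES:
        return [url for url in results if url not in DELISTED_URLS]
    return results

results = ["https://example.org/old-news-item", "https://example.org/other"]
print(filter_results(results, "88.1.2.3"))     # EU request: delisted link removed
print(filter_results(results, "203.0.113.5"))  # non-EU (e.g. via VPN): link visible
```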
