
Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, 19 December, the European Court of Justice's (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) in cases of personal data transfers to processors situated in third countries.

The case underlying the opinion originates in proceedings initiated by Mr. Maximillian Schrems, who challenged Facebook's business practice of transferring the personal data of its European subscribers to servers located in the United States. The case (Schrems I) led the CJEU on October 6, 2015, to invalidate the Safe Harbor arrangement, which had governed data transfers between the EU and the U.S.A. up to that point.

Following the ruling, Mr. Schrems decided to challenge, on the basis of arguments similar to those raised in Schrems I, the transfers performed under the EU SCCs, the alternative mechanism Facebook had chosen to rely on to legitimize its EU-U.S. data flows. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling (the Schrems II case).

In the newly published opinion, the Advocate General considers the established SCCs valid for commercial transfers, despite the possibility of public authorities in the third country processing the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of the high level of protection need not be guaranteed by an adequacy decision of the European Commission alone; it can equally be ensured by the contractual safeguards the exporter has put in place, which must match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers, regardless of the third country concerned and the level of protection it affords. In addition, and in light of the Charter, both the controller and the supervisory authority are obliged to suspend any third country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, since those concern only the SCCs under Decision 2010/87, taking the question of the Privacy Shield's validity off the table.

While the Advocate General's opinion is not binding, it represents a proposed legal solution for the case before the CJEU. The CJEU's decision on the matter is not expected until early 2020, leaving the outcome of the case keenly anticipated.

Advocate General’s opinion on “Schrems II” is delayed

11. December 2019

The Court of Justice of the European Union (CJEU) Advocate General’s opinion in the case C-311/18 (‘Facebook Ireland and Schrems’) will be released on December 19, 2019. Originally, the CJEU announced that the opinion of the Advocate General in this case, Henrik Saugmandsgaard Øe, would be released on December 12, 2019. The CJEU did not provide a reason for this delay.

The prominent case deals with the complaint to the Irish Data Protection Commission (DPC) by privacy activist and lawyer Maximilian Schrems and the transfer of his personal data from Facebook Ireland Ltd. to Facebook Inc. in the U.S. under the European Commission’s controller-to-processor Standard Contractual Clauses (SCCs).

Perhaps the most consequential question that the High Court of Ireland put before the CJEU is whether transfers of personal data from the EU to the U.S. under the SCCs violate individuals' rights under Articles 7 and/or 8 of the Charter of Fundamental Rights of the European Union (Question No. 4). The decision of the CJEU in “Schrems II” will also have ramifications for the parallel case T-738/16 (‘La Quadrature du net and others’). The latter case poses the question whether the EU-U.S. Privacy Shield for data transfers from the EU to the U.S. protects the rights of EU individuals sufficiently. If it does not, the European Commission would face a “Safe Harbor” déjà vu, having approved the Privacy Shield in its adequacy decision of 2016.

The CJEU is not bound by the opinion of the Advocate General (AG), but in some cases, the AG's opinion may be a weighty indicator of the CJEU's final ruling. The final decision by the Court is expected in early 2020.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four separate cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the EU-U.S. Privacy Shield framework. According to the FTC, Global Data and TDARX continued to claim participation in the EU-U.S. Privacy Shield after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to verify annually that statements about their Privacy Shield practices were accurate. According to the complaints, they also failed to affirm that they would continue to apply Privacy Shield protections to personal information collected during their participation in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by a government or a self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protections to personal information collected during their participation in the program, or else to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the frameworks were established in 2016, the FTC has brought a total of 21 enforcement actions in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and opened for public comment for 30 days, after which the FTC will decide whether to make the proposed consent orders final.

Health data transferred to Google, Amazon and Facebook

18. November 2019

Websites specialized in health topics transfer information about their users to Google, Amazon and Facebook, the Financial Times reports.

The transferred information is obtained through cookies and includes users' medical symptoms and conditions.

According to the Financial Times report, the transfer takes place without the express consent of the data subjects, contrary to data protection law in the UK. Beyond the legal obligations in the UK, the website operators' use of these cookies also contravenes the requirements of the GDPR.

Under the GDPR, the processing of health data falls under Art. 9 GDPR and is subject to a prohibition with reservation of permission: the processing of health data is forbidden unless the data subject has given explicit consent.

The report is also noteworthy in light of the CJEU's cookie judgment (we reported). Under that judgment, consent must be obtained for the use of each cookie.
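To illustrate what consent-gated cookie setting can look like in practice, here is a minimal sketch; the consent categories, helper names and cookie values are hypothetical and not taken from the report or the judgment:

```typescript
// Minimal sketch of consent-gated cookie setting in the browser.
// Categories and names are illustrative, not from the report.

type ConsentCategory = "necessary" | "analytics" | "marketing";

// Hypothetical in-memory record of what the user has actively agreed to.
const consents: Record<ConsentCategory, boolean> = {
  necessary: true,   // strictly necessary cookies need no consent
  analytics: false,  // must remain false until the user opts in
  marketing: false,
};

function setCookie(name: string, value: string, category: ConsentCategory): void {
  // Only set the cookie if the user has consented to this category.
  if (!consents[category]) {
    console.warn(`Cookie "${name}" blocked: no consent recorded for "${category}".`);
    return;
  }
  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; max-age=31536000`;
}

// A tracking cookie sharing symptom data would fall under "marketing"
// and is therefore blocked until explicit consent is recorded.
setCookie("ad_tracker", "symptom=headache", "marketing");
```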

Accordingly, the website operators' practices will (hopefully) change in order to comply with the new case law.

European Commission releases third annual Privacy Shield Review report

25. October 2019

The European Commission has released a report on the EU-U.S. Privacy Shield, the third annual report on the functioning of the agreement since it came into effect in July 2016. The review discussions were launched on 12 September 2019 by Commissioner for Justice, Consumers and Gender Equality Věra Jourová and U.S. Secretary of Commerce Wilbur Ross in Washington, DC.

The Privacy Shield protects the fundamental rights of anyone in the European Union whose personal data is transferred to certified companies in the United States for commercial purposes, and brings legal clarity for businesses relying on transatlantic data transfers. The European Commission is committed to reviewing the agreement annually to ensure that the level of protection under the Privacy Shield remains adequate.

This year's report confirms the continued adequacy of the protection of personal data transferred from the European Union to certified companies in the U.S. under the Privacy Shield. Since the Framework was implemented, about 5,000 companies have registered with the Privacy Shield. The EU Commissioner for Justice, Consumers and Gender Equality stated that “the Privacy Shield has become a success story. The annual review is an important health check for its functioning”.

The improvements compared to the last annual review in 2018 include the U.S. Department of Commerce's efforts to ensure the necessary oversight in a systematic manner, carried out through monthly checks of sample companies certified under the Privacy Shield. Furthermore, an increasing number of European citizens are making use of their rights under the Framework, and the corresponding redress mechanisms are functioning well.

The European Commission's main criticism is its recommendation of firm steps to improve the (re)certification process under the Privacy Shield. The current timing rules allow companies to complete recertification up to three months after their certification has lapsed, which can lead to a lack of transparency and confusion, since those companies remain listed in the registry during that period. The European Commission has proposed a shorter time frame to guarantee a higher level of security.

Overall, the third annual review has been seen as a success for the cooperation between the two sides, and both U.S. and European officials agree that strong and credible enforcement of privacy rules is needed to protect their respective citizens and ensure trust in the digital economy.

USA and UK sign Cross Border Data Access Agreement for Criminal Electronic Data

10. October 2019

The United States and the United Kingdom have entered into a first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries' law enforcement authorities to demand access to electronic data relating to serious crime. Under the agreement, each country's authorities are permitted to request electronic data directly from tech companies based in the other country, without legal barriers.

The bilateral agreement is based on the U.S. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018 and aims to improve procedures for U.S. and foreign investigators to obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current access process can take up to two years. The Data Access Agreement can reduce that time considerably by allowing more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.

The CLOUD Act focuses on updating legal frameworks to keep pace with evolving technology in electronic communications and services. It also enables the U.S. and other countries to enter into mutual executive agreements under which each side uses its own legal authorities to access electronic evidence located in the other country. Such an agreement can only be signed with rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws contain robust substantive and procedural protections for privacy and civil liberties.

The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with data protection laws in both respective countries.

In addition to the agreement with the United Kingdom, the United States and Australia announced on Monday that they are negotiating a similar agreement. Negotiations have also been held between the U.S. and the European Commission, acting on behalf of the European Union, regarding a Data Access Agreement.

Category: General · UK · USA

CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google does not have to apply the General Data Protection Regulation's (GDPR) “Right to be Forgotten” to its search engines outside of the European Union. The ruling is a victory for Google in a case against a fine imposed in 2015 by the French Commission nationale de l’informatique et des libertés (CNIL) in an effort to force the company and other search engines to take down links globally.

Since the internet has grown into a borderless worldwide medium, this case is viewed as a test of whether people can demand the blanket removal of information about themselves from searches without overriding the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also noted in the judgment that the protection of personal data is not an absolute right.

While companies are thus not forced to delete sensitive information from their search engines outside the EU upon request, they must take precautions to seriously discourage internet users from switching to non-EU versions of their pages. Furthermore, companies operating search engines within the EU will have to weigh freedom of speech closely against the protection of privacy, keeping the current case-by-case approach to deletion requests.

Since the CJEU first established the Right to be Forgotten in 2014, Google has received over 3.3 million deletion requests and has complied by delisting links from its search engine in 45% of cases. As it stands, even when deletion requests are honored, the links delisted from EU search engines can still be reached by using a VPN to access non-EU versions of the search engine, circumventing the geoblocking. A solution to this issue has not yet been found.

CNIL updates its FAQs for the case of a No-Deal Brexit

24. September 2019

The French data protection authority “CNIL” updated its existing catalogue of questions and answers (“FAQs”) to explain the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1 November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, transfer mechanisms will become necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with the GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on 1 November. If controllers choose a derogation admissible under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was deployed at certain events and in public places where the commission of crimes was considered likely. The system can detect up to 50 faces per second; the detected faces are then compared, by biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
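As an illustration of that match-or-delete flow, here is a minimal sketch; the ruling does not describe the implementation, so all types, names and the similarity threshold below are hypothetical:

```typescript
// Minimal sketch of AFR Locate's match-or-delete flow as described in the
// ruling. All names, types and the threshold are illustrative assumptions.

interface FaceCapture {
  id: string;
  template: number[]; // biometric feature vector extracted from the image
}

interface WatchlistEntry {
  personId: string;
  template: number[];
}

// Hypothetical similarity measure: cosine similarity of feature vectors.
function similarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

const MATCH_THRESHOLD = 0.9; // illustrative value only

function processCapture(capture: FaceCapture, watchlist: WatchlistEntry[]): void {
  const match = watchlist.find(
    (entry) => similarity(capture.template, entry.template) >= MATCH_THRESHOLD
  );
  if (match) {
    // Only matches are retained, for a human officer to review.
    console.log(`Possible match: capture ${capture.id} -> ${match.personId}`);
  } else {
    // Per the ruling's description: no match means immediate, automatic deletion.
    console.log(`Capture ${capture.id} deleted immediately (no watchlist match).`);
  }
}
```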

An individual initiated judicial review proceedings after he, though not identified as a wanted person, was likely captured by AFR Locate. He considered this to be unlawful, in particular because of a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles; in particular, their approach would violate Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court stated that the use of AFR affected a person's rights under Article 8 of the ECHR, as this type of biometric data is inherently private in character. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is stored temporarily.

Nevertheless, the Court found that the police's action was in accordance with the law, as it falls within the police's public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system is proportionate and that the technology is used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria: it was deployed only for a limited period, for a specific purpose, and its use was publicized in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considered that the captured images of individuals constitute personal data even if they do not match the watchlists, because the technology has singled those individuals out and distinguished them from others. Nevertheless, the Court held that there was no violation of the data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that an appropriate policy document be in place to govern the processing. The Court considered the relevant policy document in this case short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, stating that it would leave that judgment to the Information Commissioner's Office (“ICO”), which is due to publish more detailed guidance.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of DPA 2018.

The ICO stated that it would take into account the High Court ruling when finalising its recommendations and guidelines for the use of live face recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

As first reported by the Financial Times, London's King's Cross station has come under fire for using a live face-scanning system across its 67-acre site. The site's developer, Argent, confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes, restaurants and office spaces, with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them against a database of wanted criminals, suspects and persons of interest. While the developer Argent has defended the technology, it has not yet explained what the system is, how it is used, or how long it has been in place.

A day before the ICO launched its investigation, a letter from King's Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images for a database used to carry out facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the express statements that the software complies with England's data protection laws, the Information Commissioner's Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham said she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without people's knowledge.

The controversy has sparked demand for legislation on facial recognition, igniting a dialogue about new technologies and how to future-proof against the still unknown privacy issues they may cause.

Category: GDPR · General · UK