Category: General Data Protection Regulation

G7 Data Protection Authorities discuss flow of data across borders

27. September 2022

From September 6th to September 9th, 2022, a meeting of representatives of the G7’s Data Protection Authorities was held in Bonn, Germany, to discuss current regulatory and technological issues concerning the concept of Data Free Flow with Trust (DFFT), a proposed guiding principle for international cooperation on data flows.

The DFFT aims to answer several questions in order to create a safe global digital environment in which the protection of flowing data is guaranteed. The most important question is: how can existing barriers to data flows be overcome? Harmonization may seem difficult between countries whose approaches to and regulations on personal data protection differ completely. To answer this question, a bottom-up approach was adopted for the implementation of the DFFT: high-level intergovernmental discussions resulting in pragmatic rule-making are foreseen, accompanied in parallel by public/private cooperation on the resolution of individual issues.

Scholars and experts believe that RegTech could prove very useful for the implementation of the DFFT. To tackle some of the issues identified in the various discussions and in research, the World Economic Forum issued a white paper identifying seven common success factors for the effective deployment of RegTech.

The concept, first proposed by Japan’s late Prime Minister Shinzo Abe in 2019, is now moving into the implementation phase, mainly in connection with trade agreements covering e-commerce. A milestone on this topic will likely be the next G7 summit, to be held in Japan in 2023. Kishida Fumio, the current Japanese Prime Minister, reaffirmed his country’s initiative in the project and pledged his commitment to the continued development of the DFFT.

U.S. lawmakers unveil bipartisan Data Privacy and Protection Act

30. June 2022

In early June, three of the four chairmen of the U.S. congressional committees responsible for data privacy submitted a draft American Data Privacy and Protection Act (ADPPA) for consideration. If passed, it would preempt certain recently enacted privacy laws in some U.S. states.

The draft includes elements of the California Consumer Privacy Act and the European General Data Protection Regulation.

States led the way

Until now, data protection in the United States has primarily been driven at the state level. California, Colorado, Connecticut, Virginia and Utah have recently enacted comprehensive data privacy laws. This year alone, more than 100 privacy bills have already been introduced in the states. Although not all of them were adopted, the proliferation of state laws and their varying regulatory requirements has led to increasing calls for the adoption of a federal privacy law. A unified federal law, if passed, would provide much-needed clarity to entities and businesses and, ideally, would also stem the tide of class actions and other privacy lawsuits brought under various state laws.

Affected Entities

The ADPPA broadly applies (with exceptions) to organizations operating in the United States that collect, process, or transfer personal information and fall into one of the following categories:

  • Subject to the Federal Trade Commission Act
  • Nonprofit organizations
  • So-called Common Carriers, subject to Title II of the Communications Act of 1934

Requirements of the ADPPA (not final)

  • Limit data collection and processing to that which is reasonably necessary
  • Compliance with public and internal privacy regulations
  • Granting consumer rights such as access, correction, and deletion
  • Appeal options
  • Obtaining consent before collecting or processing sensitive data, e.g. geolocation, genetic and biometric information, and browsing history
  • Appointment of a data protection officer
  • Providing evidence that adequate safeguards are in place
  • Registration of data brokers with the Federal Trade Commission (FTC)
  • FTC will establish and maintain a searchable, centralized online public registry of all registered data brokers, as well as a “Do Not Collect” registry that will allow individuals to request that all data brokers delete their data within 30 days
  • Entities shall not collect, process, or transfer collected data in a manner that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, or disability
  • Implement appropriate administrative, technical, and physical data security practices and procedures to protect covered data from unauthorized access and disclosure
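The “Do Not Collect” mechanism listed above could, conceptually, be honored by a data broker along these lines. The ADPPA draft prescribes no API, so the data structures, names, and 30-day-deadline handling in this sketch are purely illustrative assumptions:

```python
from datetime import date, timedelta

# Hypothetical snapshot of the FTC's "Do Not Collect" registry; in
# practice a broker would query the (yet to be built) central registry.
DO_NOT_COLLECT = {"alice@example.com", "bob@example.com"}

def handle_records(records, request_date=date(2022, 6, 1)):
    """Drop registered individuals from new collection and schedule
    deletion of already-stored data within the 30-day window."""
    kept, to_delete = [], []
    for record in records:
        if record["email"] in DO_NOT_COLLECT:
            # Existing data must be deleted within 30 days of the request.
            to_delete.append((record, request_date + timedelta(days=30)))
        else:
            kept.append(record)
    return kept, to_delete

kept, to_delete = handle_records([
    {"email": "alice@example.com"},
    {"email": "carol@example.com"},
])
```

The sketch treats registration as an opt-out list keyed by a single identifier; the final statute would have to define how individuals are matched across brokers.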

Outcome still uncertain

Shortly after a draft of the ADPPA was released, privacy organizations, civil liberties groups, and businesses spoke out, taking sides for and against the law.

As the legislative session draws to a close, the prospects for ADPPA’s adoption remain uncertain. Strong disagreement remains among key stakeholders on important aspects of the proposed legislation. However, there is consensus that the United States is in dire need of a federal privacy law. Thus, passage of such legislation is quite likely in the foreseeable future.

Thailand’s Personal Data Protection Act enters into force

29. June 2022

On June 1, 2022, Thailand’s Personal Data Protection Act (PDPA) entered into force, three years after its enactment in May 2019. Due to the COVID-19 pandemic, the Thai government had issued royal decrees extending the compliance deadline to June 1, 2022.

The PDPA is widely based on the EU General Data Protection Regulation (GDPR). In particular, it also requires data controllers and processors to have a valid legal basis for processing personal data (i.e., data that can identify living natural persons directly or indirectly). If such personal data is sensitive personal data (e.g. health data, biometric data, race, religion, sexual preference and criminal record), data controllers and processors must ensure that data subjects give explicit consent for any collection, use or disclosure of such data. Exemptions are granted for public interest, contractual obligations, vital interest or compliance with the law.

The PDPA also ensures that data subjects have specific rights, which are very similar to the GDPR: the right to be informed, access, rectify and update data, as well as restrict and object to processing and the right to data erasure and portability.

One major difference to the GDPR is that, while there are fines for breaching the PDPA obligations, certain data breaches involving sensitive personal data and unlawful disclosure also carry criminal penalties including imprisonment of up to one year.

Just like the GDPR, the PDPA also affects both entities in Thailand as well as entities abroad that process personal data for the provision of products and/or services within Thai borders.

Just as with the GDPR, it will be important to observe how the PDPA evolves as Thai companies incorporate it into their compliance programs.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers (‘Q&A’) on the two sets of Standard Contractual Clauses (‘SCCs’), on controllers and processors (‘the Controller-Processor SCCs’) and third-country data transfers (‘the Data Transfer SCCs’) respectively, as adopted by the European Commission on 4 June 2021. The Q&A are intended to provide practical guidance on the use of the SCCs. They are based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption. 

Specifically, 44 questions are addressed, including those related to contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause.  In addition, the Q&A contains a specific section dedicated to each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. Further to this point, the Q&A highlights that the Commission is in the process of developing an additional set of SCCs for this scenario, which will consider the requirements that already apply directly to those controllers and processors under the GDPR. 

In addition, the Q&A includes a section with questions on the obligations of data importers and exporters, specifically addressing the SCC liability scheme. Specifically, the Q&A states that other provisions in the broader (commercial) contract (e.g., specific rules for allocation of liability, caps on liability between the parties) may not contradict or undermine liability schemes of the SCCs. 

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

CJEU considers representative actions admissible

29. April 2022

According to a press release of the Court of Justice of the European Union (CJEU), consumer associations can bring legal proceedings against companies for data protection infringements.

This is the conclusion reached by the Court in a decision on proceedings brought by the Federation of German Consumer Organisations (vzbv), which challenged Facebook’s data protection policies. Accordingly, a consumer protection association may bring legal proceedings, in the absence of a mandate conferred on it for that purpose and independently of the infringement of specific rights of individual data subjects, against the person allegedly responsible for an infringement of the laws protecting personal data. The vzbv is entitled to bring such proceedings under the GDPR because it pursues an objective in the public interest.

Specifically, the case concerns third-party games on Facebook, for which users must agree to the use of their data in order to play on the platform. According to the association, Facebook did not inform the data subjects in a precise, transparent and understandable form about the use of their data, as prescribed by the General Data Protection Regulation (GDPR). The Federal Court of Justice in Germany (BGH) had already come to this conclusion in May 2020; however, it was not considered sufficiently clear whether the association was entitled to bring legal proceedings in this case.

The EU Advocate General had also previously concluded, in a legally non-binding opinion, that the association can bring legal proceedings.

The CJEU has now confirmed this view, so the BGH must make the final decision in vzbv v. Facebook. Importantly, this decision also opens the door to similar collective actions against other companies.

ECJ against data retention without any reason or limit

6. April 2022

According to the press release on the judgment of 5 April 2022, the ECJ has once again ruled that the retention of private communications data without cause or limit is unlawful. This reinforces its rulings of 2014, 2016 and 2020, according to which changes are necessary at EU and national level.

In this judgment, the ECJ states that the decision whether to admit retained data as evidence in a long-standing murder case is for the national court in Ireland.

Questions regarding this issue were submitted in 2020 by Germany, France and Ireland. The EU Advocate General confirmed, in a legally non-binding manner, the incompatibility of national laws with EU fundamental rights.

However, a first exception to data retention resulted from the 2020 judgment, according to which, in the event of a serious threat to national security, storage for a limited period and subject to judicial review was recognized as permissible.

Subsequently, a judgment in 2021 stated that national law must provide clear and precise rules with minimum conditions for the purpose of preventing abuse.

According to the ECJ, storage without specific cause, subject to restrictions, should be permissible in the following cases:

  • When limited to specific individuals or locations;
  • No concrete evidence of crime necessary, local crime rate is sufficient;
  • Frequently visited locations such as airports and train stations;
  • When national laws require the identity of prepaid cardholders to be stored;
  • Quick freeze, an immediate backup and temporary data storage if there is suspicion of crime.

All of these are to be used only to combat serious crime or prevent threats to national security.

In Germany, Justice Minister Marco Buschmann is in favor of a quick freeze solution as an alternative that preserves fundamental rights. However, the EU states are to work on a legally compliant option for data retention despite the ECJ’s criticism of this principle.

Italian DPA imposes a 20 Mio Euro Fine on Clearview AI

29. March 2022

The Italian data protection authority “Garante” has fined Clearview AI 20 million euros for data protection violations relating to its facial recognition technology. Clearview AI’s facial recognition system draws on over 10 billion images from the internet, and the company prides itself on having the largest biometric image database in the world. The authority found Clearview AI in breach of numerous GDPR requirements: processing was not carried out fairly and lawfully, there was no legal basis for the collection of the information, and appropriate transparency and data retention policies were lacking.

Last November, the UK ICO warned of a potential 17 million pound fine against Clearview and, in this context, also ordered the company to stop processing data.

Then, in December, the French CNIL ordered Clearview to stop processing citizens’ data and gave it two months to delete all the data it had stored, but did not mention any explicit financial sanction.

In Italy, Clearview AI must now, in addition to paying the 20 million euro fine, not only delete all images of Italian citizens from its database but also delete the biometric information needed to search for a specific face. Furthermore, the company must designate an EU representative as a point of contact for EU data subjects and the supervisory authority.

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been a strenuous work in the making and reflects more than a year of detailed negotiations between the US and EU led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will form part of the new two-tier redress system, which will allow EU citizens to raise complaints about access to their data by US intelligence authorities and is intended to investigate and resolve those complaints.

The US commitments will be implemented by an Executive Order, which will form the basis of the European Commission’s adequacy decision putting the new framework in place. While this represents a quicker route to the goal, Executive Orders can easily be repealed by the next US administration. It therefore remains to be seen whether this framework, so far agreed only in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to provide.

Belgian DPA declares technical standard used for cookie banner for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of legality and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF system, must now delete the personal data collected in this way and pay a fine of 250,000 euros. In addition, the authority set conditions under which the advertising industry may continue to use the TCF at all.

Almost all companies, including advertising companies such as Google or Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. This decision will have a major impact on the protection of users’ personal data. This is also confirmed by Hielke Hijmans from APD.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on the offered prices and the user’s data profile, among other things, a decision is made in milliseconds as to which advertisements the user will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.
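The auction step described above can be sketched in a few lines. This is a deliberately simplified model (a first-price auction where the highest bid wins), not the actual OpenRTB protocol; the advertiser names and prices are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str  # hypothetical advertiser identifier
    price: float     # amount offered for this single impression
    # Real OpenRTB bids also carry creatives, deal IDs, currencies, etc.

def run_auction(bids):
    """Pick the winning ad for one page impression.

    Simplified first-price auction: the highest bid wins. Production
    RTB systems often use second-price or header-bidding variants.
    """
    if not bids:
        return None
    return max(bids, key=lambda bid: bid.price)

# Three advertisers bid on a single impression triggered by a page visit.
bids = [Bid("adco_a", 0.42), Bid("adco_b", 0.57), Bid("adco_c", 0.31)]
winner = run_auction(bids)
```

In a real exchange this selection runs within a latency budget of tens of milliseconds, which is why the user’s data profile must already be assembled before the page loads.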

If users accept cookies, or do not object to the claim that the use of their data is in the provider’s legitimate interest, the TCF generates a so-called TC string containing information about their consent decisions. This identifier is forwarded to partners in the OpenRTB system and forms the basis both for the creation of individual profiles and for the auctions in which advertising space, and with it the attention of the desired target group, is sold.
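The idea of packing consent decisions into a compact, forwardable identifier can be illustrated with a heavily simplified sketch. The real TC string defined by the IAB TCF specification encodes far more fields (version, timestamps, vendor ranges, and so on) as a packed bit string; the purpose names and encoding below are assumptions for illustration only:

```python
import base64

# Hypothetical, simplified list of consent purposes; the actual TCF
# defines its own numbered purposes and vendor lists.
PURPOSES = ["store_access_info", "personalised_ads", "measurement"]

def encode_consent(choices: dict) -> str:
    """Pack per-purpose yes/no choices into a URL-safe base64 token."""
    bits = "".join("1" if choices.get(p, False) else "0" for p in PURPOSES)
    return base64.urlsafe_b64encode(bits.encode()).decode().rstrip("=")

def decode_consent(token: str) -> dict:
    """Recover the per-purpose choices from the token."""
    padding = "=" * (-len(token) % 4)  # restore stripped base64 padding
    bits = base64.urlsafe_b64decode(token + padding).decode()
    return {p: b == "1" for p, b in zip(PURPOSES, bits)}

token = encode_consent({"store_access_info": True, "personalised_ads": False})
```

The APD’s point becomes visible even in this toy version: once the token circulates among RTB partners, every recipient can read the user’s choices, and combined with an IP address it can single a user out.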

According to the authority, the TC strings already constitute personal data because they enable users to be identified with the IP address and the cookies set by the TCF. In addition, IAB Europe is said to be jointly legally responsible for any data processing via the framework, although IAB Europe has not positioned itself as a data processor, only as a provider of a standard.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the ubiquitous cookie banners rather than asking for consent. According to the authority, this practice would have to be prohibited for the framework to be lawful. The principles of privacy by design and by default are also violated: consent is effectively extracted through design tricks, the data flows are not manageable, and revoking consent is hardly possible.

Dutch data protection authority imposes fine of €525,000

23. March 2022

The Dutch Data Protection Authority, autoriteit persoonsgegevens (hereinafter “ap”) imposed a fine of €525,000 on DPG Media at the beginning of March.

The background to the fine was access and deletion requests from various data subjects who had a newspaper subscription or were receiving increased advertising. A data subject who wanted to know what personal data the company had collected about them had to send an ID document to DPG Media to prove their identity. The same applied to anyone asking the company to delete their data. Customers were supposed to either upload a scan of their ID document or send it by post.

DPG Media’s procedure for proof of identity was criticized for several reasons. From the ap’s point of view, too much data was requested, and it was made too difficult for data subjects to exercise their rights of access and deletion. Even if DPG Media had requested redacted ID documents, this method of proving identity would still have been questionable: the ap emphasizes that requesting even redacted ID documents is often disproportionate.

It also notes that ID documents are documents that are particularly worthy of protection. Especially regarding possible identity theft, they must be handled very carefully.

Thus, the ap clarifies that, even though an identification document is in principle suitable for identifying the data subject, less intrusive identifiers should be used in preference. Milder, but in this specific case equally suitable, identifiers include requesting the postal address for a telephone inquiry or, as recital 57 states, the use of an “authentication mechanism such as the same credentials, used by the data subject to log-in to the online service offered by the data controller.”
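The “authentication mechanism” recital 57 points to can be sketched as follows: the controller verifies an access or deletion request with the credentials the data subject already uses for its online service, instead of demanding an ID scan. The account store and password-hashing setup here are assumptions for illustration:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a key from the password with PBKDF2 (100,000 iterations)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Hypothetical account store, as a newspaper subscriber database might hold.
SALT = os.urandom(16)
ACCOUNTS = {"subscriber42": hash_password("correct horse", SALT)}

def verify_data_subject(username: str, password: str) -> bool:
    """Authenticate an access/deletion request with existing credentials."""
    stored = ACCOUNTS.get(username)
    if stored is None:
        return False
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(stored, hash_password(password, SALT))
```

Verifying the request against an existing login collects no new personal data at all, which is exactly why the ap considers it the milder, preferable identifier.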
