Category: General

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

7. July 2022

In recent decades, AI has developed impressively across various fields. At the same time, with each step forward, the new machines and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is: how can the rise of AI be reconciled with the principle of data minimization contained in Art. 5 para. 1 lit. c) GDPR? At first glance, reconciliation seems impossible: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Parliament suggests that, given the leeway conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures such as pseudonymization. The data collected by the AI is then stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
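As a minimal sketch of this idea (the field names, record layout and key handling below are invented for illustration, not taken from the study), pseudonymization can replace direct identifiers with a keyed hash, so that re-identification requires "additional information" – here, a secret key stored separately from the dataset:

```python
import hmac
import hashlib

# Secret key kept apart from the pseudonymized dataset; without it,
# the hashes cannot be attributed to a specific individual.
SECRET_KEY = b"keep-this-key-out-of-the-dataset"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def pseudonymize_record(record: dict) -> dict:
    """Strip direct identifiers from a record before it is fed to an AI system."""
    out = dict(record)
    for field in ("name", "email"):  # hypothetical identifier fields
        if field in out:
            out[field] = pseudonymize(out[field])
    return out

record = {"name": "Jane Doe", "email": "jane@example.com", "age_group": "30-39"}
print(pseudonymize_record(record))  # identifiers hashed, "age_group" retained
```

Because the same input always yields the same hash, records about the same person can still be linked for training purposes, while the risk to the individual is lowered.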

The main issue with the current legal framework of the European Union regarding personal data protection is that certain parts have been left vague, which also creates uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.

One of the main features of this Act is that it divides AI applications into three main risk categories:

  1. Unacceptable risk: such AI systems are prohibited (e.g. systems that violate fundamental rights).
  2. High risk: subject to specific regulation.
  3. Low or minimal risk: no further regulation.

Regarding high-risk AIs, the AIA foresees the creation of post-market monitoring obligations. If the AI in question violates any part of the AIA, it can then be forcibly withdrawn from the market by the regulator.

This approach has been welcomed by the Joint Opinion of the EDPB – EDPS, although the two bodies stated that the draft still needs to be more aligned with the GDPR.

Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the course of the next years as the proposal is undergoing the legislative processes of the EU.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny from the European Parliament and the Council of the European Union. Some amendments have already been formulated, and the draft is still under review by the Parliament. After the Act has passed this scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains to be answered: who shall oversee and control the Act’s implementation? It is foreseen that national supervisory authorities shall be established in each EU Member State. Furthermore, the AIA aims at establishing a special European AI Board made up of representatives of both the Member States and the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations, and to ensure the consistent application of the regulation throughout the EU.

European Data Protection Supervisor criticizes Amended Europol Regulation

30. June 2022

On June 27, 2022, the European Data Protection Supervisor (EDPS), an independent supervisory authority responsible for monitoring the processing of personal data by EU institutions and bodies, published a press release on its website criticizing the amended Europol Regulation that entered into force on June 28, 2022.

Unlike other EU institutions and bodies, Europol operates within an autonomous data protection framework included in the Europol Regulation. This means that only administrative personal data processed by Europol falls under the scope of the otherwise applicable Regulation 2018/1725 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data.

In general, Europol is equipped with broad and far-reaching competencies to process personal data. This is because Europol works closely with several actors, such as other EU Agencies, national Law Enforcement Agencies, third countries, and Interpol.

In a journal article, Teresa Quintel points out that “(…) Europol could theoretically retain all data in one single repository and carry out data mining for different types of LE-purposes, which provides Europol with a remarkably broad mandate to process personal data”.

The amendments to the Europol Regulation newly in force include the processing of large datasets as well as cooperation with private parties, meaning that Europol can now receive personal data from such third parties.

The EDPS also points to the fact that the amended regulation allows Europol to create and process large datasets of individuals who have no criminal link. This amendment contradicts an EDPS decision from December 2021 that ordered Europol to delete such data; as a consequence, that order has effectively been rendered obsolete. The Kinast privacy ticker blogged about this matter earlier this year.

The press release further reads: “The EDPS regrets that the expansion of Europol’s mandate has not been compensated with strong data protection safeguards that would allow the effective supervision of the Agency’s new powers.”


U.S. lawmakers unveil bipartisan Data Privacy and Protection Act

In early June, three of the four chairs of the U.S. congressional committees responsible for data privacy submitted a draft American Data Privacy and Protection Act (ADPPA) for consideration. If passed, it would override certain recently enacted privacy laws in some U.S. states.

The draft includes elements of the California Consumer Privacy Act and the European General Data Protection Regulation.

States led the way

Until now, data protection in the United States has primarily been at the top of the agenda at the state level. California, Colorado, Connecticut, Virginia and Utah have recently enacted comprehensive data privacy laws. This year alone, more than 100 privacy bills have already been introduced in the states. Although not all of these were adopted, the proliferation of state laws and their varying regulatory requirements has led to increasing calls for the adoption of a federal privacy law. A unified federal law, if passed, would provide much-needed clarity to entities and businesses and, ideally, would also stem the tide of class action and other privacy lawsuits brought under various state laws.

Affected Entities

The ADPPA broadly applies (with exceptions) to organizations operating in the United States that collect, process, or transfer personal information and fall into one of the following categories:

  • Entities subject to the Federal Trade Commission Act
  • Nonprofit organizations
  • So-called common carriers, subject to Title II of the Communications Act of 1934

Requirements of the ADPPA (not final)

  • Limit data collection and processing to that which is reasonably necessary
  • Compliance with public and internal privacy regulations
  • Granting consumer rights such as access, correction, and deletion
  • Appeal options
  • Obtaining consent before collecting or processing sensitive data, e.g. geolocation, genetic and biometric information, and browsing history
  • Appointment of a data protection officer
  • Providing evidence that adequate safeguards are in place
  • Registration of data brokers with the Federal Trade Commission (FTC)
  • FTC will establish and maintain a searchable, centralized online public registry of all registered data brokers, as well as a “Do Not Collect” registry that will allow individuals to request that all data brokers delete their data within 30 days
  • Entities shall not collect, process, or transfer collected data in a manner that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, or disability
  • Implement appropriate administrative, technical, and physical data security practices and procedures to protect covered data from unauthorized access and disclosure

Outcome still uncertain

Shortly after a draft of the ADPPA was released, privacy organizations, civil liberties groups, and businesses spoke out, taking sides for and against the law.

As the legislative session draws to a close, the prospects for ADPPA’s adoption remain uncertain. Strong disagreement remains among key stakeholders on important aspects of the proposed legislation. However, there is consensus that the United States is in dire need of a federal privacy law. Thus, passage of such legislation is quite likely in the foreseeable future.

Connecticut enacts privacy law

3. June 2022

On May 10, 2022, Connecticut Gov. Ned Lamont signed the Connecticut Data Privacy Act (“CTDPA”), an act concerning personal data privacy and online monitoring. The passage of the CTDPA continues the trend in the U.S. for states to individually address consumer rights and business obligations relating to consumer data, in the absence of uniform legislation from the U.S. Congress. This makes Connecticut the fifth state in the United States to pass a comprehensive data privacy law.

The CTDPA shares many similarities with the California Consumer Privacy Act (“CCPA”), Colorado Privacy Act (“CPA”), Virginia Consumer Data Protection Act (“VCDPA”) and Utah Consumer Privacy Act (“UCPA”). The Connecticut Privacy Act applies to “personal data”, defined as “any information that is linked or reasonably linkable to an identified or identifiable individual,” not including de-identified data or publicly available information. It imposes obligations on both controllers and processors of personal data.

Who does the Connecticut Privacy Act apply to?

The law will apply to individuals and entities that

  • conduct business in Connecticut.
  • produce products or services that are targeted to Connecticut residents.
  • during the preceding calendar year, either controlled or processed the personal data of at least 100,000 consumers (excluding for the purpose of completing a payment transaction) or controlled or processed the personal data of at least 25,000 consumers and derived more than 25% of their gross revenue from the sale of personal data.

Certain entities are exempted, for example:

  • State and local government entities
  • Nonprofits
  • Higher education institutions
  • Financial institutions subject to the Gramm-Leach-Bliley Act (“GLB”)
  • Entities and business associates subject to the Health Insurance Portability and Accountability Act (“HIPAA”)

Consumers will have the right to:

  • access – the right to know what personal data a company has collected about them
  • correct inaccuracies in the consumer’s personal data
  • delete personal data provided by, or obtained about, the consumer
  • obtain a copy of the consumer’s personal data processed by a controller, in a portable and, to the extent technically feasible, readily usable format
  • opt out of the processing of their personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer

Among other obligations, controllers will be required to:

  • limit the use of the personal data to only the purpose of the collection (“what is adequate, relevant and reasonably necessary”) or as the consumer has authorized
  • establish, implement and maintain reasonable administrative, technical and physical data security practices
  • not process a consumer’s personal data for purposes of targeted advertising where the consumer has opted out
  • obtain consent before processing sensitive data, including data of any individual under the age of 13, and follow the provisions of the Children’s Online Privacy Protection Act

The Connecticut Privacy Act is set to become effective on July 1, 2023. Violation of the CTDPA may result in an enforcement action by the Connecticut Attorney General (AG), who can levy fines and penalties under the Connecticut Unfair Trade Practices Act. However, until December 31, 2024, there is a grace period during which the AG must provide organizations an opportunity to cure any alleged violations before bringing an enforcement action.

Like other US data privacy laws, the Connecticut law is not as comprehensive as the EU’s GDPR, but it aligns more closely with some of the GDPR’s definitions and especially with its consent mechanisms.

UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country could still be in compliance with the General Data Protection Regulation (GDPR).

Other than the statement itself, not much information was provided regarding the specific details; the accompanying briefing notes were more informative. They set out the main purposes of the Bill, namely to:

  • establish a new pro-growth and trusted data protection framework
  • reduce the burdens on business
  • create a world-class data rights regime
  • support innovation
  • drive industry participation in schemes which give citizens and small businesses more control of their data, particularly in relation to health and social care
  • modernize the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the defined goals remain rather superficial. Another concern is that the new bill could deviate too far from the GDPR: the new regime might not retain the adequacy status with the EU that currently allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

ECJ against data retention without any reason or limit

6. April 2022

In its judgment of April 5, 2022, announced by press release, the ECJ has once again ruled that the collection of private communications data without any reason or limit is unlawful. This reinforces the rulings of 2014, 2016 and 2020, according to which changes are necessary at EU and national level.

In this judgment, the ECJ states that it is for the national court in Ireland to decide whether data retained in this way may be admitted as evidence in a long-standing murder case.

Questions regarding this issue were submitted in 2020 by Germany, France and Ireland. The EU Advocate General confirmed, in a legally non-binding manner, the incompatibility of national laws with EU fundamental rights.

However, a first exception to data retention resulted from the 2020 judgment, according to which, in the event of a serious threat to national security, storage for a limited period and subject to judicial review was recognized as permissible.

Subsequently, a judgment in 2021 stated that national law must provide clear and precise rules with minimum conditions for the purpose of preventing abuse.

According to the ECJ, storage without cause, subject to restrictions, should be allowed in the following cases:

  • when storage is limited to specific individuals or locations;
  • where no concrete evidence of a crime exists but the local crime rate is sufficiently high;
  • at frequently visited locations such as airports and train stations;
  • when national laws require the identity of prepaid cardholders to be stored;
  • as a “quick freeze”, i.e. an immediate backup and temporary storage of data where a crime is suspected.

All of these are to be used only to combat serious crime or prevent threats to national security.

In Germany, Justice Minister Marco Buschmann is in favor of a quick freeze solution as an alternative that preserves fundamental rights. However, the EU states are to work on a legally compliant option for data retention despite the ECJ’s criticism of this principle.

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

29. March 2022

On March 25, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that is intended to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been a strenuous work in the making and reflects more than a year of detailed negotiations between the US and EU led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US, and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will form part of the new two-tier redress system, which will allow EU citizens to raise complaints about access to their data by US intelligence authorities and is intended to investigate and resolve those complaints.

The US’ commitments will be concluded by an Executive Order, which will form the basis of the European Commission’s adequacy decision to put the new framework in place. While this represents the quicker route to the goal, it also means that the Executive Order could easily be repealed by the next US administration. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows.

Belgian DPA declares technical standard used for cookie banner for consent requests illegal

28. March 2022

In a long-awaited decision on the Transparency and Consent Framework (TCF), the Belgian data protection authority APD concludes that this technical standard, which advertisers use to collect consent for targeted advertising on the Internet, does not comply with the principles of lawfulness and fairness. Accordingly, it violates the GDPR.

The APD’s decision is aligned with other European data protection authorities and has consequences for cookie banners and behavioral online advertising in the EU. The advertising association IAB Europe, which develops and operates the TCF system, must now delete the personal data collected in this way and pay a fine of 250,000 euros. In addition, the decision sets out conditions under which the advertising industry may continue to use the TCF at all.

Almost all companies, including advertising companies such as Google or Amazon, use the mechanism to pass on users’ presumed consent to the processing of their personal data for personalized advertising purposes. This decision will have a major impact on the protection of users’ personal data. This is also confirmed by Hielke Hijmans from APD.

The basic structure of the targeted advertising system is that each visit to a participating website triggers an auction among the providers of advertisements. Based on the prices offered and the user’s data profile, among other things, a decision is made in milliseconds as to which advertisements the user will see. For this real-time bidding (RTB) to work, the advertising companies collect data to compile target groups for ads.
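The auction logic described above can be sketched as follows. This is a toy model for illustration only, not the actual OpenRTB protocol; the bidder names, prices and audience segments are invented:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Set

@dataclass
class Bid:
    advertiser: str
    price_cpm: float  # price offered per thousand impressions
    target_segments: Set[str] = field(default_factory=set)

def run_auction(bids: List[Bid], user_segments: Set[str]) -> Optional[Bid]:
    """Pick the highest-paying bid whose targeting overlaps the user's profile."""
    eligible = [b for b in bids if b.target_segments & user_segments]
    return max(eligible, key=lambda b: b.price_cpm, default=None)

bids = [
    Bid("shoes-brand", 2.50, {"sports", "fashion"}),
    Bid("car-brand", 4.00, {"automotive"}),
]
# The user's data profile, compiled from collected data, decides eligibility.
winner = run_auction(bids, user_segments={"sports", "travel"})
print(winner.advertiser)  # → shoes-brand (car-brand does not match the profile)
```

The sketch shows why the user's data profile is commercially valuable: without the segment data, no bid can be matched to the visitor at all.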

If users accept cookies, or do not object to the claim that the use of their data is in the legitimate interest of the provider, the TCF generates a so-called TC string, which contains information about these consent decisions. This identifier forms the basis for the creation of individual profiles and for the auctions in which advertising spaces, and with them the attention of the desired target group, are auctioned off; it is forwarded to partners in the OpenRTB system.
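To make the idea of such a consent string concrete, here is a deliberately simplified scheme, not the actual IAB TCF encoding; the purpose names are invented. Per-purpose consent decisions are packed into a compact, URL-safe token that can be forwarded to every auction partner:

```python
import base64

# Hypothetical processing purposes; the real TCF defines its own numbered list.
PURPOSES = ["store_data", "personalized_ads", "measurement", "profiling"]

def encode_consent(decisions: dict) -> str:
    """Pack boolean per-purpose decisions into a URL-safe token."""
    bits = "".join("1" if decisions.get(p, False) else "0" for p in PURPOSES)
    return base64.urlsafe_b64encode(bits.encode("ascii")).decode("ascii")

def decode_consent(token: str) -> dict:
    """Recover the per-purpose decisions from the token."""
    bits = base64.urlsafe_b64decode(token).decode("ascii")
    return {p: bit == "1" for p, bit in zip(PURPOSES, bits)}

token = encode_consent({"store_data": True, "measurement": True})
print(token)
print(decode_consent(token))
```

The real TC string is a bit-packed structure defined by the IAB specification, but the principle is the same: a short opaque token that every OpenRTB partner can decode back into the user's recorded choices.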

According to the authority, the TC strings already constitute personal data because, combined with the IP address and the cookies set by the TCF, they enable users to be identified. In addition, IAB Europe is held to be jointly legally responsible for any data processing via the framework, although IAB Europe has positioned itself not as a data processor but only as the provider of a standard.

The TCF envisions advertising providers invoking a “legitimate interest” in data collection in the ubiquitous cookie banners, rather than asking for consent. According to the authority, this practice must be prohibited for the framework to be lawful. The principles of privacy by design and by default are also violated, since consent is extracted through design tricks, the data flows are not manageable, and revocation of consent is hardly possible.

Google to launch Google Analytics 4 with aim to address EU Data Protection concerns

24. March 2022

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4”. Among other things, “Google Analytics 4” aims to address the most recent data protection developments regarding the use of analytical cookies and the transfers tied to such processing.

The announcement of this new launch follows the 101 complaints that the non-governmental organization None of Your Business (NOYB) filed with the data protection authorities (DPAs) of 30 EEA countries. Assessing data transfers from the EU to the US through the use of Google Analytics in light of the Schrems II decision of the CJEU, the French and Austrian DPAs ruled that the transfer of EU personal data to the U.S. through the use of Google Analytics cookies is unlawful under the GDPR.

In the press release, Google states that “Google Analytics 4 is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

However, the most important change that the launch of “Google Analytics 4” brings to the processing of personal data is that it will no longer store users’ IP addresses. This will limit the data processing and resulting transfers that Google Analytics was under scrutiny for in the EU; however, it is unclear at this point whether the EU DPAs will change their opinion on the use of Google Analytics with this new version.
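For context, earlier Google Analytics versions offered an IP-anonymization option that truncated addresses before storage. The general technique, sketched here as an illustration rather than Google's actual implementation, zeroes the host part of the address so that the stored value no longer points to a single connection:

```python
import ipaddress

def truncate_ip(addr: str) -> str:
    """Zero the host part: the last octet for IPv4, the last 80 bits for IPv6."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(truncate_ip("198.51.100.42"))        # → 198.51.100.0
print(truncate_ip("2001:db8:abcd:12::1"))  # → 2001:db8:abcd::
```

The truncated address still supports coarse geolocation and traffic statistics, but can no longer single out one household's connection, which is why truncation was a common mitigation before IP storage was dropped entirely.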

According to the press release, the current Google Analytics will be discontinued starting July 2023, and Google recommends that companies move to “Google Analytics 4” as soon as possible.

Dutch data protection authority imposes fine of €525,000

23. March 2022

The Dutch Data Protection Authority, Autoriteit Persoonsgegevens (hereinafter “AP”), imposed a fine of €525,000 on DPG Media at the beginning of March.

The background to the fine were access and deletion requests from various data subjects who had a newspaper subscription or received increased advertising. If data subjects wanted to know what personal data the company had collected about them, they had to send an ID document to DPG Media to prove their identity. The same applied to anyone who asked the company to delete their data. The customer was expected either to upload a scan of the ID document or to send it to the company by post.

DPG Media’s procedure for proof of identity was criticized for several reasons. From the AP’s point of view, too much data was requested, and it was made too difficult for data subjects to assert their rights to access and deletion. Even if DPG Media had requested redacted ID documents, this method of proof of identity would still have been questionable: the AP emphasizes that requesting even redacted ID documents is often disproportionate.

The AP also notes that ID documents are particularly worthy of protection. Especially with regard to possible identity theft, they must be handled very carefully.

Thus, the AP clarifies that, even if an identification document is in principle suitable for identifying the data subject, less intrusive identifiers should be used in preference. Milder, but in this specific case equally suitable, identifiers include, for example, requesting the postal address for a telephone inquiry or – as recital 57 GDPR states – using an “authentication mechanism such as the same credentials, used by the data subject to log-in to the online service offered by the data controller.”
