Category: Data Protection

CNIL Plan for AI

19. May 2023

On May 16, 2023, the French Data Protection Authority (CNIL) released a statement outlining its artificial intelligence policy, known as the AI Action Plan. This strategy builds on the CNIL’s prior work in the field of AI and contains a number of projects aimed at encouraging the adoption of AI systems that respect people’s right to privacy.

The four key goals of the AI Action Plan are as follows:

Increasing awareness of AI systems and how they affect people: The newly created artificial intelligence service at the CNIL will prioritize addressing critical data protection issues related to the development and use of AI applications. These issues include preventing the unlawful scraping of publicly accessible online data, securing data transmitted by users within AI systems, and guaranteeing users’ rights over their data with regard to AI training datasets and generated outputs.

Directing the creation of AI that respects privacy: The CNIL will publish guidelines and best practices on a variety of AI topics in order to support organizations engaged in AI innovation and prepare them for the eventual adoption of the EU AI Act. Alongside guidance on the development of generative AI systems, this will include a detailed manual on the rules governing data sharing and reuse.

Supporting innovative actors in the French and European AI ecosystem: The CNIL aims to promote innovation within the AI ecosystem while prioritizing the defense of fundamental rights and freedoms in France and Europe. As part of this endeavour, the CNIL intends to issue a call for projects inviting participation in its 2023 regulatory sandbox. It also aims to foster more communication among academic groups, R&D facilities, and businesses engaged in the development of AI systems.

Auditing and controlling AI systems: The CNIL will develop an auditing tool specifically designed for evaluating AI systems. It will also continue to investigate AI-related complaints brought to its attention, especially those involving generative AI.

Montana and Tennessee join Indiana and Iowa as the latest states to pass comprehensive data protection laws

28. April 2023

Montana and Tennessee both passed comprehensive privacy bills in their state legislatures on April 21st, making them the latest additions to the list of states that have enacted privacy laws this year, alongside Indiana and Iowa.

Iowa Data Privacy Act (IDPA)

Iowa joined Connecticut, Utah, Virginia, Colorado, and California on March 29th as the sixth state to approve a comprehensive privacy law. The law will become effective on January 1st, 2025, which provides organizations with 21 months to meet the new requirements. Even though the law shares several similarities with other state privacy laws, organizations need to pay attention to a few distinctions as they broaden their compliance efforts across the United States.

The Iowa Data Privacy Act (IDPA) applies to businesses that operate in Iowa or target products or services to Iowa consumers and that either control or process personal data of at least 100,000 Iowa consumers, or control or process personal data of at least 25,000 Iowa consumers and derive over 50% of gross revenue from the sale of that data. The IDPA’s definition of a “consumer” covers natural persons who are Iowa residents acting in a personal (noncommercial and nonemployment) context, and excludes employees and B2B contacts. The IDPA imposes obligations on data controllers, such as limiting the purposes for which personal data is processed, implementing reasonable safeguards, refraining from discrimination, being transparent in their privacy notice, and ensuring that contracts govern their relationships with processors. It provides Iowa consumers with opt-out, deletion, access, appeal, and data portability rights. Sensitive personal information includes racial/ethnic origin, religious beliefs, and geolocation data, among others, and controllers must provide clear notice and an opportunity to opt out of nonexempt processing.

The Iowa Attorney General has exclusive enforcement authority, and the IDPA does not allow for a private right of action.

Indiana Bill on Consumer data protection

Indiana is set to become the seventh state to enact a comprehensive privacy law when Senate Bill No. 5 is signed by Governor Eric Holcomb. The law goes into effect on January 1, 2026.

The Indiana privacy law applies to businesses that process the personal data of at least 100,000 Indiana residents, or of at least 25,000 Indiana residents if they generate more than 50% of their gross revenue from the sale of personal data. Certain entities and data are exempt from the law. The law requires businesses to provide consumers with a clear and meaningful privacy notice and gives consumers the right to confirm, access, correct, delete, and port their personal data. Consumers can also opt out of the processing of their personal data for targeted advertising, the sale of personal data, or profiling that produces significant effects. There is no private right of action, and businesses have a 30-day cure period for alleged violations. The Indiana privacy law is similar to other comprehensive state privacy laws, such as the Virginia Consumer Data Protection Act.

Montana Consumer Data Privacy Act (MCDPA)

After passing both houses of the Montana legislature, the Montana Consumer Data Privacy Act (MCDPA) now awaits Governor Greg Gianforte’s signature. The MCDPA is similar to the laws in Connecticut and Virginia, suggesting that these models are becoming the foundation for other state privacy laws concerning consumers.

The Montana Consumer Data Privacy Act (MCDPA) applies to companies that do business in Montana and that either control or process personal data of at least 50,000 Montana consumers, or control or process personal data of at least 25,000 Montana consumers and derive over 25% of gross revenue from the sale of that data. “Consumer” is defined as a natural person who is a resident of Montana acting in a personal context. Personal data includes information that is linked or reasonably linkable to an identified or identifiable individual. Sensitive data includes information about a person’s racial/ethnic origin, religion, health diagnosis, sex life, sexual orientation, citizenship, immigration status, and genetic or biometric information. Companies must provide a standard set of consumer rights, including the right to opt out of the sale of personal data, deletion rights, access rights, correction rights, appeal rights, opt-in consent for advertising and targeted marketing to individuals aged 13 to 16, and data portability rights. Sensitive data may not be processed without the consumer’s consent or, in the case of a child, without complying with COPPA. The MCDPA requires controllers to limit the purpose of processing personal data to what is reasonably necessary and proportionate, implement reasonable safeguards for the personal data within their control, refrain from discriminating against consumers for exercising their rights, and be transparent in their privacy notice.

The Montana Attorney General has exclusive enforcement authority, and there is no private right of action. The MCDPA will go into effect on October 1, 2024.

Tennessee Information Privacy Act (TIPA)

If Governor Bill Lee approves, Tennessee will soon join the states with comprehensive privacy laws with the implementation of the Tennessee Information Privacy Act (TIPA). The TIPA largely follows the model of California’s CCPA, but with one notable exception.

TIPA applies to companies doing business in Tennessee or targeting products or services to Tennessee residents that process the personal information of at least 100,000 consumers, or of at least 25,000 consumers if they derive more than 50% of their gross revenue from the sale of personal information. Compliance with CCPA obligations will likely result in compliance with TIPA, subject to the additional obligations relating to the NIST Privacy Framework, which organizes privacy risk management around the functions of identifying, governing, controlling, communicating, and protecting. Failure to comply with TIPA may result in penalties of up to $15,000 per violation, enforced by the Tennessee Attorney General.
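To make the different applicability triggers easier to compare, here is a minimal illustrative sketch in Python that encodes only the consumer-count and revenue-share thresholds as summarized in this post. It is an assumption-laden simplification, not legal advice: each statute also contains exemptions, definitions, and in some cases additional criteria that are not reflected here, and the entity names and function are purely illustrative.

```python
# Simplified applicability check for the four state privacy laws summarized above.
# All figures come from the summaries in this post; the real statutes add
# exemptions, definitions, and further conditions that are omitted here.

STATE_THRESHOLDS = {
    # state: general consumer-count trigger, and the lower-count + revenue-share trigger
    "Iowa (IDPA)":      {"consumers": 100_000, "alt_consumers": 25_000, "alt_revenue_share": 0.50},
    "Indiana":          {"consumers": 100_000, "alt_consumers": 25_000, "alt_revenue_share": 0.50},
    "Montana (MCDPA)":  {"consumers": 50_000,  "alt_consumers": 25_000, "alt_revenue_share": 0.25},
    "Tennessee (TIPA)": {"consumers": 100_000, "alt_consumers": 25_000, "alt_revenue_share": 0.50},
}


def law_applies(state: str, consumers_processed: int, revenue_share_from_sales: float) -> bool:
    """Return True if either applicability trigger described in the post is met."""
    t = STATE_THRESHOLDS[state]
    meets_general = consumers_processed >= t["consumers"]
    meets_alternative = (
        consumers_processed >= t["alt_consumers"]
        and revenue_share_from_sales > t["alt_revenue_share"]
    )
    return meets_general or meets_alternative


if __name__ == "__main__":
    # A business processing data of 60,000 consumers with 30% of revenue from data
    # sales would meet Montana's lower 50,000-consumer trigger, but not Iowa's.
    print(law_applies("Montana (MCDPA)", 60_000, 0.30))  # True
    print(law_applies("Iowa (IDPA)", 60_000, 0.30))      # False
```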

Outlook

Several states are currently working on passing their own comprehensive consumer privacy bills this year, and more specialized privacy laws are also planned. For example, there are proposed laws focused on children, social media (such as Utah’s Social Media Regulation Act), and health information not covered by HIPAA (such as Washington’s My Health My Data Act). In addition, draft legislation for a comprehensive data protection law remains pending at the federal level.

For the US Congress, privacy is top of mind

3. March 2023

The lack of comprehensive federal privacy legislation in the United States continues to be a cause of concern for many, as consumers and industry struggle with the growing patchwork of state laws. With the rise of data breaches, hacking, and other cyber threats, individuals are rightly concerned about the security and privacy of their personal information. As a result, lawmakers in the United States have introduced several data protection bills that could get a second look in Congress.

Several data protection bills

The “Health Data Use and Privacy Commission Act”, sponsored by Senator Bill Cassidy, aims to establish a blue-ribbon panel to recommend changes to health privacy laws. This bill seeks to address the growing concerns about the collection, use, and dissemination of personal health data. The panel would be tasked with evaluating current laws and regulations, identifying gaps and weaknesses, and recommending changes to ensure that individuals’ health data is adequately protected.

The “My Body, My Data Act” would create a new national standard to protect personal reproductive health data. By minimizing the personal reproductive health data that is collected and retained, the bill would prevent this information from being disclosed or misused.

The “Data Care Act” would require websites, apps, and other online providers to take reasonable steps to safeguard personal information and to prevent the misuse of users’ data. The bill seeks to hold companies accountable for their data practices, prevent them from using personal data in ways that could lead to harm, and require them to disclose how they use and share consumer data.

A national data protection framework remains the main goal

The “American Data Privacy and Protection Act” (ADPPA) was proposed last year, and while it failed to make it to the House floor, it remains the preferred framework for addressing current regulatory shortcomings. The latest Congressional hearing dedicated to privacy, hosted by the House Committee on Energy and Commerce’s new Subcommittee on Innovation, Data and Commerce, discussed the need for comprehensive federal legislation and confirmed that the ADPPA is the only framework being considered at this time.

The hearing also highlighted the industry benefits of a national standard, particularly for small and medium-sized businesses, which are struggling to keep up with the growing patchwork of state privacy laws. Federal preemption remains a point of contention in ADPPA talks, with several states rejecting the proposed preemption last year, most notably California.

The subcommittee also focused on the need to regulate the growing data broker industry, which was characterized as a “multibillion-dollar economy selling consumers’ data with virtually no restrictions or oversight.” The ADPPA carries important provisions on broker disclosure and user opt-out obligations, which are designed to increase transparency and give consumers greater control over their data.

Outlook

The lack of comprehensive federal privacy legislation in the United States continues to be a concern for consumers and industry. As technology advances and new threats emerge, it is essential that lawmakers in the United States take proactive steps to ensure that individuals’ right to privacy is protected. By passing these bills, Congress can help establish a framework for data protection that safeguards individuals’ personal information and prevents abuses of data use. To date, data protection in the United States has primarily been driven at the state level: California, Colorado, Connecticut, Virginia, and Utah have recently enacted comprehensive data privacy laws. The ADPPA remains the preferred framework for addressing current regulatory gaps, and there are growing calls for a national standard to avoid the problems that arise from a growing patchwork of state privacy laws. While federal preemption remains a point of contention, there are hopes that new Republican leadership could improve the odds of the ADPPA making it to the floor in 2023.

Irish DPA did not investigate Facebook with “due diligence”

17. January 2023

On January 12th, 2023, the European Data Protection Board (EDPB) issued a decision criticizing the Irish Data Protection Commissioner’s attempt to narrow the scope of an investigation into Facebook (part of the American tech giant Meta Inc.).

Furthermore, the EDPB found that the Commissioner had ignored a key element of a complaint filed in Austria in 2018: Meta Inc. had adapted its terms and conditions to the new GDPR rules in order to comply with the European regulation. As a result, user consent to the new terms became a requirement for continued use of the service.

The complaint argued that this could amount to forced consent. However, the Data Protection Commissioner disagreed, holding that the tech company could rely on the argument that it is fulfilling a contract with its users to provide personalized ads, while finding only a breach of transparency obligations.

The EDPB ordered the Commissioner to reverse its legal position on Meta Inc.’s data collection and processing, as the contractual basis relied on for this data collection breached EU law.

Furthermore, the EDPB stated that the Irish Data Protection Commission had failed to clearly establish the legal basis for the data collection in general, and had also failed to investigate specific concerns relating to sensitive data.

FCC proposes updated data breach reporting requirements

10. January 2023

In the first week of January 2023, the Federal Communications Commission voted on a Notice of Proposed Rulemaking, which passed with four votes in favour and none opposed, in order to strengthen the Commission’s rules for notifying customers and federal law enforcement agencies of breaches of customer proprietary network information (CPNI).

The proposal follows a wave of new legislation regarding the right to privacy and personal data protection at both the state and federal level across the U.S.

One of the most relevant proposals contained in the document is to eliminate the current mandatory seven-day waiting period for notifying customers of a data breach.

The FCC’s Chairwoman, Jessica Rosenworcel, stated that the rules applied until now need to be updated. The FCC will open a formal comment phase to gather more information on how to implement the proposed changes and will also take into account comments made by the FCC Commissioners.


French DPA fines phone operator for various violations of the GDPR

After receiving several complaints, in November 2022 the French Data Protection Authority (CNIL) decided to impose a fine of 300,000 Euros on the French phone operator FREE for several violations of the rules contained in the GDPR.

In particular, findings included violations of:

  • Articles 12 and 21 GDPR, regarding transparent communication on how data subjects can exercise their rights, in particular the right to erasure.
  • Article 15 GDPR, regarding the right of access by the data subject.
  • Article 32 GDPR, regarding the security of personal data.
  • Article 33 GDPR, as FREE did not comply with the obligation to document a personal data breach.

As a consequence of these findings, the CNIL imposed the fine on FREE together with an order to comply with the GDPR’s rules on handling access and erasure requests and to demonstrate this compliance within three months of the decision, subject to a penalty of 500 Euros for each day of delay.


KINAST is ranked among the Top 5 of Data Protection Law Firms in Germany

28. October 2022

We are very pleased about our renewed top placement in this year’s ranking of the Kanzleimonitor* study 2022-23 and would like to thank all clients who recommended us!

In the field of Data Protection Law, we achieved 5th place with numerous direct recommendations. Our firm can thus once again hold its own in a strong field of competitors alongside various large law firms (including Taylor Wessing, Osborne Clarke) in the absolute top group in Data Protection Law.

Three of our Attorneys are also mentioned by name in the current ranking of personal recommendations: Kristin Bauer, Dr. Karsten Kinast and Benjamin Schuh.

We are particularly pleased with this study result, as it is a transparent, direct evaluation from our clients and is carried out by our own professional group of lawyers.

Many thanks again to all clients who have recommended us (again)!

*The German Kanzleimonitor study (law firm monitor) (“kanzleimonitor.de – recommendation is the best reference”) provides an annual comprehensive ranking of the 100 most recommended lawyers and law firms in each legal field in Germany. This overview is intended to serve corporate lawyers in all industries as a selection criterion for mandating commercial law firms.

Another 20 million Euro fine for Clearview AI

The French data protection authority CNIL imposed a fine of 20 million Euros on Clearview AI, making it the latest in a line of authorities to deem the biometrics company’s processing activities unlawful under data protection law.

Clearview AI is a US company that extracts photographs and videos that are directly accessible online, including on social media, in order to feed its biometric image database, which it claims to be the largest in the world. Access to the search engine built on this database is offered to law enforcement authorities.

The case

The decision followed several complaints from data subjects in 2020, which led to the CNIL’s investigations and a formal notice to Clearview AI in November 2021 to “cease the collection and use of data of persons on French territory in the absence of a legal basis” and “facilitate the exercise of individuals’ rights and to comply with requests for erasure.” However, the company did not react to this notice within the two-month deadline imposed by the CNIL. Therefore, the authority imposed not only the fine but also an order to Clearview AI “to stop collecting and processing data of individuals residing in France without a legal basis and to delete the data of these persons that it had already collected, within a period of two months.” In addition, it set a “penalty of 100,000 euros per day of delay beyond these two months.”

CNIL based its decision on three breaches. First, Clearview AI had processed the data without a legal basis. Given the “intrusive and massive nature of the process which makes it possible to retrieve the images present on Internet of the millions of internet users in France”, Clearview AI had no legitimate interest in the data processing. Second, the CNIL sanctioned Clearview AI’s inadequate handling of data subjects’ requests. Lastly, it penalized the company’s failure to cooperate with the CNIL.

The impact of the decision

For over two years, Clearview AI has been under the scrutiny of data protection authorities (“DPA”s) all over the world. So far, it has been fined more than 68 million Euros in total. Apart from CNIL’s fine, there have been fines of 20 million Euros by Greece’s Hellenic DPA in July 2022, over 7.5 million pounds by the UK Information Commissioner’s Office in May 2022 and 20 million Euros by the Italian Garante in March 2022.

CNIL’s decision is likely not the last one, considering the all-encompassing nature of Clearview AI’s collection of personal data, which, given the company’s business model, inevitably concerns EU data subjects. Whether the company will comply within the two-month period remains to be seen.

UN Report on privacy and data protection as an increasingly precious asset in the digital era

UN Special Rapporteur on the right to privacy Ana Brian Nougrères published a report in which she laid out ten guiding principles “as a key structural part of every national legal system that regulate the actions of controllers and processors in the processing of personal data”.

According to the Special Rapporteur, “privacy is a human right that enables the free development of personality and the exercise of rights in accordance with the dignity of the human being […]. But today, we live in a world where participating in public and private activity at the national and international level requires more and more personal data to be processed”. Her goal is to achieve “cooperation and regulatory harmonization at the international level”. While many States regulate data protection and privacy issues nationally, international law enshrines the right to privacy in Article 12 of the Universal Declaration of Human Rights. The Special Rapporteur indicated that national legislation already has much in common regarding the principles of privacy and data protection which can “serve as a basis for progressing towards a global consensus that will make it possible to address various challenges that arise in the processing and international transfer of data concerning individuals to ensure that their right to privacy is safeguarded in both virtual and face-to-face environments”.

The ten key principles analyzed are legality, consent, transparency, purpose, loyalty, proportionality, minimization, quality, responsibility, and security – hardly news from an EU perspective. This is not a coincidence, as the Special Rapporteur used several supranational legal frameworks, including the GDPR, as a base for her analysis. This shows once more that a solely Eurocentric view on privacy and data protection is ill-advised, as other parts of the world may not find the principles quite as self-evident. With her report, the Special Rapporteur wishes to encourage and guide States “to strike a balance between the different conflicting interests in the processing of personal data and the right to privacy in the global and digital era”.

Microsoft data leak allegedly affected over 65,000 entities worldwide

Sensitive customer data was openly accessible on the internet via an incorrectly configured Microsoft server. After security researchers from the threat intelligence firm SOCRadar informed the company about the data leak on September 24, 2022, the server was secured, Microsoft announced on October 19, 2022. 

According to Microsoft, an “unintentional misconfiguration on an endpoint that is not in use across the Microsoft ecosystem” “resulted in the potential for unauthenticated access to some business transaction data corresponding to interactions between Microsoft and prospective customers, such as the planning or potential implementation and provisioning of Microsoft services.” The business transaction data that was leaked included “names, email addresses, email content, company name, and phone numbers, and may have included attached files relating to business between a customer and Microsoft or an authorized Microsoft partner.” 

While SOCRadar claims that the breach affected data of over 65,000 entities in 111 countries and involves data from 2017 to 2022, Microsoft stated that the scope of the issue had been “greatly exaggerated”. Furthermore, Microsoft criticized SOCRadar’s release of a public search tool, suggesting that the tool does not meet basic data protection and privacy standards.

Whether those numbers were indeed exaggerated or if Microsoft is trying to downplay the breach is difficult to judge from the outside. 
