Tag: CNIL

CNIL plans to start enforcement on Ad Tracker Guideline

7. April 2021

Starting April 1st, 2021, the French supervisory authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), plans to begin enforcing its rules on ad tracker usage across the internet.

Following its Ad Tracker Guideline, the CNIL gave companies a time frame to adjust their ad tracker usage and ensure compliance with the Guideline as well as the GDPR. This grace period ended on March 31st, 2021.

The new rules on cookies and ad trackers mainly revolve around the user’s ability to give active, free and informed consent. User consent for advertising cookies must be granted by a “clear and positive act”, such as clicking an “I accept” button; consent can no longer be inferred from the user simply continuing to use the website.

In addition, cookie banners must not only give the option to accept, they also have to give the option to reject. Rejecting cookies must be as simple and easy as accepting them. Pointing users to “Cookie Options” is no longer a valid form of rejection, as it forces them through an extra step that may dissuade them from rejecting cookies. Rejecting cookies by closing the cookie banner remains a valid option, but it must be ensured that, unless cookies are indeed accepted, only essential cookies are activated.

Lastly, the cookie banner has to provide brief information on how the cookies are used. The CNIL’s Guideline allows more detailed information to be linked from the banner; however, companies should also include a short description in the banner itself in order to obtain “informed” consent.
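
The practical effect of these rules can be sketched in a few lines of TypeScript. This is only an illustrative sketch, not CNIL-endorsed code: the cookie name, lifetime and function names are assumptions. It shows the required behaviour that non-essential cookies are only set after an explicit accept, rejection takes a single click, and closing the banner leaves everything except essential cookies off.

```typescript
// Illustrative sketch only: names, cookie and lifetime are assumptions.
type ConsentChoice = "accepted" | "rejected" | "none"; // "none" = banner closed or ignored

let advertisingConsent: ConsentChoice = "none";

function onAcceptClicked(): void {
  advertisingConsent = "accepted"; // the "clear and positive act"
  activateAdvertisingCookies();
}

function onRejectClicked(): void {
  advertisingConsent = "rejected"; // rejecting is as easy as accepting: one click
}

function onBannerClosed(): void {
  // Closing the banner is treated like a rejection: nothing non-essential is set.
  advertisingConsent = "none";
}

function activateAdvertisingCookies(): void {
  if (advertisingConsent !== "accepted") {
    return; // essential cookies only until explicit consent is given
  }
  // Hypothetical advertising cookie with an illustrative lifetime.
  document.cookie = "ad_tracking=enabled; Max-Age=15552000; path=/";
}
```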

At the beginning of March, the CNIL announced that “compliance with the rules applicable to cookies and other trackers” would be one of its three priorities for 2021, along with cybersecurity and the protection of health data. As a first step toward that goal, the CNIL will now begin to conduct checks to ensure websites comply with the ad tracker guidelines.

It is expected that companies that did not adjust their cookie and ad tracker usage will face fines proportionate to their level of non-compliance.

GDPR fines and data breach reports increased in 2020

12. February 2021

In 2020, a total of € 158.5 million in fines was imposed, research by DLA Piper shows. This represents a 39% increase compared to the preceding 20-month period during which the GDPR had been in force since May 25th, 2018.

Since that date, a total of € 272.5 million in fines has been imposed across Europe under the General Data Protection Regulation (“GDPR”). Italian authorities imposed a total of € 69.3 million, German authorities € 69.1 million, and French authorities € 54.4 million. This calculation does not include the two fines against Google LLC and Google Ireland Limited totalling € 100 million (€ 60 million + € 40 million) and the fine of € 35 million against Amazon Europe Core issued by the French data protection authority “Commission nationale de l’informatique et des libertés” (“CNIL”) on December 10th, 2020 (please see our respective blog post), as proceedings on these fines are pending before the Conseil d’État.

A total of 281,000 data breaches were reported during this period, although the countries that imposed the highest fines were not necessarily those where the most data breaches were reported. While Germany and the UK can be found near the top of both lists (77,747 data breaches were reported in Germany, 30,536 in the UK and 66,527 in the Netherlands), only 5,389 data breaches were reported in France and just 3,460 in Italy.

Although the biggest fine imposed to date is still the € 50 million fine issued by the CNIL against Google LLC in January 2019 (please see our respective blog post), a number of high-profile fines were imposed in 2020, with six of the top ten all-time fines being issued in 2020 and one in 2021.

1. H&M Hennes & Mauritz Online Shop A.B. & Co. KG was fined € 35 million for monitoring several hundred employees (please see our respective blog post).

2. TIM (Italian telecommunications operator) was fined € 27 million for making unwanted promotion calls.

3. British Airways was fined € 22 million for failing to protect the personal and financial data of more than 400,000 customers (please see our respective blog post).

4. Marriott International was fined € 20 million for a data breach affecting up to 383 million customers (please see our respective blog post).

5. Wind Tre S.p.A. was fined € 17 million for unsolicited marketing communications.

A comparison of the highest fines shows that most of them were imposed due to an insufficient legal basis for the processing of personal data (Art. 5 & 6 GDPR) or due to insufficient technical and organizational measures to ensure an appropriate level of security (Art. 32 GDPR).

While the European authorities have shown their willingness to enforce the GDPR rules, they have also shown leniency due to the impact that the COVID-19 pandemic has had on businesses. At least in part due to the impact of the pandemic, the penalties planned by the UK ICO were softened: a planned fine of € 205 million for British Airways was reduced to € 22 million, and a planned fine of € 110 million for Marriott International was reduced to € 20 million. GDPR investigations are also often lengthy and contentious, so the increased fines may in part be due to more investigations having had sufficient time to be completed. For example, the dispute over the above fines for British Airways and Marriott International had already started in 2019.

Not only the fines but also the number of data breach notifications increased in 2020. In 2020, 121,165 data breaches were reported, an average of 331 notifications per day, compared to 278 per day in 2019. In terms of reported data breaches per 100,000 inhabitants, there is a stark contrast between Northern and Southern European countries. In 2020, Denmark recorded 155.6 data breaches per 100,000 inhabitants, the Netherlands 150 and Ireland 127.8, while Greece, Italy and Croatia reported the lowest numbers of data breaches per inhabitant.

The trend shows that the GDPR is being taken more and more seriously by companies and authorities, and this trend is likely to continue as authorities become more confident in enforcing the GDPR. Fines are only likely to increase, especially as none of the fines imposed so far even come close to the maximum possible amount of 4% of a company’s global annual turnover. The figures also show that while the laws are in principle the same and are supposed to be applied the same in all EEA countries, nations have different approaches to interpreting and implementing them. In the near future, we can expect to see the first penalties resulting from the GDPR restrictions on data transfers to third countries, especially in the aftermath of the Schrems II ruling on data transfers to the USA.

CNIL fines Google and Amazon

10. December 2020

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), announced that it has fined the big tech companies Google and Amazon for violations of the GDPR and the French Data Protection Act.

Regarding Google, the CNIL announced financial penalties in a combined, record-breaking amount of € 100 million: € 60 million against Google LLC, the US-based parent company, and € 40 million against Google Ireland Limited, its Irish subsidiary. According to the CNIL’s statement, the fines are based on violations of the cookie requirements on the website google.fr. Following an online investigation conducted on March 16th, 2020, the CNIL considers it proven that Google “placed advertising cookies on the computers of users of the search engine google.fr, without obtaining prior consent and without providing adequate information”.

Besides the findings on cookies, the CNIL also criticizes a lack of information on the personal data processed and a partial failure of the opposition mechanism.

The high financial penalties are justified by the seriousness of the violation, the large number of data subjects concerned and the significant profits the companies derive from the advertisements.

The CNIL also took into account that this practice has no longer been in place since an update in September 2020, but noted that the newly implemented banner still does not allow users to understand the purposes for which the cookies are used and does not inform data subjects that they can refuse the cookies.

This is already the second financial penalty the CNIL has imposed on Google.

The CNIL also fined Amazon Europe Core € 35 million for violations in connection with cookies. The accusation is the same as with Google and is based on several investigations conducted between December 12th, 2019 and May 19th, 2020. The CNIL found that when a user visited the website, cookies were automatically placed on his or her computer without any action required on the user’s part, and several of these cookies were used for advertising purposes. A lack of information provided to users was also found.

In both cases, the CNIL justified the high penalties with the seriousness of the violations, the large number of data subjects concerned and the significant advertising profits of the companies.

First judicial application of Schrems II in France

20. October 2020

France’s highest administrative court (Conseil d’État) issued a summary judgment that rejected a request for the suspension of France’s centralized health data platform – Health Data Hub (HDH) – on October 13th, 2020. The Conseil d’État further recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub intended to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure. On April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The concern was that the hosting of data by a company subject to U.S. law entails data protection risks due to potential surveillance under U.S. national surveillance laws, as highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments on the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could, in theory, be subject to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be totally discounted that U.S. public authorities could request Microsoft and its Irish affiliate to access some of the data held in the HDH. However, the summary proceedings judge did not consider the CJEU’s ruling in the Schrems II case to also require examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates acting as data processors. EU law does not prohibit subcontracting the processing of personal data in the EU to U.S. companies. In addition, the Conseil d’État considered the alleged violation of the GDPR in this case to be purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The Conseil d’État concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

France’s supreme court, the Conseil d’État, restricts the CNIL’s Cookie Guidelines

22. June 2020

On June 19th, 2020, the French Conseil d’État ordered the Commission Nationale de l’Informatique et des Libertés (CNIL) to withdraw particular provisions of its Guidelines on cookies and other tracers, which the authority had published in 2019.

The Conseil d’État had received several complaints from businesses and professional associations, which turned to the court in order to have the CNIL’s Guidelines overturned.

The main focus of the decision was the ban on cookie walls. Cookie walls are cookie consent pages that deny users access to a website if they decline consent to the cookies used on it. In its 2019 Guidelines on cookies and other tracers, the CNIL had declared that such cookie walls were not in accordance with the principles of the General Data Protection Regulation (GDPR), causing many businesses to challenge this provision before the Conseil d’État.

In its decision on the matter, the Conseil d’État declared that the CNIL, whose competence in data protection guidance of this kind is only recommendatory, did not have the power to issue a ban on cookie walls in the Guidelines. The court focused on the fact that the CNIL’s guidelines are merely recommendations and therefore cannot impose such a binding provision.

However, in its decision, the court did not address whether a ban on cookie walls would in itself be lawful. The Conseil d’État refrained from making any substantive statement on the matter, leaving that question unanswered for the moment.

The Conseil d’État further stated in its decision that, for consent to processing activities to be free and informed, data subjects must indeed be informed individually about each processing activity and its purpose before giving consent. However, businesses have the latitude to decide whether they collect the data subject’s consent through a one-time, global consent accompanied by specifically individualized privacy information, or through individual consent for each processing activity.

In the rest of its decision, the Conseil d’État confirmed the remainder of the CNIL’s guidelines and provisions on the matter as lawful and applicable, giving the complainants only limited reason to rejoice.

CNIL publishes new Guidance on Teleworking

14. April 2020

The French Data Protection Authority (CNIL) released guidance on teleworking on April 1st, intended to help employers manage the new working situation. In particular, it is meant to clarify the IT requirements for ensuring a safe and well-functioning remote working environment.

In particular, the guidance touches on the following points as a basis for handling teleworking from an employer’s perspective:

  • Employers should formulate an IT charter or internal regulations on how to use the teleworking systems, to be followed by employees,
  • Necessary measures have to be taken in case the systems have to be changed or adapted to the new situation,
  • Employee workstations should meet the minimum requirements of a firewall, anti-virus software and a tool blocking access to malicious websites,
  • To avoid exposure on the internet and ensure security, the use of a VPN is recommended.

Furthermore, the CNIL has also given guidance for cases where an organization’s services are mainly performed over the internet. In such cases, it recommends following a few requirements in order to make sure the services can be delivered safely and smoothly:

  • Using web protocols that guarantee confidentiality and authentication (such as HTTPS and SFTP), and keeping them up to date,
  • Two-factor authentication,
  • No access to the interfaces of non-secure servers,
  • Reviewing the logs of access to remotely accessible services in order to detect suspicious behavior (a minimal sketch of such a review follows this list),
  • Ensuring that the equipment used has the latest security patches applied.
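
The log-review point can be illustrated with a small, hypothetical sketch. The log location, its line format and the threshold below are assumptions made for the example; the idea is simply to flag IP addresses with an unusual number of failed login attempts on a remotely accessible service.

```typescript
// Hypothetical sketch: scan an access log for repeated failed logins per IP.
// The log path and line format ("<ip> <status> <path>") are illustrative only.
import { readFileSync } from "node:fs";

const LOG_FILE = "/var/log/remote-access.log"; // assumed location
const FAILED_LOGIN_THRESHOLD = 10;             // assumed threshold

const failuresByIp = new Map<string, number>();

for (const line of readFileSync(LOG_FILE, "utf8").split("\n")) {
  const [ip, status, path] = line.trim().split(/\s+/);
  if (!ip || !status) continue;
  // Treat HTTP 401/403 responses on the login endpoint as failed login attempts.
  if ((status === "401" || status === "403") && path === "/login") {
    failuresByIp.set(ip, (failuresByIp.get(ip) ?? 0) + 1);
  }
}

for (const [ip, count] of failuresByIp) {
  if (count >= FAILED_LOGIN_THRESHOLD) {
    console.warn(`Suspicious behavior: ${count} failed logins from ${ip}`);
  }
}
```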

The CNIL also offered some best practices for employees to follow in cases of working remotely, to give both sides pointers on how to deal with the changing situation.

Specifically, employees are advised to secure their Wi-Fi by using encryption such as WPA2 or WPA3, along with a strong password. In addition, the CNIL recommends using work equipment provided by the employer, as well as a VPN provided by the company. Where employees use their own devices, a firewall and anti-virus software are necessary to secure the equipment, along with keeping the operating system and software up to date with the latest patches.

Lastly, the CNIL warns of increased phishing attempts in relation to the COVID-19 outbreak.

Overall, the guidance and best practices the CNIL has published indicate a need for continuous and active vigilance with regard to teleworking, as well as the sharing of personal data in the process.

This guidance is in line with our past assessment of the remote working situation, which you are welcome to check out in the respective blogpost in our Series on Data Protection and Corona.

CNIL announces focus for Control Procedures in 2020

16. March 2020

The French Commission Nationale de l’Informatique et des Libertés (CNIL) has announced its focus with regard to the Control Procedures it intends to carry out in 2020.

Of the 300 Control Procedures carried out in a year, at least 50 in 2020 will focus on three priority themes: health data security, geolocation and cookie compliance. The CNIL decided to prioritize these areas because of their high relevance to the daily life of French citizens.

Especially with regard to health data, because of the sensitive nature of the data collected, and to geolocation data, due to the constant stream of new transportation solutions and enhancements to daily life, it is important to keep an eye on the scope of the data processing and the private sphere it affects.

Regarding cookies and other tracers, the CNIL continues to underline their importance for profiled advertising. On top of the planned Control Procedures, the CNIL intends to publish a recommendation on cookies in the spring of 2020. It will monitor the implementation of this recommendation and give companies a six-month period to adjust and implement it.

The CNIL also stated that it will continue to work together with other national Data Protection Authorities in order to ensure the regulation of transnational data processing.

CNIL publishes recommendations on how to get users’ cookie consent

21. January 2020

On 14 January 2020, the French data protection authority (“CNIL”) published recommendations on practical modalities for obtaining the consent of users to store or read non-essential cookies and similar technologies on their devices. In addition, the CNIL also published a series of questions and answers on the recommendations.

The purpose of the recommendations is to help private and public organisations to implement the CNIL guidelines on cookies and similar technologies dated 4 July 2019. To this end, CNIL describes the practical arrangements for obtaining users’ consent, gives concrete examples of the user interface to obtain consent and presents “best practices” that also go beyond the rules.

In order to find pragmatic and privacy-friendly solutions, the CNIL consulted in advance with organisations representing industries in the ad tech ecosystem as well as civil society organisations and discussed the issue with them. The recommendations are neither binding nor prescriptive, nor are they exhaustive. Organisations may use other methods to obtain user consent, as long as these methods are in accordance with the guidelines.

Among the most important recommendations are:

Information about the purpose of cookies
First, the purposes of the cookies should be listed. The recommendations contain examples of this brief description for the following purposes or types of cookies:
(1) targeted or personalised advertising;
(2) non-personalized advertising;
(3) personalised advertising based on precise geolocation;
(4) customization of content or products and services provided by the Web Publisher;
(5) social media sharing;
(6) audience measurement/analysis.
In addition, the list of purposes should be complemented by a more detailed description of these purposes, which should be directly accessible, e.g. via a drop-down button or hyperlink.
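
As a rough illustration of this two-level structure (a short label in the banner, a detailed description behind a drop-down or link), the following sketch models the purpose list as data. The identifiers and wording are assumptions made for the example, not CNIL text.

```typescript
// Illustrative sketch only: modelling the recommended two-level purpose information.
interface CookiePurpose {
  id: string;
  shortDescription: string;    // shown directly in the consent banner
  detailedDescription: string; // reachable via a drop-down button or hyperlink
}

const purposes: CookiePurpose[] = [
  {
    id: "targeted-advertising",
    shortDescription: "Targeted or personalised advertising",
    detailedDescription:
      "Cookies used to show advertising tailored to your browsing behaviour.",
  },
  {
    id: "audience-measurement",
    shortDescription: "Audience measurement / analysis",
    detailedDescription:
      "Cookies used to measure how the site is used and to improve its content.",
  },
];
```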

Information on the data controllers
An exhaustive list of data controllers should be directly accessible, e.g. via a drop-down button or hyperlink. When users click on this hyperlink or button, they should receive specific information on data controllers (name and link to their privacy policy). However, web publishers do not have to list all third parties that use cookies on their website or application, but only those who are also data controllers. Therefore, the role of the parties (data controller, joint data controller, or data processor) has to be assessed individually for each cookie. This list should be regularly updated and should be permanently accessible (e.g. through the cookie consent mechanism, which would be available via a static icon or hyperlink at the bottom of each web page). Should a “substantial” addition be made to the list of data controllers, users’ consent should be sought again.

Real choice between accepting or rejecting cookies
Users must be offered a real choice between accepting or rejecting cookies. This can be done by means of two (not pre-ticked) checkboxes or buttons (“accept” / “reject”, “allow” / “deny”, etc.) or equivalent elements such as “on”/”off” sliders, which should be disabled by default. These checkboxes, buttons or sliders should have the same format and be presented at the same level. Users should have such a choice for each type or category of cookie.

The ability for users to delay this selection
A “cross” button should be included so that users can close the consent interface without having to make a choice. If the user closes the interface, no cookies requiring consent should be set. However, consent may be requested again later, until the user makes a choice and accepts or rejects cookies.

Overall consent for multiple sites
It is acceptable to obtain user consent for a group of sites rather than individually for each site. However, this requires that users are informed of the exact scope of their consent (i.e., by providing them with a list of sites to which their consent applies) and that they have the ability to refuse all cookies on those sites altogether (e.g., if there is a “refuse all” button along with an “accept all” button). To this end, the examples given in the recommendations include three buttons: “Personalize My Choice” (where users can make a more precise selection based on the purpose or type of cookies), “Reject All” and “Accept All”.

Duration of validity of the consent
It is recommended that users be asked to renew their consent at regular intervals. The CNIL considers a period of six months to be appropriate.

Proof of consent
Data controllers should be able to provide individual proof of users’ consent and to demonstrate that their consent mechanism allows a valid consent to be obtained.
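
A minimal sketch of what such a consent record could look like is given below, assuming a simple server-side store. The field names and the exact six-month calculation are assumptions; the point is that the record captures per-purpose choices (off by default), when and through which interface consent was given, and whether it is still within the recommended six-month validity period.

```typescript
// Illustrative sketch only: a stored consent record serving as proof of consent.
interface ConsentRecord {
  userId: string;
  choices: Record<string, boolean>; // purpose id -> accepted? (false by default)
  givenAt: Date;                    // when the choice was made
  bannerVersion: string;            // which consent interface the user actually saw
}

const SIX_MONTHS_MS = 182 * 24 * 60 * 60 * 1000; // approximate six-month validity

function isConsentStillValid(record: ConsentRecord, now = new Date()): boolean {
  return now.getTime() - record.givenAt.getTime() < SIX_MONTHS_MS;
}

// Example: the user accepted only audience measurement; everything else stays off.
const record: ConsentRecord = {
  userId: "user-123",
  choices: { "audience-measurement": true, "targeted-advertising": false },
  givenAt: new Date(),
  bannerVersion: "2020-01-consent-ui",
};

console.log(isConsentStillValid(record)); // true until roughly six months have passed
```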

The recommendations are open for public consultation until 25 February 2020. A new version of the recommendations will then be submitted to the members of CNIL for adoption during a plenary session. CNIL will carry out enforcement inspections six months after the adoption of the recommendations. The final recommendations may also be updated and completed over time to take account of new technological developments and the responses to the questions raised by professionals and individuals on this subject.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such technology in the public sector, the CNIL made it clear that the use of facial recognition has vast political and societal implications and risks. In its report, the CNIL explicitly stated that the software can yield biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chance of unlawful use of the technology, the CNIL set out three main requirements in its report. It recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, the obligation to inform, and transparency about the use and purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines for the future use of facial recognition, it has also made clear that it will fulfill its role by providing support on issues that may arise, giving counsel on the legal and methodological use of facial recognition in an experimental stage.


CNIL updates its FAQs for the case of a No-Deal Brexit

24. September 2019

The French data protection authority “CNIL” has updated its existing catalogue of questions and answers (“FAQs”) to inform about the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on November 1st, 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, data transfer mechanisms will become necessary for transferring personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

The CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect by November 1st. If controllers choose a derogation permitted under the GDPR, the CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.
