
CNIL fines Google and Amazon

10. December 2020

The French Data Protection Authority Commission Nationale de l’Informatique et des Libertés – “CNIL” – announced that it has fined the big tech companies Google and Amazon for violations of the GDPR and the French Data Protection Act.

Regarding Google, CNIL announced financial penalties amounting to a combined, record-breaking € 100 million: € 60 million against Google LLC, the US-based parent company, and € 40 million against Google Ireland Limited, its Irish subsidiary. According to CNIL's statement, the fines are based on violations of the cookie requirements on the website google.fr. Following an online investigation conducted on March 16th, 2020, CNIL considers it proven that Google “placed advertising cookies on the computers of users of the search engine google.fr, without obtaining prior consent and without providing adequate information”.

Besides the findings on cookies, CNIL also criticizes a lack of information about the personal data processed and a partial failure of the opposition mechanism.

CNIL justifies the high financial penalties by the seriousness of the violations, the large number of data subjects concerned and the significant profits the companies derive from advertising.

CNIL also took into account that this practice has no longer been in place since an update in September 2020. However, the newly implemented banner still does not allow users to understand the purposes for which the cookies are used and does not inform them that they can refuse the cookies.

This is already the second financial penalty CNIL has imposed on Google.

CNIL also fined Amazon Europe Core € 35 million for violations in connection with cookies. The accusation is the same as with Google and is based on several investigations conducted between December 12th, 2019 and May 19th, 2020. CNIL found that when a user visited the website, cookies were automatically placed on his or her computer without any action required on the user's part. Several of these cookies were used for advertising purposes. CNIL also found a lack of information provided to users.

As with Google, the amount of the penalty is justified by the seriousness of the violation, the large number of data subjects concerned and the significant profits derived from advertising.

First judicial application of Schrems II in France

20. October 2020

On October 13th, 2020, France's highest administrative court (Conseil d'État) issued a summary judgment rejecting a request to suspend France's centralized health data platform, the Health Data Hub (HDH). The Conseil d'État nevertheless recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France's HDH is a platform intended to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure. On April 15th, 2020, the HDH signed a contract with Microsoft's Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d'État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. Their concern was that hosting the data with a company subject to U.S. law entails data protection risks, given the potential surveillance under U.S. national surveillance laws highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments on the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could be subject, in theory, to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d'État agreed with the CNIL that it cannot be totally ruled out that U.S. public authorities could request Microsoft and its Irish affiliate to access some of the data held in the HDH. However, the summary proceedings judge did not consider that the CJEU's ruling in the Schrems II case also requires examining the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates acting as data processors. EU law does not prohibit entrusting the processing of personal data in the EU to U.S. companies as subcontractors. In addition, the Conseil d'État considered that the alleged violation of the GDPR in this case was purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH, and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The Conseil d'État concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

France's highest administrative court, the Conseil d'État, restricts the CNIL's Cookie Guidelines

22. June 2020

On June 19th, 2020, the French Conseil d'État ordered the Commission Nationale de l'Informatique et des Libertés (CNIL) in a court decision to withdraw particular provisions of its Guidelines on cookies and other tracers, published in 2019.

The Conseil d'État had received several complaints from businesses and professional associations, which turned to the court in order to have the CNIL's Guidelines overturned.

The main focus of the decision was the ban on cookie walls. Cookie walls are cookie consent pages which deny users access to a website if they decline consent to the cookies used on it. In its 2019 Guidelines on cookies and other tracers, the CNIL had declared that such cookie walls were not in accordance with the principles of the General Data Protection Regulation (GDPR), prompting many businesses to challenge this provision before the Conseil d'État.

In its decision, the Conseil d'État held that the CNIL, whose competence in this area is merely advisory and recommendatory, was not entitled to issue a ban on cookie walls in the Guidelines. The court stressed that the CNIL's competence is only recommendatory and does not extend to laying down such a binding provision.

However, the court did not address in its decision whether a ban on cookie walls would in itself be lawful. The Conseil d'État refrained from any substantive statement on the matter, leaving that question unanswered for the moment.

The Conseil d'État further stated in its decision that, for data subjects to give free and informed consent to processing activities, they must indeed be informed individually about each processing activity and its purpose before consenting. However, businesses retain the discretion to decide whether they collect the data subject's consent through a single, global consent accompanied by specifically individualized privacy policies, or through individual consent for each processing activity.

In the rest of its decision, the Conseil d'État confirmed the remainder of the CNIL's guidelines and provisions on the matter as lawful and applicable, giving the complainants only limited reason to celebrate.

CNIL publishes new Guidance on Teleworking

14. April 2020

The French Data Protection Authority (CNIL) released guidance on teleworking on April 1st, intended to help employers manage the new working situation. In particular, it aims to clarify the IT requirements for a secure and well-functioning remote working environment.

In particular, the guidance addresses the following points as a basis for handling teleworking from an employer's perspective:

  • It is recommended that employers formulate an IT charter or internal policy on the use of the teleworking systems, to be followed by employees,
  • Necessary measures have to be taken if the systems have to be changed or adapted to the new situation,
  • Employee workstations should meet minimum requirements: a firewall, anti-virus software and a tool blocking access to malicious websites,
  • To avoid exposure on the internet and to ensure security, the use of a VPN is recommended.

Furthermore, the CNIL has also given guidance for cases where an organization's services are mainly delivered over the internet. In such cases, it recommends the following measures to ensure that the services can be delivered securely and smoothly:

  • Using web protocols that guarantee the confidentiality and authentication of the processes (such as HTTPS and SFTP), and keeping them up to date,
  • Two-factor authentication,
  • No access to the interfaces of non-secure servers,
  • Reviewing the logs of access to remotely accessible services to detect suspicious behavior (see the sketch after this list),
  • Ensuring that the equipment in use has the latest security patches applied.
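
The log review point lends itself to a small amount of automation. Below is a minimal sketch in TypeScript that counts failed logins per IP address in a plain-text access log and flags conspicuous sources; the file path, line format and threshold are illustrative assumptions and not part of the CNIL guidance.

```typescript
// Minimal sketch: flag IP addresses with repeated failed logins in an access log.
// The log path and the line format are assumptions for illustration only.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const LOG_PATH = "/var/log/remote-access.log"; // hypothetical log file
const FAILED_LOGIN = /login failed .* from (\d{1,3}(?:\.\d{1,3}){3})/i; // assumed format
const THRESHOLD = 10; // number of failures above which an alert is printed

async function reviewLog(): Promise<void> {
  const failuresPerIp = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(LOG_PATH) });

  for await (const line of lines) {
    const match = FAILED_LOGIN.exec(line);
    if (match) {
      const ip = match[1];
      failuresPerIp.set(ip, (failuresPerIp.get(ip) ?? 0) + 1);
    }
  }

  for (const [ip, count] of failuresPerIp) {
    if (count >= THRESHOLD) {
      console.warn(`Suspicious: ${count} failed logins from ${ip}`);
    }
  }
}

reviewLog().catch((err) => console.error("Log review failed:", err));
```

In practice such a review would be run regularly and fed into whatever alerting the organization already uses; the point here is only to illustrate what "reviewing access logs" can mean concretely.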

The CNIL also offered some best practices for employees to follow in cases of working remotely, to give both sides pointers on how to deal with the changing situation.

Specifically, employees are recommended to ensure their Wi-Fi is secure by using encryption such as WPA2 or WPA3, along with a secure password. In addition, the CNIL recommends using work equipment provided by the employer, as well as a VPN provided by the company. When personal devices are used, a firewall and anti-virus software are the minimum requirements to ensure the security of the equipment, along with keeping the operating system and software up to date with the latest patches.

Lastly, the CNIL warns of increased phishing attempts in relation to the COVID-19 outbreak.

Overall, the guidance and best practices the CNIL has published indicate a need for continuous and active vigilance with regard to teleworking, as well as to the sharing of personal data in the process.

This guidance is in line with our past assessment of the remote working situation, which you are welcome to check out in the respective blogpost in our Series on Data Protection and Corona.

CNIL announces focus for Control Procedures in 2020

16. March 2020

The French Commission Nationale de l'Informatique et des Libertés (CNIL) has announced its focus for the control procedures it intends to carry out in 2020.

Of the roughly 300 control procedures carried out per year, at least 50 in 2020 will focus on three priority themes: health data security, geolocation and cookie compliance. The CNIL decided to prioritize these areas because of their high relevance to the daily lives of French citizens.

Health data warrants particular attention because of the sensitive nature of the data collected, and geolocation data because of the constant stream of new transportation solutions and enhancements to daily life; in both cases it is important to keep an eye on the scope of the data processing and the private sphere it affects.

Regarding cookies and other tracers, CNIL continues to underline their importance for profiled advertising. In addition to the planned control procedures, the CNIL intends to publish a recommendation on cookies in the spring of 2020. It will monitor the implementation of the recommendation and give companies a six-month period to adjust and comply.

The CNIL also stated that it will continue to work together with other national Data Protection Authorities in order to ensure the regulation of transnational data processing.

CNIL publishes recommendations on how to get users’ cookie consent

21. January 2020

On 14 January 2020, the French data protection authority (“CNIL”) published recommendations on practical modalities for obtaining the consent of users to store or read non-essential cookies and similar technologies on their devices. In addition, the CNIL also published a series of questions and answers on the recommendations.

The purpose of the recommendations is to help private and public organisations to implement the CNIL guidelines on cookies and similar technologies dated 4 July 2019. To this end, CNIL describes the practical arrangements for obtaining users’ consent, gives concrete examples of the user interface to obtain consent and presents “best practices” that also go beyond the rules.

In order to find pragmatic and privacy-friendly solutions, CNIL consulted in advance with organisations representing industries in the ad tech ecosystem as well as civil society organisations and discussed the issue with them. The recommendations are neither binding nor prescriptive, nor are they exhaustive. Organisations may use other methods to obtain user consent, as long as these methods are in accordance with the guidelines.

Among the most important recommendations are:

Information about the purpose of cookies
First, the purposes of the cookies should be listed. The recommendations contain examples of this brief description for the following purposes or types of cookies:
(1) targeted or personalised advertising;
(2) non-personalized advertising;
(3) personalised advertising based on precise geolocation;
(4) customization of content or products and services provided by the Web Publisher;
(5) social media sharing;
(6) audience measurement/analysis.
In addition, the list of purposes should be complemented by a more detailed description of these purposes, which should be directly accessible, e.g. via a drop-down button or hyperlink.
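
To illustrate how such a purpose list might be modelled in a consent banner, here is a minimal TypeScript sketch; the type, field names and descriptions are hypothetical and are not taken from the CNIL's examples.

```typescript
// Minimal sketch of a purpose list for a consent banner.
// Names and descriptions are illustrative assumptions, not CNIL wording.
interface CookiePurpose {
  id: string;                   // machine-readable identifier
  shortDescription: string;     // brief text shown in the first layer of the banner
  detailedDescription: string;  // longer text reachable via a drop-down or hyperlink
}

const purposes: CookiePurpose[] = [
  {
    id: "targeted-advertising",
    shortDescription: "Targeted or personalised advertising",
    detailedDescription:
      "Cookies used to show advertising tailored to your browsing behaviour.",
  },
  {
    id: "audience-measurement",
    shortDescription: "Audience measurement / analytics",
    detailedDescription:
      "Cookies used to measure how the site is used and to produce statistics.",
  },
  // The remaining purposes (non-personalised advertising, geolocated advertising,
  // content customization, social media sharing) would follow the same pattern.
];

// The banner shows every shortDescription up front and reveals the
// detailedDescription only when the user expands the corresponding entry.
console.log(purposes.map((p) => p.shortDescription));
```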

Information on the data controllers
An exhaustive list of data controllers should be directly accessible, e.g. via a drop-down button or hyperlink. When users click on this hyperlink or button, they should receive specific information on data controllers (name and link to their privacy policy). However, web publishers do not have to list all third parties that use cookies on their website or application, but only those who are also data controllers. Therefore, the role of the parties (data controller, joint data controller, or data processor) has to be assessed individually for each cookie. This list should be regularly updated and should be permanently accessible (e.g. through the cookie consent mechanism, which would be available via a static icon or hyperlink at the bottom of each web page). Should a “substantial” addition be made to the list of data controllers, users’ consent should be sought again.
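
As a sketch of how a publisher might keep track of this list and detect when renewed consent is needed, consider the following; treating any newly added controller as a "substantial" change is an assumption made here for illustration, since the recommendations leave the term open.

```typescript
// Minimal sketch: track the data controllers behind cookies and decide whether
// consent must be collected again after the list changes.
// Treating any newly added controller as "substantial" is an assumption.
interface DataController {
  name: string;
  privacyPolicyUrl: string;
}

function needsRenewedConsent(
  listAtConsentTime: DataController[],
  currentList: DataController[],
): boolean {
  const known = new Set(listAtConsentTime.map((c) => c.name));
  return currentList.some((c) => !known.has(c.name));
}

// Usage: compare the list the user consented to with today's list.
const consentedTo: DataController[] = [
  { name: "ExampleAds", privacyPolicyUrl: "https://exampleads.example/privacy" },
];
const today: DataController[] = [
  { name: "ExampleAds", privacyPolicyUrl: "https://exampleads.example/privacy" },
  { name: "NewVendor", privacyPolicyUrl: "https://newvendor.example/privacy" },
];

console.log(needsRenewedConsent(consentedTo, today)); // true → show the consent banner again
```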

Real choice between accepting or rejecting cookies
Users must be offered a real choice between accepting or rejecting cookies. This can be done by means of two (not pre-ticked) checkboxes or buttons (“accept” / “reject”, “allow” / “deny”, etc.) or equivalent elements such as “on”/”off” sliders, which should be disabled by default. These checkboxes, buttons or sliders should have the same format and be presented at the same level. Users should have such a choice for each type or category of cookie.

The ability for users to delay this selection
A “cross” (close) button should be included so that users can close the consent interface without having to make a choice. If the user closes the interface, no cookies requiring consent should be set. Consent may, however, be requested again later, until the user makes a choice and accepts or rejects cookies.
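
The sketch below illustrates one way to model this per-purpose, three-state logic (accepted, refused, or not yet decided); the function and purpose names are hypothetical and do not come from the recommendations.

```typescript
// Minimal sketch of per-purpose consent state: accepted, refused, or undecided.
// "Undecided" (e.g. after closing the banner with the cross button) must never
// result in a consent-requiring cookie being set.
type Choice = "accepted" | "refused" | "undecided";

// Nothing is pre-ticked: every purpose starts as "undecided".
const choices = new Map<string, Choice>([
  ["targeted-advertising", "undecided"],
  ["audience-measurement", "undecided"],
]);

function recordChoice(purposeId: string, choice: Choice): void {
  // The "accept" and "reject" buttons call the same function and carry the
  // same weight in the interface.
  choices.set(purposeId, choice);
}

function mayDropCookies(purposeId: string): boolean {
  // Cookies for a purpose may only be placed after an explicit "accepted".
  return choices.get(purposeId) === "accepted";
}

recordChoice("audience-measurement", "refused");
console.log(mayDropCookies("targeted-advertising")); // false – still undecided
console.log(mayDropCookies("audience-measurement")); // false – explicitly refused
```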

Overall consent for multiple sites
It is acceptable to obtain user consent for a group of sites rather than individually for each site. However, this requires that users are informed of the exact scope of their consent (i.e., by providing them with a list of sites to which their consent applies) and that they have the ability to refuse all cookies on those sites altogether (e.g., if there is a “refuse all” button along with an “accept all” button). To this end, the examples given in the recommendations include three buttons: “Personalize My Choice” (where users can make a more precise selection based on the purpose or type of cookies), “Reject All” and “Accept All”.

Duration of validity of the consent
It is recommended that users be asked to renew their consent at regular intervals. CNIL considers a period of six months to be appropriate.

Proof of consent
Data controllers should be able to provide individual proof of users’ consent and to demonstrate that their consent mechanism allows a valid consent to be obtained.
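
A minimal sketch of a consent record that covers both points, serving as proof of consent and encoding the recommended six-month validity, is shown below; the field names and the chosen storage format are assumptions for illustration.

```typescript
// Minimal sketch: a stored consent record that can serve as proof of consent
// and that encodes the recommended six-month validity period.
// Field names and the ~182-day approximation are illustrative assumptions.
interface ConsentRecord {
  userId: string;                    // or a pseudonymous identifier
  timestamp: string;                 // ISO 8601 date and time of the consent
  choices: Record<string, boolean>;  // purpose id -> accepted (true) / refused (false)
  bannerVersion: string;             // which consent interface was shown
}

const SIX_MONTHS_MS = 1000 * 60 * 60 * 24 * 182; // roughly six months

function isStillValid(record: ConsentRecord, now: Date = new Date()): boolean {
  const age = now.getTime() - new Date(record.timestamp).getTime();
  return age < SIX_MONTHS_MS;
}

const record: ConsentRecord = {
  userId: "pseudo-1234",
  timestamp: "2020-01-20T10:00:00Z",
  choices: { "targeted-advertising": false, "audience-measurement": true },
  bannerVersion: "2020-01",
};

// If the record has expired, the consent interface should be shown again.
console.log(isStillValid(record, new Date("2020-08-01T00:00:00Z"))); // false
```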

The recommendations are open for public consultation until 25 February 2020. A new version of the recommendations will then be submitted to the members of CNIL for adoption during a plenary session. CNIL will carry out enforcement inspections six months after the adoption of the recommendations. The final recommendations may also be updated and supplemented over time to take account of new technological developments and the responses to the questions raised by professionals and individuals on this subject.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, Commission Nationale de l'Informatique et des Libertés (CNIL), has released a report concerning the experimental use of facial recognition software by French public authorities.

Particularly concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition has vast political and societal implications and risks. In its report, the CNIL explicitly stated that the software can yield heavily biased results, since the algorithms are not 100% reliable, and the rate of false positives can vary depending on the gender and ethnicity of the individuals who are recorded.

To minimize the chances of unlawful use of the technology, the CNIL set out three main requirements in its report and recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive method applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This includes the necessity of consent for each device used, data subjects' control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks vary depending on how the software is used and can differ between controllers.

While the CNIL wishes to draw red lines for the future use of facial recognition, it has also made clear that it will fulfil its role by providing support on issues that may arise, offering counsel on the legal and methodological aspects of using facial recognition at an experimental stage.


CNIL updates its FAQs for case of a No-Deal Brexit

24. September 2019

The French data protection authority "CNIL" updated its existing catalogue of questions and answers ("FAQs") to inform about the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on November 1st, 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation ("GDPR"). For this reason, after the exit, data transfer mechanisms will be necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on November 1st. If controllers choose a derogation permitted under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL as well as the British data protection authority ICO have revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be applicable to any technology that stores or accesses information on a user's device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll the website cannot be considered consent. Likewise, obtaining consent through T&Cs is not lawful. This practice violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.

With regard to the territorial scope, CNIL clarifies that the cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO guideline does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by means of a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; to that extent, the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary, namely where the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, by contrast, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period only starts to run upon publication of a statement by the CNIL, which is still pending. CNIL expects this statement to be finalised during the first quarter of 2020. The ICO does not provide for such a time limit.

CNIL fines French insurance company

26. July 2019

The French Data Protection Authority (CNIL) imposed a € 180,000 fine on a French insurance company for failing to adequately secure customer data on its website.

Active Assurance is an insurance intermediary and distributor of motor insurance to consumers. On its website, customers can request offers, subscribe to contracts and access their personal space.

In 2018, CNIL received a complaint from an Active Assurance customer who said that he had been able to access other users' data. The other accounts were accessible via hypertext links indexed by a search engine. Customers' documents were also available by slightly changing the URL. Among those records were drivers' licences, bank statements and documents revealing whether someone had been subject to a licence withdrawal or involved in a hit-and-run.

CNIL informed the company about the violations and, a few days later, the company stated that measures had been taken to rectify the infringements. After an on-site audit at the company's premises, CNIL found that the measures taken were not sufficient and that Active Assurance had violated Art. 32 GDPR. Active Assurance should have ensured that only authorized persons had access to the documents. The company should also have instructed its customers to use strong passwords, and it should not have sent them the passwords in plain text by e-mail.
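
The underlying flaw is commonly known as an insecure direct object reference: documents were addressed by guessable URLs and served without checking who requested them. The following TypeScript sketch shows, under assumed types and data, the kind of ownership check that was missing; it is not based on Active Assurance's actual code.

```typescript
// Minimal sketch of the missing check: before serving a customer document,
// verify that the authenticated requester actually owns it.
// The types and the in-memory "database" are illustrative assumptions.
interface CustomerDocument {
  id: string;
  ownerId: string;  // the customer the document belongs to
  path: string;     // storage location of the file
}

const documents = new Map<string, CustomerDocument>([
  ["doc-1001", { id: "doc-1001", ownerId: "customer-42", path: "/secure/doc-1001.pdf" }],
]);

// Returns the document only if the requester owns it; otherwise null, and the
// web layer should answer with 403/404 instead of serving the file. Guessing
// or incrementing document IDs in the URL then no longer exposes other
// customers' files.
function getDocumentForUser(requesterId: string, documentId: string): CustomerDocument | null {
  const doc = documents.get(documentId);
  if (!doc || doc.ownerId !== requesterId) {
    return null;
  }
  return doc;
}

console.log(getDocumentForUser("customer-42", "doc-1001")?.id); // "doc-1001"
console.log(getDocumentForUser("customer-7", "doc-1001"));      // null – not the owner
```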

Based on the seriousness of the breach and the number of people involved, CNIL imposed a fine of € 180,000.
