Tag: CNIL

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released a report with guidance on the experimental use of facial recognition software by French public authorities.

Particularly concerned with the risks of using such technology in the public sector, the CNIL made clear that the use of facial recognition carries vast political as well as societal risks. In its report, the CNIL explicitly stated that the software can yield very biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of an unlawful use of the technology, the CNIL set out three main requirements in its report and recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive method applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on the way the software is used.

While the CNIL wishes to draw red lines for the use of facial recognition in the future, it has also made clear that it will fulfil its role by providing support on issues that may arise and by giving counsel on the legal and methodological aspects of using facial recognition in an experimental stage.

Category: EU · French DPA · GDPR · General

CNIL updates its FAQs for the case of a no-deal Brexit

24. September 2019

The French data protection authority CNIL updated its existing catalogue of questions and answers (“FAQs”) to explain the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1st of November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, data transfer mechanisms become necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect as of 1 November 2019. If controllers choose one of the derogations permitted under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL as well as the British data protection authority ICO have revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both the French and the UK authorities consider that the rules applying to cookies also apply to any other technology that stores or accesses information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll through a website cannot be considered consent. Likewise, obtaining consent through the T&Cs is not lawful. This procedure violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
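
To illustrate the consent-before-placement rule described above, here is a minimal sketch of a server that only sets a non-essential cookie after an explicit opt-in. It is a hypothetical example (the route names, cookie names and the Flask framework are assumptions, not taken from either authority’s guidelines):

# Hypothetical sketch: non-essential cookies are set only after explicit consent.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<html><body>Example page</body></html>")
    # Analytics/advertising cookies are placed only if the user has already
    # opted in through an explicit action recorded in a consent cookie;
    # scrolling or simply continuing to browse never sets this flag.
    if request.cookies.get("analytics_consent") == "granted":
        resp.set_cookie("analytics_id", "abc123", max_age=60 * 60 * 24 * 30)
    return resp

@app.route("/consent", methods=["POST"])
def give_consent():
    # Called only when the user actively accepts analytics cookies,
    # e.g. by clicking an "Accept" button in a consent banner.
    resp = make_response("", 204)
    resp.set_cookie("analytics_consent", "granted", max_age=60 * 60 * 24 * 180)
    return resp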

With regard to the territorial scope, CNIL clarifies that the cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO guideline does not comment on this point.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, is of the opinion that consent forced by means of a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; in this respect the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary, namely where the cookies meet a list of cumulative requirements defined by CNIL. The ICO, on the other hand, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only start to run upon the publication of a further statement by the CNIL, which is still pending. CNIL expects this statement to be finalised during the first quarter of 2020. The ICO does not foresee such a time limit.

CNIL fines French insurance company

26. July 2019

The French Data Protection Authority (CNIL) imposed a €180,000 fine on a French insurance company for failing to secure customer data on its website.

Active Assurance is an insurance intermediary and distributor of motor insurance. On its website, customers can request offers, subscribe to contracts and access their personal space.

In 2018, CNIL received a complaint from an Active Assurance customer, saying that he had been able to access other users’ data. The other accounts were accessible via hypertext links indexed by a search engine. Customers’ documents were also available by slightly changing the URL. Among those records were drivers’ licences, bank statements and documents revealing whether someone had been subject to a licence withdrawal or involved in a hit-and-run.

CNIL informed the company about the violations, and a few days later the company stated that measures had been taken to rectify the infringements. After an on-site audit at the company’s premises, CNIL found that the measures taken were not sufficient and that Active Assurance had violated Art. 32 GDPR. Active Assurance should have ensured that only authorized persons had access to the documents. The company should also have instructed its customers to use strong passwords, and it should not have sent them the passwords in plain text by e-mail.
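
As a rough illustration of the access-control failure described above (documents reachable by editing the URL, without any check), the sketch below shows the kind of server-side ownership check that Art. 32 GDPR points towards. The routes, session handling and data layout are hypothetical assumptions, not details from the CNIL decision:

# Hypothetical sketch: every document request is authenticated and the
# requested document must belong to the logged-in customer, so guessing
# or editing the URL does not expose other customers' files.
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "change-me"  # placeholder; use a strong random key in practice

# Illustrative document metadata: document id -> owner and storage path
DOCUMENTS = {
    "doc-1001": {"owner_id": "cust-42", "path": "/srv/docs/doc-1001.pdf"},
}

@app.route("/documents/<doc_id>")
def get_document(doc_id):
    customer_id = session.get("customer_id")
    if customer_id is None:
        abort(401)  # no authenticated session at all
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner_id"] != customer_id:
        abort(404)  # do not reveal other customers' documents
    return send_file(doc["path"])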

Based on the seriousness of the breach and the number of people involved, CNIL imposed a fine of €180,000.

CNIL publishes action plan on targeted online advertising

3. July 2019

On 29th June, the French data protection authority CNIL published its 2019-2020 action plan, which aims to set rules for targeted online advertising and guide companies in their compliance efforts.

The Action Plan consists of two main steps. First, new cookie guidelines will be published in July 2019. The previous cookie guidance dates back to 2013; CNIL has stated that it is no longer valid and will be repealed due to the stricter consent requirements of the GDPR. In order to comply with the new cookie guidelines, companies will be given a transitional period of 12 months. During this period, continuing to browse a website may still be treated as consent to the use of cookies. However, CNIL requires that, even during this transition period, cookies are set only after consent has been obtained.

As a second major step, working groups composed of CNIL officials and stakeholders from the adtech ecosystem will be formed to develop practical approaches to obtain consent. The draft recommendations developed on the basis of this discussion will be published by CNIL at the end of 2019 or at the latest at the beginning of 2020 in order to make them available for public consultation. CNIL will then implement the final version of the recommendations after a period of six months.

The reason for preparing the Action Plan was that CNIL received numerous complaints about online marketing practices from individuals, non-profit organisations and associations. In 2018, 21% of complaints related to these issues. At the same time, CNIL received numerous questions from industry professionals trying to better understand their GDPR obligations.

CNIL fines translation company for violating the French Data Protection Act

19. June 2019

The French Data Protection Authority (CNIL) recently fined UNIONTRAD COMPANY €20,000 for excessive video surveillance of employees.

UNIONTRAD COMPANY is a small French translation company with nine employees. Between 2013 and 2017, several employees complained that they were being filmed at their workspaces. The CNIL twice alerted the company to the rules for installing cameras in the workplace, in particular that employees should not be filmed continuously and that information about the cameras present must be provided.

During an audit carried out at the company’s premises in February 2018, the CNIL discovered, among other things, that the camera in the office of six translators filmed them constantly, that insufficient information about the cameras had been provided, and that the computer workstations were not password-protected.

In July 2018, the President of the CNIL issued a formal notice to the company, asking it, inter alia, to reposition the camera so that employees are no longer filmed constantly, to inform the employees about the cameras, and to implement appropriate security measures for access to the computer workstations.

A second audit in October 2018 showed that the company had not taken any action to remedy the violations. The CNIL therefore imposed a fine of €20,000, taking into account the size and financial situation of the company.

CNIL fines French real estate company for violating the GDPR

7. June 2019

The French Data Protection Authority “Commission Nationale de l’Informatique et des Libertés” (CNIL) imposed a €400,000 fine on the French real estate company “Sergic” for violating the GDPR.

Sergic is specialized in real estate development, purchase, sale, rental and property management and operates the website www.sergic.com, which allows rental candidates to upload the documents needed to prepare their application file.

In August 2018, a Sergic user contacted the CNIL, reporting that, from his personal space on the website, he had been able to access other users’ uploaded files by slightly changing the URL. On September 7, 2018, an online check revealed that rental candidates’ uploaded documents were indeed freely accessible to others without prior authentication. Among the documents were copies of identity cards, health cards, tax notices and divorce judgements. CNIL informed Sergic of this security incident and personal data breach on the same day. It became apparent that Sergic had been aware of the issue since March 2018 and, even though it had initiated IT developments to correct it, the final correction did not take place until September 17, 2018.

Based on the investigation, the responsible CNIL body found two violations of the GDPR. Firstly, Sergic had failed to fulfil its obligations under Art. 32 GDPR, which obliges controllers to implement appropriate technical and organizational measures to ensure a secure level of protection of the personal data. This includes, for example, a procedure to ensure that personal documents cannot be accessed without prior authentication of the user. The time the company took to correct the error was also taken into account.

Secondly, the CNIL found that Sergic had kept all the documents submitted by candidates in an active database for longer than the time needed to allocate housing, even though the candidates had not obtained rental accommodation. Under the GDPR, the controller is obliged to delete data without undue delay once they are no longer necessary for the purposes for which they were collected or otherwise processed and no other purpose justifies keeping them in an active database.
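
As a rough sketch of the retention principle just described, the following hypothetical example splits uploaded candidate files into those still within a retention period and those that should be purged from the active database. The 90-day period and the data layout are illustrative assumptions, not details from the CNIL decision:

# Hypothetical retention sketch: documents older than the assumed period
# needed to allocate housing are flagged for deletion from the active database.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # illustrative retention period

def split_expired(candidate_files, now=None):
    """Split uploaded files into those still needed and those to delete."""
    now = now or datetime.now(timezone.utc)
    keep, delete = [], []
    for f in candidate_files:
        (delete if now - f["uploaded_at"] > RETENTION else keep).append(f)
    return keep, delete

# Example usage with one old and one recent upload:
files = [
    {"id": "doc-1", "uploaded_at": datetime(2019, 1, 5, tzinfo=timezone.utc)},
    {"id": "doc-2", "uploaded_at": datetime.now(timezone.utc)},
]
still_needed, to_delete = split_expired(files)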

The CNIL imposed a fine of €400,000 and decided to make its sanction public due, inter alia, to the seriousness of the breach, the lack of due diligence by the company and the fact that the documents revealed intimate aspects of people’s lives.

Category: Data breach · French DPA · GDPR

CNIL publishes model regulation on access control through biometric authentication at the workplace

9. April 2019

The French data protection authority CNIL has published a model regulation setting out the conditions under which devices for access control through biometric authentication may be introduced in the workplace.

Pursuant to Article 4 paragraph 14 of the General Data Protection Regulation (GDPR), biometric data are personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person. According to Article 9 paragraph 4 GDPR, the member states of the European Union may introduce or maintain additional conditions, including restrictions, as far as the processing of biometric data is concerned.

The basic requirement under the model regulation is that the controller proves that biometric data processing is necessary. To this end, the controller must explain why the use of other means of identification or organisational and technical safeguards is not appropriate to achieve the required level of security.

Moreover, the choice of biometric types must be specifically explained and documented by the employer. This also includes the justification for the choice of one biometric feature over another. Processing must be carried out for the purpose of controlling access to premises classified by the company as restricted or of controlling access to computer devices and applications.

Furthermore, the model regulation of the CNIL describes which types of personal data may be collected, which storage periods and conditions apply, and which specific technical and organisational measures must be taken to guarantee the security of personal data. In addition, CNIL states that before implementing the data processing, the controller must always carry out an impact assessment of the risks to the rights and freedoms of the individuals concerned. This assessment must be repeated every three years for updating purposes.

The data protection authority also points out that the model regulation does not exempt controllers from compliance with the GDPR, since it is not intended to replace the Regulation’s provisions but to supplement or specify them.

EDPB publishes information note on data transfer in the event of a no-deal Brexit

25. February 2019

The European Data Protection Board has published an information note to explain to organisations how data transfers will be handled and to help them prepare in the event that no agreement is reached between the EEA and the UK. In the case of a no-deal Brexit, the UK will become a third country for which – as things stand at present – no adequacy decision exists.

EDPB recommends that organisations transferring data to the UK carry out the following five preparation steps:

• Identify what processing activities will imply a personal data transfer to the UK
• Determine the appropriate data transfer instrument for your situation
• Implement the chosen data transfer instrument to be ready for 30 March 2019
• Indicate in your internal documentation that transfers will be made to the UK
• Update your privacy notice accordingly to inform individuals

In addition, EDPB explains which instruments can be used to transfer data to the UK:
– Standard or ad hoc Data Protection Clauses approved by the European Commission can be used.
– Binding Corporate Rules for data processing can be defined.
– A code of conduct or certification mechanism can be established.

Derogations are possible in the cases mentioned by article 49 GDPR. However, they are interpreted very restrictively and mainly relate to processing activities that are occasional and non-repetitive. Further explanations on available derogations and how to apply them can be found in the EDPB Guidelines on Article 49 of GDPR.

The French data protection authority CNIL has published an FAQ based on the information note of the EDPB, explaining the consequences of a no-deal Brexit for the data transfer to the UK and which preparations should be made.

CNIL fines Google for violation of GDPR

25. January 2019

On 21st of January 2019, the French Data Protection Authority CNIL imposed a fine of €50 million on Google for lack of transparency, inadequate information and lack of valid consent regarding ads personalization.

On 25th and 28th of May 2018, CNIL received complaints from the associations None of Your Business (“NOYB”) and La Quadrature du Net (“LQDN”). The associations accused Google of not having a valid legal basis to process the personal data of the users of its services.

CNIL carried out online inspections in September 2018, analysing a user’s browsing pattern and the documents he could access.

The committee first noted that the information provided by Google is not easily accessible to users. Essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for ads personalization, is spread across multiple documents. The user receives the relevant information only after carrying out several steps, sometimes up to six. The scheme chosen by Google is therefore not compatible with the General Data Protection Regulation (GDPR). In addition, the committee noted that some information was unclear and not comprehensive, so that users cannot fully understand the extent of the processing carried out by Google. Moreover, the purposes of the processing are described too generally and vaguely, as are the categories of data processed for these purposes. Finally, the user is not informed about the storage periods of some data.

Google has stated that it always seeks the consent of users, in particular for the processing of data to personalise advertisements. However, CNIL declared that the consent was not valid. On the one hand, the consent was based on insufficient information. On the other hand, the consent obtained was neither specific nor unambiguous, as the user gives his or her consent for all processing purposes at once, although the GDPR provides that consent has to be given specifically for each purpose.
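
As a purely illustrative sketch of what purpose-specific consent means in practice, the example below stores a separate, default-off choice per purpose and never grants everything in one step. The purpose names and the data structure are assumptions made for illustration, not details from the decision or from Google’s systems:

# Hypothetical sketch: consent is recorded and checked per purpose,
# never as one blanket flag, and no purpose is pre-ticked by default.
from dataclasses import dataclass, field
from typing import Dict

PURPOSES = ("ads_personalization", "analytics", "location_history")

@dataclass
class ConsentRecord:
    choices: Dict[str, bool] = field(
        default_factory=lambda: {p: False for p in PURPOSES}
    )

    def grant(self, purpose: str) -> None:
        # Each purpose requires its own explicit user action.
        if purpose not in self.choices:
            raise ValueError(f"unknown purpose: {purpose}")
        self.choices[purpose] = True

    def allows(self, purpose: str) -> bool:
        return self.choices.get(purpose, False)

record = ConsentRecord()
record.grant("ads_personalization")           # user opted in to this purpose only
assert record.allows("ads_personalization")
assert not record.allows("location_history")  # other purposes remain refused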

This is the first time CNIL has imposed a penalty under the GDPR. The authority justified the amount of the fine by the gravity of the violations of essential GDPR principles: transparency, information and consent. Furthermore, the infringement was not a one-off, time-limited incident but a continuous breach of the Regulation. In this regard, according to CNIL, the application of the new GDPR sanction limits is appropriate.

Update: Google has since appealed the decision, so a court will have to rule on the fine in the near future.
