Tag: GDPR

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, 19 December, the European Court of Justice’s (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) in cases of personal data transfers to processors situated in third countries.

The case on which the opinion builds originates in proceedings initiated by Mr. Maximillian Schrems, who challenged Facebook’s business practice of transferring the personal data of its European subscribers to servers located in the United States. The case (Schrems I) led the CJEU to invalidate, on October 6, 2015, the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge, on the basis of arguments similar to those raised in Schrems I, the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook has chosen to rely on to legitimize its EU-U.S. data flows. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling (the Schrems II case).

In the newly published opinion, the Advocate General considers the established SCCs valid for commercial transfers, despite the possibility of public authorities in the third country processing the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of a high level of protection is guaranteed not only by an adequacy decision, but equally by the contractual safeguards which the exporter has in place, which must match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers regardless of the third country and the adequacy of its protection. In addition, in light of the Charter, both the controller and the supervisory authority are obliged to suspend any third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, since those only cover the SCCs under Decision 2010/87, taking the question of the Privacy Shield’s validity off the table.

While the Advocate General’s opinion is not binding, it represents a proposed legal solution for the case before the CJEU. The CJEU’s decision on the matter is, however, not expected until early 2020, and the outcome is awaited with great interest.

Irish DPC updates Guidance on Data Processing’s Legal Bases

17. December 2019

The Irish Data Protection Commission (DPC) has updated its guidance on the legal bases for personal data processing. It covers data processing under the European General Data Protection Regulation (GDPR) as well as the processing requirements under the European Law Enforcement Directive.

The main aims of the update are to make companies more aware of their reasons for processing personal data and of the need to choose the right legal basis, as well as to ensure that data subjects are able to determine whether their data is being processed lawfully.

The guidance focuses on the different legal bases in Art. 6 GDPR, namely consent, contract, legal obligation, vital interests, public task and legitimate interests. The Irish DPC states that controllers not only have to choose the right legal basis, but also have to understand the obligations that come with the chosen one, which is why the DPC wanted to go into further detail.

Overall, the guidance is intended to aid both controllers and data subjects. It supports a better understanding of the terminology as well as of the legal requirements the GDPR sets out for processing personal data.

Dutch DPA issues statement regarding cookie consent

12. December 2019

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has recently issued a statement regarding compliance with the rules on cookie consent. According to the statement, the DPA reviewed 175 websites and e-commerce platforms to see whether they meet the requirements for the use of cookies. It found that almost half of the websites and nearly all e-commerce platforms do not meet the requirements for cookie consent.

The data protection authority has contacted the companies concerned and requested them to adjust their cookie usage.

In its statement, the Data Protection Authority also refers to the “Planet49” case of the Court of Justice of the European Union (“CJEU”) and clarifies that pre-ticked boxes do not comply with the obligation to obtain the user’s consent. In addition, a user merely scrolling down the website is not equivalent to consenting to the use of cookies. Cookies that enable websites to track their users always require explicit consent.

Lastly, the DPA recalls that cookie walls, which prevent users who have not consented to the use of cookies from accessing the website, are not permitted.

Category: EU · GDPR · The Netherlands

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition carries vast political as well as societal implications and risks. In its report, the CNIL explicitly stated that the software can yield biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of unlawful use of the technology, the CNIL set out three main requirements in its report. It recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. That extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency of the use and purpose, etc.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on the way the software is used.

While the CNIL wishes to draw red lines for the future use of facial recognition, it has also made clear that it will fulfill its role by supporting stakeholders with any issues that may arise, giving counsel on the legal and methodological aspects of using facial recognition in an experimental stage.

Category: EU · French DPA · GDPR · General

Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for the storage of tenants’ personal data that did not provide for the possibility of removing data that was no longer required. Personal data of tenants were stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore be viewed, even though some of them were years old and no longer served the purpose for which they were originally collected. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, tax, social security and health insurance data, and bank statements.

After the commissioner had urgently recommended changing the archive system during the first inspection in 2017, the company was unable, in March 2019, more than one and a half years after that first inspection and nine months after the GDPR came into force, to demonstrate either a cleansing of its database or legal grounds for the continued storage. Although the company had made preparations to remedy the identified deficiencies, these measures did not bring the storage of personal data into a lawful state. The imposition of a fine was therefore required for a violation of Art. 25 (1) GDPR as well as Art. 5 GDPR for the period between May 2018 and March 2019.

The starting point for the calculation of fines is, among other things, the worldwide annual turnover of the company concerned in the previous year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the legally prescribed framework for assessing the fine for the established data protection violation amounted to approximately 28 million euros.

For the concrete determination of the amount of the fine, the commissioner applied the legal criteria, taking into account all aggravating and mitigating aspects. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an inadmissible manner over a long period of time weighed particularly heavily. However, the fact that the company had taken initial measures to remedy the unlawful situation and had cooperated well with the supervisory authority in formal terms was taken into account as a mitigating factor. Also, as no abusive access to the improperly stored data could be established, a fine in the middle range of the prescribed framework was deemed appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.

German data protection authorities develop fining concept under GDPR

24. October 2019

In a press release, the German Conference of Data Protection Authorities (Datenschutzkonferenz, “DSK”) announced that it is currently developing a concept for the setting of fines in the event of breaches of the GDPR by companies. The goal is to guarantee a systematic, transparent and comprehensible fine calculation.

The DSK clarifies that this concept has not yet been adopted, but is still at the draft stage and will be worked on further. At present, it is being applied alongside ongoing fine proceedings in order to test its practical suitability and accuracy. The concrete decisions, however, remain based on Art. 83 GDPR.

Art. 70 para. 1 lit. k GDPR calls for a harmonization of the setting of fines within Europe, for which guidelines shall be elaborated. For this reason, the DSK draft will be brought into line with the concepts of other EU member states.

A harmonized concept is also currently being negotiated at the European level; it should then, at least in principle, be laid down in a guideline. The DSK has contributed its own considerations on the assessment.

The fine concept will be discussed further on 6th and 7th November. After prior examination, a decision will be taken on whether the concept on the setting of fines shall be published.

Category: Data breach · EU · GDPR

Belgian DPA announces GDPR fine

7. October 2019

The Belgian data protection authority (Gegevensbeschermingsautoriteit) has recently imposed a fine of €10,000 for a violation of the General Data Protection Regulation (GDPR). The case concerns a Belgian shop that offered data subjects only one way to obtain a customer card, namely via the electronic identity card (eID). The eID is a national identification card containing various pieces of information about the cardholder, so the authority considers the use of this information without the customer’s valid consent disproportionate to the service offered.

The authority learnt of the case following a complaint from a customer who was denied a customer card because he did not want to provide his electronic identity card and had instead offered to send the shop his data in writing.

According to the Belgian data protection authority, this practice violates the GDPR in several respects. First, the principle of data minimisation is not respected. This principle requires the controller to limit the duration and quantity of the data processed to what is absolutely necessary for the pursued purpose.

In order to create the customer card, the controller has access to all the data stored on the eID, including the name, address, a photograph and the barcode associated with the national registration number. The authority therefore considers the use of all eID data disproportionate to the creation of a customer card.

The DPA also considers that there is no valid consent to serve as a legal basis. According to the GDPR, consent must be freely given, specific and informed. In this case, however, consent is not freely given, since the customer is offered no alternative. If a customer refuses to use his electronic ID card, he will not receive a customer card and will therefore not be able to benefit from the shop’s discounts and advantages.

In view of these violations, the authority has imposed a fine of €10,000.

Category: Belgian DPA · Belgium · GDPR · General

CJEU rules pre-checked Cookie consent invalid

2. October 2019

The Court of Justice of the European Union (CJEU) ruled on Tuesday, October 1st, that storing Cookies on internet users’ devices requires active consent. The decision concerns the widespread implementation of pre-checked boxes, which the Court found insufficient to fulfill the requirements of lawful consent under the General Data Protection Regulation (GDPR).

The case to be decided concerned a lottery for advertising purposes initiated by Planet49 GmbH. During the participation process, internet users were confronted with two information texts and corresponding checkboxes. In the first information text, users were asked to agree, by ticking the respective checkbox, to being contacted by other companies for promotional offers. The second information text required the user to consent to the installation of Cookies on their devices, but the respective checkbox had already been pre-checked. Users would therefore have needed to uncheck the checkbox if they did not want to give their consent (opt-out).

The Federal Court of Justice in Germany referred questions to the CJEU asking whether such a process of obtaining consent could be lawful under the relevant EU law, and in particular whether valid consent could have been obtained for the storage of information and Cookies on users’ devices by means of such mechanisms.

Answering the questions, the CJEU decided, referring to the relevant provisions of Directive 95/46 and the GDPR that require active behaviour of the user, that pre-ticked boxes cannot constitute valid consent. Furthermore, in a statement following the decision, the CJEU clarified that consent must be specific, and that users should be informed about the storage period of the Cookies as well as about third parties accessing their information. The Court also said that the “decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data.”

As a consequence of the decision, it is very likely that at least half of all websites that fall within the scope of the GDPR will need to consider adjusting their Cookie Banners and, where applicable, their procedures for obtaining consent with regard to performance-related, marketing and advertising Cookies in order to comply with the CJEU’s view on how Cookie usage is to be handled under current data protection law.

Cookies, in general, are small files which are sent to and stored in the browser of a terminal device when the user visits a website. In the case of performance-related, marketing and advertising Cookies, the website provider can access the information that such Cookies have collected about the user on a subsequent visit, in order to, e.g., facilitate navigation on the internet or transactions, or to collect information about user behaviour.

Following the new CJEU decision, there are multiple ways to obtain users’ active consent in a GDPR-compliant manner. In any case, it is absolutely necessary that users be given the opportunity to actively check the boxes themselves. This means that pre-ticked boxes are no longer an option.

With regard to the website controller’s obligation to provide the user with particular information about the storage period and third-party access, one possible approach is to include a passage with Cookie information in the website’s Privacy Policy. Another is to include all the necessary information under a separate tab on the website containing a Cookie Policy. In either case, this information needs to be easily accessible to the user prior to giving consent, either by including it directly within the Cookie Banner or by providing a link therein.
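The consent logic described above can be sketched in code. The following is a minimal, hypothetical illustration (the type and function names are our own, not from any ruling or standard library): non-essential Cookie categories default to "not granted", and consent is only recorded upon an explicit user action, never from a pre-checked default.

```typescript
// Hypothetical sketch of CJEU-compliant consent handling: no category other
// than strictly necessary Cookies is enabled without an explicit opt-in.
type CookieCategory = "necessary" | "performance" | "marketing";

interface ConsentRecord {
  category: CookieCategory;
  granted: boolean;
  timestamp: string; // when the user acted, kept for documentation purposes
}

// Default state: only strictly necessary Cookies are allowed; everything
// else starts unchecked (the opposite of a pre-ticked box).
function defaultConsent(): ConsentRecord[] {
  const categories: CookieCategory[] = ["necessary", "performance", "marketing"];
  return categories.map((category) => ({
    category,
    granted: category === "necessary",
    timestamp: new Date().toISOString(),
  }));
}

// Record consent only as the result of an explicit user action (e.g. a
// click on an unchecked box); scrolling or inactivity never calls this.
function recordConsent(
  records: ConsentRecord[],
  category: CookieCategory,
  granted: boolean
): ConsentRecord[] {
  return records.map((r) =>
    r.category === category
      ? { ...r, granted, timestamp: new Date().toISOString() }
      : r
  );
}

// Gate every Cookie write on the recorded consent state.
function mayStoreCookie(records: ConsentRecord[], category: CookieCategory): boolean {
  return records.some((r) => r.category === category && r.granted);
}
```

In such a design, the banner UI merely renders the (unchecked) state returned by `defaultConsent()` and calls `recordConsent()` on user clicks, so a compliant opt-in flow follows from the data model rather than from per-page checks.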

As the available options vary depending on the types of Cookies used, and in view of the clarification made by the CJEU, it is recommended to review the Cookie activities on websites and the corresponding procedures for informing users about those activities and obtaining consent via the Cookie Banner.

CNIL updates its FAQs for the case of a no-deal Brexit

24. September 2019

The French data protection authority “CNIL” has updated its existing catalogue of questions and answers (“FAQs”) to inform about the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1 November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, data transfer mechanisms will become necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on 1 November. If controllers choose to rely on a derogation admissible under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

As initially reported by the Financial Times, London’s King’s Cross station has come under fire for making use of a live face-scanning system across its 67-acre site. Developer Argent confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods used for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office spaces with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While developer Argent defended the technology, it has not yet explained what the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached the Mayor of London, Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in the case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images to a database used for the facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the repeated statements that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to explicitly demonstrate that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham further said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked a demand for a law about facial recognition, igniting a dialogue about new technologies and future-proofing against the yet unknown privacy issues they may cause.

Category: GDPR · General · UK