Category: GDPR

Belgian DPA announces GDPR fine

7. October 2019

The Belgian data protection authority (Gegevensbeschermingsautoriteit) has recently imposed a fine of €10,000 for a violation of the General Data Protection Regulation (GDPR). The case concerns a Belgian shop that offered data subjects only one way to obtain a customer card: presenting the electronic identity card (eID). The eID is a national identification card that contains various pieces of information about the cardholder, so the authority considers the use of this information without the customer's valid consent disproportionate to the service offered.

The authority had learnt of the case following a complaint from a customer who was denied a customer card because he did not want to provide his electronic identity card. Instead, he had offered to provide his data to the shop in writing.

According to the Belgian data protection authority, this practice violates the GDPR in several respects. First, the principle of data minimisation is not respected. It requires that the controller limit both the quantity of data processed and the duration of processing to what is strictly necessary for the purpose pursued.

In order to create the customer card, the controller has access to all the data stored on the eID, including name, address, a photograph and the barcode associated with the national registration number. The Authority therefore believes that the use of all eID data is disproportionate to the creation of a customer card.

The DPA also considers that there is no valid consent as a legal basis. According to the GDPR, consent must be freely given, specific and informed. However, there is no freely given consent in this case, since no alternative is offered to the customer. If a customer refuses to use his electronic ID card, he will not receive a customer card and will therefore not be able to benefit from the shop’s discounts and advantages.

In view of these violations, the authority has imposed a fine of €10,000.

Category: Belgian DPA · Belgium · GDPR · General

CJEU rules pre-checked Cookie consent invalid

2. October 2019

The Court of Justice of the European Union (CJEU) ruled on Tuesday, October 1st, that storing Cookies on internet users’ devices requires active consent. The decision concerns the widely used practice of pre-checked boxes, which the Court found insufficient to fulfil the requirements of lawful consent under the General Data Protection Regulation (GDPR).

The case concerned a lottery for advertising purposes initiated by Planet49 GmbH. During the participation process, internet users were confronted with two information texts and corresponding checkboxes. In the first information text, users were asked to agree, by ticking the respective checkbox, to be contacted by other companies for promotional offers. The second information text required users to consent to the installation of Cookies on their devices, while the respective checkbox had already been pre-checked. Users would therefore have needed to uncheck the box if they did not agree to give their consent (opt-out).

The German Federal Court of Justice referred questions to the CJEU on whether such a process of obtaining consent could be lawful under the relevant EU law, in particular whether valid consent could be obtained for the storage of information and Cookies on users’ devices by means of such mechanisms.

Answering the questions, the CJEU decided, referring to the relevant provisions of Directive 95/46 and the GDPR that require an active behaviour of the user, that pre-ticked boxes cannot constitute a valid consent. Furthermore, in a statement following the decision, the CJEU clarified that consent must be specific, and that users should be informed about the storage period of the Cookies, as well as about third parties accessing users’ information. The Court also said that the “decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data.”

As a consequence of the decision, many websites that fall within the scope of the GDPR will likely need to adjust their Cookie banners and, if applicable, their procedures for obtaining consent for performance-related, marketing and advertising Cookies in order to comply with the CJEU’s view on how Cookie usage is to be handled under current data protection law.

Cookies, in general, are small files which are sent to and stored in the browser of a terminal device when a user visits a website. In the case of performance-related, marketing and advertising Cookies, the website provider can access the information such Cookies collected about the user on a subsequent visit, in order to, e.g., facilitate navigation or transactions, or to collect information about user behaviour.
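Mechanically, cookies travel between browser and server as simple name=value pairs joined by semicolons. A minimal sketch of reading such a string back into key/value pairs (the cookie names here are purely illustrative, not taken from the case):

```javascript
// Minimal sketch: parse a Cookie-header / document.cookie-style string
// ("name=value; name2=value2") into an object of name -> value pairs.
function parseCookies(cookieString) {
  const jar = {};
  for (const part of cookieString.split(";")) {
    const trimmed = part.trim();
    if (!trimmed) continue;            // skip empty fragments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue;           // skip malformed fragments without "="
    const name = trimmed.slice(0, eq).trim();
    const value = decodeURIComponent(trimmed.slice(eq + 1).trim());
    jar[name] = value;
  }
  return jar;
}

// Example: a functional session cookie alongside a marketing cookie
const jar = parseCookies("session_id=abc123; ad_tracker=xyz");
console.log(jar.session_id); // "abc123"
```

The same flat format carries both strictly necessary cookies and the tracking cookies the ruling is concerned with, which is why consent mechanisms typically distinguish by cookie category rather than by format.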

Following the new CJEU decision, there are several ways to obtain users’ active consent in a GDPR-compliant manner. In any case, it is absolutely necessary to give users the possibility of actively checking the boxes themselves, which means that pre-ticked boxes are no longer an option.
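The core of the ruling, opt-in rather than opt-out, can be modelled in a few lines. This is a hypothetical sketch of the decision logic (the field names are illustrative, not from the judgment or any statute):

```javascript
// Hypothetical model of the CJEU's consent requirement: consent is valid
// only if the box started unchecked AND the user actively ticked it.
function isConsentValid(checkbox) {
  // A pre-ticked box fails: the default state must not already grant consent,
  // because leaving it checked is not an active behaviour of the user.
  if (checkbox.prechecked) return false;
  // Active opt-in required: the user must have ticked the box themselves.
  return checkbox.userTicked === true;
}

// Opt-out banner (pre-ticked): invalid regardless of user interaction
console.log(isConsentValid({ prechecked: true, userTicked: false }));  // false
// Opt-in banner, box ticked by the user: valid consent
console.log(isConsentValid({ prechecked: false, userTicked: true }));  // true
// Opt-in banner, box left unticked: no consent, so no Cookies may be set
console.log(isConsentValid({ prechecked: false, userTicked: false })); // false
```

In practice this means a banner must not set performance, marketing or advertising Cookies until the affirmative action has actually occurred.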

With regard to the website controller’s obligation to provide the user with particular information about the storage period and third-party access, one possible approach is to include a passage on Cookies within the website’s Privacy Policy. Another is to place all the necessary information under a separate tab on the website containing a Cookie Policy. In either case, this information needs to be easily accessible to the user prior to giving consent, either by including it directly within the Cookie banner or by providing a link therein.

As the options vary depending on the types of Cookies used, and in light of the clarification made by the CJEU, it is recommended to review the Cookie activities on websites and the corresponding procedures for informing users about those activities and obtaining consent via the Cookie banner.

CNIL updates its FAQs for case of a No-Deal Brexit

24. September 2019

The French data protection authority “CNIL” has updated its existing catalogue of questions and answers (“FAQs”) to inform about the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1 November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, data transfer mechanisms will become necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on 1 November 2019. If controllers choose one of the derogations admissible under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

Initially reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Site developer Argent confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes, restaurants and office spaces, with tenants such as Google.

The controversy around the technology and its legality stems from the fact that it records everyone within its perimeter without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent has defended the technology, it has not yet explained what exactly the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images for a database used to carry out facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the statements that the software abides by England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to explicitly demonstrate that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked a demand for a law about facial recognition, igniting a dialogue about new technologies and future-proofing against the yet unknown privacy issues they may cause.

Category: GDPR · General · UK

Greek Parliament passes bill to adopt GDPR into National Law

29. August 2019

On Monday, August 26th, the Greek Parliament passed a bill incorporating the European Union’s General Data Protection Regulation (GDPR) into national law. Originally, the adaptation of the EU rules was supposed to take place by May 6, 2018, but Greece failed to comply with the deadline.

The now fast-paced implementation of the regulation may have come as a result of the European Commission (EC) referring Greece and Spain to the European Court of Justice on July 25th. Since it had failed to adopt the rules into national law by then, Greece could have faced a fine of €5,287.50 for every day passed since May 6, in addition to a stiff fine of €1.3 million. In its statement, the EC declared that “the lack of transposition by Spain and Greece creates a different level of protection of peoples’ rights and freedoms, and hampers data exchanges between Greece and Spain on one side and other Member States, who transposed the Directive, on the other side”.

EU countries are allowed to adopt certain derogations, exceptions and specifications under the GDPR. Greece has done so in the approved bill, with adjusted provisions with regard to the age of consent, the process of appointing a Data Protection Officer, sensitive data processing, data repurposing, data deletion, certifications and criminal sanctions.

The legislation was approved by New Democracy, the main opposition SYRIZA, the center-left Movement for Change and leftist MeRA25, with an overwhelming majority. The GDPR has already been in effect since May 25th, 2018, with its main aim being to offer more control to individuals over their personal data that they provide to companies and services.


Category: EU · EU Commission · GDPR · General

Swedish DPA imposes its first GDPR fine

23. August 2019

The Swedish Data Protection Authority “datainspektionen” has imposed its first fine since the General Data Protection Regulation (GDPR) entered into force.

Affected is a high school in Skellefteå in northern Sweden, where 22 pupils were part of a pilot programme to monitor attendance times using facial recognition.

In January 2019, the IT company Tieto announced that it was testing the registration of students’ presence at the school with tags, smartphone apps and facial recognition software. In Sweden, it is mandatory for teachers to report the presence of all students in each lesson to the supervisors. According to Tieto, teachers at the school spend around 18,000 hours a year on this registration. Therefore, one class was selected for an eight-week pilot project testing registration via facial recognition. Parents and students were asked to give their consent.

However, the Swedish data protection authority has now found that the way in which consent was obtained violates the GDPR because of the clear imbalance between controller and data subject. Additionally, the school failed to conduct an impact assessment, including seeking prior consultation with datainspektionen.

Therefore, the DPA imposed a fine of SEK 200,000 (approximately EUR 20,000). In Sweden, public authorities can be fined up to SEK 20,000,000 (approximately EUR 1,000,000).

Irish DPC releases guide on Data Breach Notifications

15. August 2019

On Monday, the Irish Data Protection Commission (IDPC) released a quick guide on Data Breach Notifications. It is supposed to help controllers understand their notification and communication obligations, both towards the responsible DPC and towards the data subjects.

The guide, intended as a quick overview of the requirements and obligations that fall on data controllers, refers to the much more detailed guidance of the Article 29 Working Party (now the European Data Protection Board, EDPB) in its guidelines on Data Breach Notifications.

In summary, the IDPC categorizes a Data Breach as a “security incident that negatively impacts the confidentiality, integrity or availability of personal data; meaning that the controller is unable to ensure compliance with the principles relating to the processing of personal data as outlined in Art. 5 GDPR”. In this case, it falls to the controller to follow two primary obligations: (1) to notify the responsible DPC of the data breach, unless it is unlikely to result in a risk for the data subject, and (2) to communicate the data breach to the affected data subjects, when it is likely to result in a high risk.
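The two obligations hinge on different risk thresholds, which is easy to misread. A hypothetical sketch of that decision logic (the risk labels are illustrative shorthand, not terms from the guide):

```javascript
// Hypothetical model of the two obligations described above.
// riskLevel is an illustrative label: "unlikely", "risk", or "high".
function breachObligations(riskLevel) {
  return {
    // (1) Notify the responsible DPC, unless the breach is unlikely
    //     to result in a risk for data subjects.
    notifyDpc: riskLevel !== "unlikely",
    // (2) Communicate the breach to affected data subjects only when
    //     it is likely to result in a HIGH risk.
    notifySubjects: riskLevel === "high",
  };
}

// A "medium" risk triggers DPC notification but not subject communication
console.log(breachObligations("risk")); // { notifyDpc: true, notifySubjects: false }
```

Note the asymmetry: the DPC threshold is low (anything above "unlikely"), while the data-subject threshold is high, so many breaches must be notified to the DPC without any communication to individuals.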

The IDPC seeks to help controllers by providing a list of requirements for notifications to the DPC and to data subjects, especially given the tight timeframe requiring notification within 72 hours of becoming aware of the breach. The guide aims to eliminate confusion arising in the process, as well as problems that companies have had when filing Data Breach Notifications in the past.

Study shows behavior patterns of internet users regarding cookies

Research has been carried out to see how European consumers interact with the cookie consent mechanisms online.

The study focused in particular on how consumers react to different designs of cookie pop-ups and how different design choices can influence users’ data protection choices. The researchers collected around 5,000 cookie notices from leading websites to get an idea of how different cookie consent mechanisms are currently being implemented. They also worked with a German e-commerce site over a period of four months to investigate how more than 82,000 users of the site interacted with the different cookie consent designs. The designs were adapted by the researchers to analyse how different preferences and designs affect the individual’s decision.

Their research showed that the majority of cookie consent notices are placed at the bottom of the screen (58%), do not block interaction with the site (93%) and offer no other option than the confirmation button (86%), leaving the user no choice.

The majority (57%) also tries to obtain users’ consent through the design, for example by highlighting the “Agree” button with a colour while making the link to “more options” less visible.

The research showed that interaction with consent notices varied widely, between 5-55%, depending on where they were positioned or what preferences were set. More users clicked the “Accept” button when it was highlighted by color (50.8% on mobile, 26.9% on desktop). In contrast, only 39.2% on mobile and 21.1% on desktop clicked the “Accept” button if it was displayed as a text link. As for third parties, around 30% of mobile users and 10% of desktop users accepted all third parties if the checkboxes were preselected. On the other hand, only a small fraction (< 0.1%) allowed all third parties when given the opt-in choice.

They also found that the more choices are given in a cookie notice, the more likely it is that the visitor will refuse the use of cookies. In fact, they concluded that with GDPR-compliant cookie notices, where the website user is free to choose which cookie categories to activate, only 0.1% of users would allow all cookie categories to be set.

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that is understood by the people.

With the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target their voters, drawing on social media, the electoral register or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with such innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO notes that its current guidance is outdated, since it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework does not establish new requirements for campaigners, however; instead, it aims at explaining and clarifying data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from a range of people and organisations. The framework is intended to take into account the responses the ICO received in the process.

Intended to form the basis of a statutory code of practice if the relevant legislation is introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.

EDPB adopts Guidelines on processing of personal data through video devices

13. August 2019

Recently, the EDPB adopted its Guidelines on processing of personal data through video devices (“the guidelines”). The guidelines provide assistance on how to apply the GDPR to processing through video devices, with several examples which are not exhaustive but applicable to all areas of video device use.

In a first step, the guidelines set the scope of application. The GDPR applies to the use of video devices only if

  • personal data is collected through the video device (e.g. a person is identifiable on the basis of their looks or other specific elements),
  • the processing is not carried out by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and
  • the so-called “household exemption” does not apply (processing by a natural person in the course of a purely personal or household activity).

Before processing personal data through video devices, controllers must specify their legal basis for it. According to the guidelines, every legal ground under Article 6 (1) can provide a legal basis. The purposes for using video devices for processing personal data should be documented in writing and specified for every camera in use.

Another subject of the guidelines is the transparency of the processing. The controllers have to inform data subjects about the video surveillance. The EDPB recommends a layered approach and combining several methods to ensure transparency. The most important information should be written on the warning sign itself (first layer) and the other mandatory details may be provided by other means (second layer). The second layer must also be easily accessible for data subjects.

The guidelines also deal with storage periods and technical and organizational measures (TOMs). Some member states may have specific provisions for storing video surveillance footage, but it is recommended to delete the personal data after a few days, ideally automatically. As with any kind of data processing, the controller must adequately secure it and must therefore have implemented technical and organizational measures. Examples provided are masking or scrambling areas that are not relevant to surveillance, or editing out images of third persons when providing video footage to data subjects.

The guidelines are open for public consultation until September 9th, 2019, and a final, revised version is planned for the end of 2019.
