Tag: GDPR

CJEU rules pre-checked Cookie consent invalid

2. October 2019

The Court of Justice of the European Union (CJEU) ruled on Tuesday, October 1st, that storing cookies on internet users’ devices requires their active consent. The decision concerns the widespread practice of pre-checked consent boxes, which the Court held insufficient to meet the requirements for lawful consent under the General Data Protection Regulation (GDPR).

The case concerned a promotional lottery organised by Planet49 GmbH. During the participation process, internet users were confronted with two information texts and corresponding checkboxes. In the first information text, users were asked to agree, by ticking the respective checkbox, to being contacted by other companies with promotional offers. The second information text asked users to consent to the installation of cookies on their devices, but the respective checkbox was already pre-checked. Users therefore had to untick the checkbox if they did not wish to give their consent (opt-out).

The German Federal Court of Justice referred questions to the CJEU on whether such a process of obtaining consent is lawful under the relevant EU law, in particular whether valid consent can be obtained for the storage of information and cookies on users’ devices through such a mechanism.

Answering these questions, the CJEU held, referring to the relevant provisions of Directive 95/46 and the GDPR that require active behaviour by the user, that pre-ticked boxes cannot constitute valid consent. In a statement accompanying the decision, the CJEU further clarified that consent must be specific and that users must be informed about the storage period of the cookies as well as about any third parties accessing their information. The Court also noted that the “decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data.”

As a consequence of the decision, it is very likely that at least half of all websites falling within the scope of the GDPR will need to consider adjusting their cookie banners and, where applicable, their procedures for obtaining consent for performance-related, marketing and advertising cookies, in order to comply with the CJEU’s view of how cookie usage is to be handled under current data protection law.

Cookies, in general, are small files which are sent to and stored in the browser of a terminal device when a user visits a website. In the case of performance-related, marketing and advertising cookies, the website provider can access the information that such cookies collected about the user on a subsequent visit, in order, for example, to facilitate navigation or transactions, or to collect information about user behaviour.

Following the new CJEU decision, there are several ways to obtain users’ active consent in a GDPR-compliant manner. In any case, it is essential to give users the opportunity to actively tick the boxes themselves; pre-ticked boxes are no longer an option.
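The Court’s requirement of active behaviour can be reduced to a simple rule: a ticked box counts as consent only if the tick resulted from the user’s own action, never from a pre-set default. The following minimal sketch illustrates that rule; all names are hypothetical and not taken from any specific consent tool.

```python
from dataclasses import dataclass

@dataclass
class CookieChoice:
    """State of one consent checkbox in a (hypothetical) cookie banner."""
    category: str     # e.g. "marketing" or "analytics"
    checked: bool     # current state of the box
    user_acted: bool  # True only if the user themselves ticked the box

def has_valid_consent(choice: CookieChoice) -> bool:
    # Valid consent requires an active opt-in: the box is checked AND
    # the check came from the user's own action, never a pre-ticked default.
    return choice.checked and choice.user_acted

# A pre-ticked box the user never touched does not count as consent:
pre_ticked = CookieChoice("marketing", checked=True, user_acted=False)
opted_in = CookieChoice("marketing", checked=True, user_acted=True)
```

Under this rule, `pre_ticked` yields no valid consent while `opted_in` does, mirroring the CJEU’s distinction between a default setting and an active opt-in.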

With regard to the website controller’s obligation to provide users with specific information about the storage period and third-party access, one option is to include a passage with cookie information in the website’s privacy policy. Another is to place all the necessary information in a separate cookie policy under its own tab on the website. In either case, this information must be easily accessible to the user prior to giving consent, either by including it directly in the cookie banner or by providing a link there.

As there are various options depending on the types of cookies used, and in light of the CJEU’s clarification, it is advisable to review the cookie activities on websites and the corresponding procedures for informing users about those activities and obtaining consent via the cookie banner.

CNIL updates its FAQs for the case of a no-deal Brexit

24. September 2019

The French data protection authority “CNIL” updated its existing catalogue of questions and answers (“FAQs”) to explain the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1st of November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, after the exit, data transfer mechanisms become necessary to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of November 1, 2019.
4. Update your internal documents to include transfers to the United Kingdom as of November 1, 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on 1st of November. If controllers choose a derogation admissible under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

Initially reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Argent, the site’s developer, confirmed that the system has been used to ensure public safety as part of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes, restaurants and office spaces, with tenants including Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent defended the technology, it has not yet explained what exactly the system is, how it is used or how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images for a database used to carry out facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the firm assertions that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and future-proofing against the as yet unknown privacy issues they may cause.

Category: GDPR · General · UK

Greek Parliament passes bill to adopt GDPR into National Law

29. August 2019

On Monday, August 26th, the Greek Parliament passed a bill incorporating the European Union’s General Data Protection Regulation (GDPR) into national law. The adoption of the EU rules was originally supposed to take place by May 6, 2018, a deadline Greece failed to meet.

The now fast-paced implementation of the regulation may have come as a result of the European Commission (EC) referring Greece and Spain to the European Court of Justice on July 25th. Since both had failed to adopt the rules into national law by then, Greece could have faced a fine of €5,287.50 for every day since May 6, in addition to a stiff fine of €1.3 million. In its statement, the EC declared that “the lack of transposition by Spain and Greece creates a different level of protection of peoples’ rights and freedoms, and hampers data exchanges between Greece and Spain on one side and other Member States, who transposed the Directive, on the other side”.

EU countries are allowed to adopt certain derogations, exceptions and specifications under the GDPR. Greece has done so in the approved bill, with adjusted provisions regarding the age of consent, the process of appointing a Data Protection Officer, sensitive data processing, data repurposing, data deletion, certifications and criminal sanctions.

The legislation was approved by New Democracy, the main opposition SYRIZA, the center-left Movement for Change and leftist MeRA25, with an overwhelming majority. The GDPR has already been in effect since May 25th, 2018, with its main aim being to offer more control to individuals over their personal data that they provide to companies and services.


Category: EU · EU Commission · GDPR · General

Study shows behavior patterns of internet users regarding cookies

15. August 2019

Research has been carried out to see how European consumers interact with cookie consent mechanisms online.

The study focused in particular on how consumers react to different designs of cookie pop-ups and how design choices can influence users’ data protection decisions. The researchers collected around 5,000 cookie notices from leading websites to get an idea of how different cookie consent mechanisms are currently implemented. They also worked with a German e-commerce site over a period of four months to investigate how more than 82,000 users of the site interacted with different cookie consent designs, which the researchers varied in order to analyse how different placements and options affect individuals’ decisions.

Their research showed that the majority of cookie consent notices are placed at the bottom of the screen (58%), do not block interaction with the site (93%) and offer no option other than a confirmation button (86%), leaving the user no real choice.

The majority (57%) also try to nudge users into consenting through design, for example by highlighting the “Agreed” button in colour while making the link to “more options” less visible.

The research showed that interaction with consent notices varied widely, between 5% and 55%, depending on their position and the options offered. More users clicked the “Accept” button when it was highlighted in colour (50.8% on mobile, 26.9% on desktop); only 39.2% on mobile and 21.1% on desktop did so when it was displayed as a text link. As for third parties, around 30% of mobile users and 10% of desktop users accepted all third parties when the checkboxes were preselected, whereas only a small fraction (< 0.1%) allowed all third parties when given an opt-in choice.

They also found that the more choices are given in a cookie notice, the more likely it is that the visitor will refuse the use of cookies. In fact, they concluded that with GDPR-compliant cookie notices, where the website user is free to choose which cookie categories to activate, only 0.1% of users would allow all cookie categories to be set.

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) plans to consult on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that people understand.

With the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target their voters, drawing on social media, the electoral register or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with such innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO acknowledges that its current guidance is outdated, as it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework does not establish new requirements for campaigners, however; instead it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from a range of people and organisations. The framework is intended to take into account the responses the ICO received in that process.

Intended to form the basis of a statutory code of practice should the relevant legislation be introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL as well as the British data protection authority ICO have revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be equally applicable to any device that stores or accesses information. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll through a website cannot be considered consent. Likewise, obtaining consent via the terms and conditions is not lawful: this practice violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language. Furthermore, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.

With regard to the territorial scope, CNIL clarifies that the cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing takes place in France. The English guideline does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by means of a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; in this respect, the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary, namely where the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, on the other hand, does not exempt even analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only start to run upon the publication of a statement by CNIL, which is still pending; CNIL expects this statement to be finalised during the first quarter of 2020. The ICO does not provide for such a transition period.

CNIL fines French insurance company

26. July 2019

The French Data Protection Authority (CNIL) imposed a €180,000 fine on a French insurance company for violating the security of customer data on its website.

Active Assurance is an insurance intermediary and distributor of motor insurances to customers. On their website, people can request offers, subscribe to contracts and access their personal space.

In 2018, CNIL received a complaint from an Active Assurance customer who said that he had been able to access other users’ data. The other accounts were accessible via hypertext links indexed by a search engine, and customers’ documents could also be reached by slightly changing the URL. Among those records were drivers’ licences, bank statements and documents revealing whether someone had been subject to a licence withdrawal or involved in a hit-and-run.

CNIL informed the company of the violations, and a few days later the company stated that measures had been taken to rectify them. After an on-site audit at the company’s premises, however, CNIL found that the measures taken were insufficient and that Active Assurance had violated Art. 32 GDPR. Active Assurance should have ensured that only authorized persons had access to the documents; it should also have instructed customers to use strong passwords and should not have sent them their passwords in plain text by e-mail.
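The weakness described above, where documents belonging to other customers could be reached simply by guessing a URL, is commonly known as an insecure direct object reference. The usual remedy is a server-side ownership check on every request. The sketch below illustrates the idea; all names are illustrative assumptions, not Active Assurance’s actual code.

```python
class AccessDenied(Exception):
    """Raised when a user requests a document they do not own."""
    pass

# Hypothetical store mapping document IDs to their owning customer.
DOCUMENT_OWNERS = {
    "doc-1001": "customer-alice",
    "doc-1002": "customer-bob",
}

def fetch_document(document_id: str, authenticated_user: str) -> str:
    """Serve a document only to its owner, no matter how the request
    was constructed (URL guessing included)."""
    owner = DOCUMENT_OWNERS.get(document_id)
    if owner is None or owner != authenticated_user:
        # Deny without revealing whether the document exists at all.
        raise AccessDenied("not authorised for this document")
    return f"contents of {document_id}"
```

In a real deployment this check would sit behind proper session authentication, and document identifiers would ideally be non-guessable (e.g. random UUIDs) as an additional layer of defence, not as a substitute for the ownership check.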

Based on the seriousness of the breach and the number of people affected, CNIL imposed a fine of €180,000.

Hackers steal millions of Bulgarians’ financial data

18. July 2019

After a cyberattack on Bulgaria’s tax agency (NRA), millions of taxpayers’ financial data were stolen. It is estimated that most working adults in the country of 7 million are affected, with at least some of their data compromised. The stolen data included names, addresses, income and social security information.

The attack happened in June, but an e-mail from the self-proclaimed perpetrator was sent to Bulgarian media on Monday. It stated that more than 110 of the agency’s databases had been compromised, with the hacker calling the NRA’s cybersecurity a parody. The Bulgarian media were also offered access to the stolen data. One stolen file, e-mailed to the newspaper 24 Chasa, contained up to 1.1 million personal identification numbers together with income, social security and healthcare figures.

The country’s finance minister, Vladislav Goranov, has apologized in parliament and to the Bulgarian citizens, adding that about 3% of the tax agency’s database had been affected. He made clear that anyone attempting to exploit the stolen data would be prosecuted under Bulgarian law.

As a result of the attack, the Bulgarian tax agency now faces a fine of up to 20 million euros from the Commission for Personal Data Protection (CPDP). In addition, the incident has reignited an old debate about Bulgaria’s lax cybersecurity standards and the need to bring them up to date.

Google data breach notification sent to IDPC

Google may face further investigation under the General Data Protection Regulation (GDPR) after unauthorized audio recordings were forwarded to subcontractors. The Irish Data Protection Commission (IDPC) confirmed through a spokesperson that it received a data breach notification concerning the issue last week.

The recordings were exposed by the Belgian broadcaster VRT and are said to comprise around 1,000 clips of conversations recorded in Belgium and the Netherlands. Logged by Google Assistant, the recordings were sent to Google’s subcontractors for review. At least 153 of them were not triggered by Google’s wake phrase “Ok/Hey, Google” and were never meant to be recorded in the first place. They contained personal data ranging from family conversations and bedroom chatter to business calls involving confidential information.

Google has addressed this violation of its data security policies in a blog post. It said that the audio recordings were sent to language experts, who understand nuances and accents, in order to refine Google Home’s linguistic abilities, a critical part of the process of building speech technology. Google stressed that the storing of recorded data on its services is turned off by default and that audio is only sent to Google once the wake phrase is said. The recordings in question were most likely triggered by users saying a phrase that sounded similar to “Ok/Hey, Google,” thereby activating Google Assistant unintentionally.

According to Google’s statement, its security and privacy teams are working on the issue and will fully review its safeguards to prevent this sort of incident from happening again. If, however, the IDPC’s investigation finds a GDPR violation, it could result in a significant financial penalty for the tech giant.
