Category: GDPR

ICO releases a draft Code of Practice for consultation on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a way that is transparent, lawful and understood by the public.

With the rise of the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target voters, drawing on social media, the electoral register, or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risks that come with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data in ways people do not understand.

In this light, the ICO notes that its current guidance is outdated: it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. However, the framework does not establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018 to gather input from a range of people and organisations. The framework is intended to take into account the responses the ICO received in that process.

The draft framework code of practice, which could form the basis of a statutory code of practice if the relevant legislation is introduced, is now out for public consultation and will remain open until October 4th.

EDPB adopts Guidelines on processing of personal data through video devices

13. August 2019

Recently, the European Data Protection Board (EDPB) adopted its Guidelines on processing of personal data through video devices (“the guidelines”). The guidelines provide assistance on how to apply the GDPR to processing through video devices, illustrated with several examples which are not exhaustive but are applicable to all areas in which video devices are used.

As a first step, the guidelines define the scope of application. The GDPR applies to the use of video devices only if

  • personal data is collected through the video device (e.g. a person is identifiable on the basis of their looks or other specific elements),
  • the processing is not carried out by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and
  • the so-called “household exemption” does not apply (processing by a natural person in the course of a purely personal or household activity).

Before processing personal data through video devices, controllers must identify a legal basis for doing so. According to the guidelines, every legal ground under Article 6(1) GDPR can provide such a basis. The purposes for using video devices to process personal data should be documented in writing and specified for every camera in use.

Another subject of the guidelines is the transparency of the processing. Controllers must inform data subjects about the video surveillance. The EDPB recommends a layered approach, combining several methods to ensure transparency. The most important information should appear on the warning sign itself (first layer), while the other mandatory details may be provided by other means (second layer). The second layer must also be easily accessible to data subjects.

The guidelines also deal with storage periods and technical and organizational measures (TOMs). Some member states may have specific provisions for storing video surveillance footage, but it is recommended to delete the personal data after a few days, ideally automatically. As with any kind of data processing, the controller must adequately secure the data and must therefore have implemented technical and organizational measures. Examples provided include masking or scrambling areas that are not relevant to surveillance, or editing out images of third persons when providing video footage to data subjects.
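By way of illustration only, a minimal sketch of such a masking step is shown below. It assumes video frames are available as NumPy arrays; the frame size and region coordinates are hypothetical and would in practice follow the controller's documented surveillance purpose, not this example.

```python
import numpy as np

def mask_region(frame: np.ndarray, top: int, left: int, bottom: int, right: int) -> np.ndarray:
    """Return a copy of the frame with the given rectangle irreversibly blacked out."""
    masked = frame.copy()
    masked[top:bottom, left:right] = 0  # overwrite pixels so the masked area cannot be recovered
    return masked

# Hypothetical example: black out a neighbouring entrance that falls outside the
# documented surveillance purpose before the footage is stored or handed out.
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a captured video frame
redacted = mask_region(frame, top=0, left=400, bottom=200, right=640)
```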

The guidelines are open for public consultation until September 9th, 2019; a revised final version is planned for the end of 2019.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL as well as the British data protection authority ICO have revised and published their guidelines on cookies.

The two sets of guidelines share several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be equally applicable to other technologies that store or access information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continued scrolling of a website cannot be considered consent. Likewise, obtaining consent through terms and conditions is not lawful. This practice violates Art. 7(2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
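By way of illustration only, the sketch below shows one way a site could gate a non-essential cookie behind an explicit opt-in, which is the behaviour both authorities describe. It is a minimal Flask example; the route names, cookie names and lifetimes are hypothetical and are not taken from either guideline.

```python
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/consent", methods=["POST"])
def give_consent():
    # Reached only via an explicit "Accept" action in a consent banner;
    # scrolling or simply continuing to browse never triggers this route.
    resp = make_response("consent recorded")
    resp.set_cookie("cookie_consent", "granted", max_age=60 * 60 * 24 * 180)
    return resp

@app.route("/")
def index():
    # Non-essential (e.g. analytics) cookies are only written once the visitor
    # has actively opted in; without consent the response sets no such cookie.
    consented = request.cookies.get("cookie_consent") == "granted"
    resp = make_response("analytics enabled" if consented else "no analytics cookies set")
    if consented:
        resp.set_cookie("analytics_id", "abc123", max_age=60 * 60 * 24 * 30)
    return resp

if __name__ == "__main__":
    app.run()
```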

With regard to territorial scope, CNIL clarifies that the cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO’s guidance does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced through a cookie wall is probably not valid, but notes that the GDPR must be balanced against other rights; to that extent, the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary: it can be dispensed with if the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, on the other hand, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only begin to run once CNIL publishes a statement, which is still pending. CNIL expects this statement to be finalised during the first quarter of 2020. The ICO does not provide for such a time limit.

Hackers steal millions of Bulgarians’ financial data

18. July 2019

Following a cyberattack on the Bulgarian tax agency (NRA), millions of taxpayers’ financial records have been stolen. It is estimated that most working adults in the country of 7 million people have had some of their data compromised. The stolen data included names, addresses, income and social security information.

The attack happened in June, but an e-mail from the self-proclaimed perpetrator was only sent to Bulgarian media on Monday. It stated that more than 110 of the agency’s databases had been compromised, with the hacker calling the NRA’s cybersecurity a parody. The Bulgarian media were also offered access to the stolen data. One stolen file, e-mailed to the newspaper 24 Chasa, contained up to 1.1 million personal identification numbers along with income, social security and healthcare figures.

The country’s finance minister, Vladislav Goranov, has apologized in parliament and to Bulgarian citizens, adding that about 3% of the tax agency’s database had been affected. He made clear that anyone attempting to exploit the stolen data would be prosecuted under Bulgarian law.

As a result of the attack, the Bulgarian tax agency now faces a fine of up to 20 million euros from the Commission for Personal Data Protection (CPDP). In addition, the incident has reignited an old debate about Bulgaria’s lax cybersecurity standards and the need to bring them up to date.

Google data breach notification sent to IDPC

Google may face further investigations under the General Data Protection Regulation (GDPR) after unauthorized audio recordings were forwarded to subcontractors. The Irish Data Protection Commission (IDPC) confirmed through a spokesperson that it received a data breach notification concerning the issue last week.

The recordings were exposed by the Belgian broadcaster VRT and are said to comprise around 1,000 clips of conversations from Belgium and the Netherlands. Logged by Google Assistant, the recordings were then sent to Google’s subcontractors for review. At least 153 of those recordings were not triggered by Google’s wake phrase “Ok/Hey, Google” and were never meant to be recorded in the first place. They contained personal data ranging from family conversations and bedroom chatter to business calls involving confidential information.

Google has addressed this violation of its data security policies in a blog post. It said that the audio recordings were sent to language experts, who understand nuances and accents, in order to refine Google Home’s linguistic abilities, a critical part of building speech technology. Google stresses that the storage of recorded data on its services is turned off by default, and that devices only send audio data to Google once the wake phrase is said. The recordings in question were most likely triggered by users saying a phrase that sounded similar to “Ok/Hey, Google,” thereby confusing Google Assistant and turning it on.

According to Google’s statement, its security and privacy teams are working on the issue and will fully review its safeguards to prevent this sort of misconduct from happening again. If, however, further investigations by the IDPC uncover a GDPR violation, this could result in a significant financial penalty for the tech giant.

Hearing on the legal challenge to SCCs and the EU-US Privacy Shield before the CJEU

17. July 2019

On Tuesday last week, the Court of Justice of the European Union (CJEU) held the hearing in Case C-311/18, commonly known as “Schrems II”, following a complaint to the Irish Data Protection Commission (DPC) by Maximilian Schrems about the transfer of his personal data from Facebook Ireland to Facebook in the U.S. The case deals with two consecutive questions. The first is whether U.S. law, specifically the Foreign Intelligence Surveillance Act (FISA), which provides a legal basis for national security agencies to access the personal data of citizens of the European Union (EU), violates EU data protection law. If confirmed, this would raise the second question, namely whether the current legal data transfer mechanisms could be invalid (we already reported on the background).

If both the EU-US Privacy Shield and the EU Standard Contractual Clauses (SCCs), currently the most commonly used transfer mechanisms, were ruled invalid, businesses would face a complex and difficult scenario. As Gabriela Zanfir-Fortuna, senior counsel at the Future of Privacy Forum, said, the hearing could have an even greater impact than the first Schrems/EU-US Safe Harbor case, because this time it could affect not only data transfers from the EU to the U.S., but transfers from the EU to all countries around the world that rely on the SCCs.

Facebook’s lawyer, Paul Gallagher, argued along the same lines. He told the CJEU that if the SCCs were held invalid, “the effect on trade would be immense.” He added that not all U.S. companies are covered by FISA, which is the law that would allow them to provide law enforcement agencies with EU personal data. In particular, Facebook could not be held responsible for unduly handing personal data over to national security agencies, as there was no evidence of this.

Eileen Barrington, the lawyer for the US government, referred to a “hypothetical scenario” in which the US would tap data streams from a cable in the Atlantic and assured the court that this was not about “undirected” mass surveillance but about “targeted” collection of data – a lesson learned from the Snowden revelations, after which the US wanted to regain the trust of Europeans. Only suspicious material would be filtered out using specific selectors. She also had a message regarding Europeans’ sense of security: “It has been proven that there is an essential benefit to the signals intelligence of the USA – for the security of American as well as EU citizens.”

The crucial factor for the outcome of the proceedings is likely to be how the CJEU assesses the availability of effective legal remedies for EU data subjects. Throughout the hearing, there were serious doubts about this. The monitoring of non-US citizens’ data is essentially based on a presidential directive and an executive order, i.e. government orders rather than formal laws. Moreover, as many critics conclude, EU citizens are none the wiser, since they do not know whether they are actually under surveillance or not. There also remains the issue of the independence of the ombudsperson which the US committed itself to establishing in the Privacy Shield agreement: he or she may be independent of the intelligence agencies, but most likely not of the government.

Henrik Saugmandsgaard Øe, the Advocate General responsible for the case, intends to present his opinion, which is not binding on the judges, on December 12th. The court’s decision is then expected in early 2020. According to Thomas von Danwitz, CJEU judge and judge-rapporteur in the case, digital services and networking would in any event be considerably compromised if the CJEU were to declare the current content of the SCCs ineffective.


EU-US Privacy Shield and SCCs facing legal challenge before the EU courts

3. July 2019

The Privacy Shield, established between the European Union (EU) and the United States (US) as a replacement for the invalidated Safe Harbor agreement, has been under scrutiny from the moment it entered into effect. Building on Max Schrems’ original claims regarding Safe Harbor (C-362/14), the EU-US data transfer agreement has been challenged in two cases, one of which will be heard by the Court of Justice of the European Union (CJEU) in early July.

In this case, as in 2015, Mr. Schrems bases his claims essentially on the same principles. The point of contention is the unrestricted access of US agencies to Europeans’ personal data. Following hearings in 2017, the Irish High Court referred 11 questions regarding the adequacy of the level of protection to the CJEU. The hearing before the CJEU is scheduled for July 9th. The second case, originally planned to be heard on July 1st and 2nd, was brought before the General Court of the European Union by the French digital rights group La Quadrature du Net in conjunction with French Data Network and Fédération FDN. Their concerns revolve around the inadequacy of the level of protection provided by the Privacy Shield and its mechanisms. This hearing, however, was cancelled by the General Court of the EU only days before its scheduled date, as announced by La Quadrature du Net in a tweet.

Despite the criticism of the agreement, the European Commission noted improvements in the level of protection offered by the Privacy Shield in its second review of the agreement in December 2018. On June 20th, 2019, the US Senate confirmed Keith Krach as Under Secretary for Economic Growth, Energy, and the Environment, whose duties include serving as the permanent ombudsperson for Privacy Shield and EU data protection matters.

As things stand, both cases are likely to worry companies that rely on Privacy Shield certification or the use of SCCs. Given the uncertainty these questions create, DPOs will be looking for new ways to secure data flows between Europe and the US. The European Commission has stated that it wants to make it easier for companies to comply with the GDPR’s data transfer requirements in the future. It plans to update the SCCs to meet the requirements of the GDPR, providing a contractual mechanism for international transfers. Nonetheless, it is unclear when those updates will arrive, and they may be subject to legal challenge depending on the future Schrems ruling.

CNIL publishes action plan on targeted online advertising

On 29th June, the French data protection authority CNIL published its 2019-2020 action plan, which aims to set rules for targeted online advertising and guide companies in their compliance efforts.

The action plan consists of two main steps. First, new cookie guidelines will be published in July 2019. The previous cookie guidance dates back to 2013; CNIL has stated that it is no longer valid and will be repealed due to the GDPR’s stricter consent requirements. Companies will be given a transitional period of 12 months to comply with the new cookie guidelines. During this period, it will still be possible to treat further browsing of a website as consent to the use of cookies. However, CNIL requires that even during this transition period cookies are only set after consent has been obtained.

As a second major step, working groups composed of CNIL officials and stakeholders from the adtech ecosystem will be formed to develop practical approaches to obtaining consent. The draft recommendations developed on the basis of these discussions will be published by CNIL at the end of 2019, or at the latest at the beginning of 2020, for public consultation. CNIL will then implement the final version of the recommendations after a further period of six months.

The action plan was prompted by the numerous complaints about online marketing practices that CNIL received from individuals, non-profit organisations and associations. In 2018, 21% of the complaints received by CNIL related to these issues. At the same time, CNIL received numerous questions from industry professionals trying to better understand their GDPR obligations.

German Data Protection Authority of Baden-Württemberg fines an employee of a public body

24. June 2019

According to an announcement by the LfDI Baden-Württemberg, one of the 16 German state data protection authorities (DPAs), a fine of 1,400 euros has been imposed on an employee of a public body. The police officer had unlawfully retrieved personal data in the context of his job solely for private purposes. According to the DPA’s statement, this was the first fine imposed on an employee of a public body since the EU General Data Protection Regulation (GDPR) became applicable.

The police officer used his official user ID to request the owner data relating to the license plate of a casual private acquaintance, without any reference to his official duties. Using the personal data obtained in this way, he then carried out a second enquiry with the Federal Network Agency, through which he obtained not only further personal data of the data subject but also the data subject’s fixed-line and mobile numbers. Without the data subject’s request or consent, the police officer finally used the mobile number obtained in this way to contact the injured party by phone.

By filing the aforementioned requests for private purposes and using the mobile number obtained in this way to make private contact, the police officer autonomously processed personal data for unlawful purposes. His department therefore cannot be held responsible, as the action was not taken in the course of his official duties. As a consequence, the police officer is responsible for this breach of the GDPR as an individual: neither is the public body responsible (it cannot be subject to sanctions under the State Data Protection Act), nor is the police officer himself classified as a public body within the meaning of this law.

Taking into account that this was a first violation of this kind, a fine of 1,400 euros pursuant to Article 83(5) GDPR was considered appropriate. The case shows, however, that even an employee of a public body may be subject to a fine if he or she unlawfully processes personal data solely for private purposes.

Category: GDPR · General

FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

The Federal Trade Commission (FTC) announced that it had taken action against several companies that falsely claimed to be compliant with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, has falsely claimed on its website to have participated in the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. have proposed a settlement agreement. The proposal prohibits SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government, self-regulatory, or standardization organization. The proposed agreement will be published in the Federal Register and subject to public comment for 30 days, after which the FTC will determine whether to make the proposed consent order final.

The FTC has also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and the U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove from their websites, privacy policies or other public documents any statements claiming participation in a safe harbor agreement, and warned that it would take appropriate legal action if the companies fail to act within 30 days.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies that they were participants in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. To become a certified participant, a company must have a designated third party, known as an APEC-approved Accountability Agent, verify and confirm that it meets the requirements of the CBPR program.
