Category: General Data Protection Regulation

CNIL publishes action plan on targeted online advertising

3. July 2019

On 29th June the French data protection authority CNIL published its 2019-2020 action plan, which aims to set rules for targeted online advertising and guide companies in their compliance efforts.

The Action Plan consists of two main steps. First, new cookie guidelines are to be published in July 2019. The previous cookie guidelines date back to 2013; CNIL stated that they are no longer valid and will be repealed because of the GDPR's stricter consent requirements. To comply with the new guidelines, companies will be given a transitional period of 12 months. During this period, it will still be possible to treat further browsing of a website as consent to the use of cookies. However, even during this transition period, CNIL requires that cookies be set only after consent has been obtained.

As a second major step, working groups composed of CNIL officials and stakeholders from the adtech ecosystem will be formed to develop practical approaches to obtaining consent. The draft recommendations developed on the basis of this discussion will be published by CNIL at the end of 2019 or at the latest at the beginning of 2020 in order to make them available for public consultation. CNIL will then implement the final version of the recommendations after a period of six months.

The reason for preparing the Action Plan was that CNIL received numerous complaints about online marketing practices from individuals, non-profit organisations and associations. In 2018, 21% of the complaints received by CNIL related to these issues. At the same time, CNIL received numerous questions from industry professionals trying to better understand their GDPR obligations.

FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

24. June 2019

The Federal Trade Commission (FTC) announced that it had taken action against several companies that falsely claimed to comply with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, falsely claimed on its website that it participated in the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. proposed a settlement agreement. The proposal prohibits SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government, self-regulatory or standardization organization. The proposed agreement will be published in the Federal Register and subject to public comment for 30 days. Afterwards, the FTC will decide whether to make the proposed consent order final.

The FTC has also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove from their websites, privacy policies and other public documents any statements claiming participation in a Safe Harbor framework. If the companies fail to take action within 30 days, the FTC warned, it will take appropriate legal action.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies to participate in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. To become a certified participant, a company must have its compliance with the requirements of the CBPR program verified and confirmed by a designated third party, known as an APEC-approved Accountability Agent.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority Agencia Española de Protección de Datos (AEPD) has imposed a fine of 250.000 EUR on the organiser of the two Spanish professional football leagues for data protection infringements.

The organiser, the Liga Nacional de Fútbol Profesional (LFP), operates an app called “La Liga”, which aims to uncover unlicensed screenings of matches broadcast on pay-TV. For this purpose, the app recorded samples of the ambient sound during match times to detect live game transmissions and combined them with location data. Privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 paragraph 1 GDPR. Users must explicitly approve the use, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD is of the opinion that La Liga has to warn the user anew of each data processing by microphone. In its resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember, each time they use the La Liga application, what they agreed to and what they did not.

Furthermore, the AEPD is of the opinion that La Liga violated Art. 7 paragraph 3 GDPR, according to which the user must be able to revoke his consent to the use of his personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary effort to understand how the technology works. According to La Liga, the technology is designed to produce only a particular acoustic fingerprint. This fingerprint contains only 0.75% of the information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that cannot be reversed to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of 30 June.
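La Liga's irreversibility argument can be illustrated with a minimal sketch. The function name, the sample-reduction scheme and the quantization step below are purely hypothetical, not La Liga's actual technology; the point is only that keeping a tiny fraction of the signal and then applying a one-way hash makes reconstruction of the original audio impossible.

```python
import hashlib

def acoustic_fingerprint(samples, keep_every=133):
    """Illustrative fingerprint: keep only a small fraction of the
    samples (roughly 0.75% when keep_every=133) and discard the rest.
    This is NOT La Liga's real algorithm, just a sketch of the idea."""
    reduced = samples[::keep_every]          # ~0.75% of the data survives
    # Quantize coarsely so small variations map to the same value.
    quantized = bytes((s // 32) % 256 for s in reduced)
    # One-way hash: the alphanumeric code cannot be reversed to audio.
    return hashlib.sha256(quantized).hexdigest()

audio = list(range(0, 100000))               # stand-in for PCM samples
code = acoustic_fingerprint(audio)
print(len(code))                              # 64 hex characters
```

Because SHA-256 is a cryptographic one-way function, even a party holding the full fingerprint code cannot recover the discarded 99.25% of the signal, let alone the original conversation.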

Belgian DPA imposes first fine since GDPR

11. June 2019

On 28 May 2019, the Belgian Data Protection Authority (DPA) imposed its first fine since the General Data Protection Regulation (GDPR) came into force. The Belgian DPA fined a Belgian mayor 2.000 EUR for the misuse of personal data.

The Belgian DPA received a complaint from data subjects alleging that their personal data, collected for local administrative purposes, had been further used by the mayor for election campaign purposes. The parties were then heard by the Litigation Chamber of the Belgian DPA. Finally, the Belgian DPA ruled that the mayor’s use of the complainants’ personal data violated the purpose limitation principle of the GDPR, since the personal data had originally been collected for a different purpose that was incompatible with the purpose for which the mayor used it.

In deciding on the amount of the fine, the Belgian DPA took into account the limited number of data subjects as well as the nature, gravity and duration of the infringement, resulting in a moderate sum of 2.000 EUR. Nevertheless, the decision conveys the message that compliance with the GDPR is the responsibility of every data controller, including public officials.

Royal family uses GDPR to protect their privacy

22. May 2019

Last week Prince Harry and Meghan Markle claimed another victory in the royal family’s never-ending struggle with paparazzi photographers, securing “a substantial sum” in damages from an agency that released to the media intimate photos of the Oxfordshire home the Duke and Duchess of Sussex rented. In a statement, Splash News apologized and acknowledged that the situation represented “an error of judgement”.

The paparazzi agency “Splash News” took photos and footage of the couple’s former Cotswolds home — including their living room, dining area, and bedroom — using a helicopter and promptly sold them to various news outlets. Prince Harry’s lawyers argued that this constituted a breach of his right to privacy under Art. 7 and 8 ECHR as well as a breach of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).

Considering the strategy of the Duke’s lawyers, it looks like the royal family has found a potentially attractive alternative to claims of defamation or invasion of privacy: in contrast to such claims, a claimant relying on data protection law needs to prove neither that a statement is defamatory and meets the threshold of serious harm to reputation, nor that the information is private.

However, the (new) European data protection legislation grants all data subjects, regardless of their position and/or fame, a right to respect for their private and family lives and to protection of their personal data. In particular, Article 5 GDPR requires organisations to handle personal data (such as names, pictures and stories relating to a person) fairly and in a transparent manner, and to use it only for a legitimate purpose.

Moreover, an organization using pictures and footage of an individual’s private or even intimate sphere needs a specific legal basis, such as a contract, the individual’s consent, or an argument that using these photos and footage was “in the public interest” or served a “legitimate interest”. As a contract and consent can be ruled out here, the only bases that might be considered are a public interest or a legitimate interest of the organization itself. Taking into account the means by which these photos and footage of the Duke and Duchess were created, neither of these interests can outweigh the interest in protecting the rights and freedoms of individuals’ private and intimate sphere.

In light of this case, it seems likely that the European data protection regime has changed the way celebrities and the courts handle the heavily contested question of which parts and aspects of famous people’s lives the public is allowed to see and be informed about.

EDPB: One year – 90.000 Data Breach Notifications

20. May 2019

On the occasion of the GDPR’s first anniversary, the EDPB published a new report looking back on the first year of the GDPR.

Among other findings, the EDPB states that the national supervisory authorities received 281.088 cases in total: 89.271 data breach notifications, 144.376 GDPR-related complaints from data subjects and 47.441 other cases. Three months ago the total was 206.326: 64.484 data breach notifications, 94.622 GDPR-related complaints and 47.020 other cases. These figures show that the number of cases has continued to increase over the last three months.
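The three sub-figures in the current report add up exactly to the overall total, which can be cross-checked with simple arithmetic:

```python
# Figures from the EDPB one-year report cited above.
breach_notifications = 89_271
gdpr_complaints = 144_376
other_cases = 47_441

total = breach_notifications + gdpr_complaints + other_cases
print(total)  # 281088, matching the reported overall total
```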

At the time of the EDPB report, 37% of the cases were still ongoing and in 0,1% the fined companies had appealed against the decision of the supervisory authority; the remaining 62,9% were already closed. In contrast to the nine-month report, almost two thirds of the cases have thus been processed in the meantime; three months ago only 52% were closed.

According to the EDPB report from three months ago, 11 authorities had imposed fines totalling € 55.955.871 for the violations detected. When looking at this high sum, however, it must be noted that € 50 million was imposed on Google alone. The current EDPB report does not include a passage on fines.

All in all, the increase in queries and complaints compared to previous years confirms the increased awareness of data protection. According to the Eurobarometer, 67% of EU citizens have heard of the GDPR, 36% indicated that they are aware of what the GDPR entails, and 57% know about the existence of a public data protection authority.

Dutch DPA publishes recommendations for privacy policies

26. April 2019

Recently, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) published six recommendations for companies outlining their privacy policies for the purposes of Art. 24 para 2 of the General Data Protection Regulation (the “GDPR”).

The authority’s recommendations are the result of its investigation of companies’ privacy policies, which focused on companies that mainly process special categories of personal data, e.g. health data or data relating to individuals’ political beliefs.

The Dutch DPA reviewed the privacy policies of several organisations, such as blood banks and local political parties, focusing on three main points: 1) the description of the categories of personal data, 2) the description of the purposes of the processing, and 3) the information about data subjects’ rights. It discovered that the descriptions of the data categories and purposes were incomplete or too superficial and therefore released six recommendations that companies should take into consideration when outlining their privacy policies.

These are the six recommendations:

  • Companies should evaluate whether they have to implement privacy policies (taking into account the nature, scope, context and purposes of the processing, as well as the risks for the rights and freedoms of natural persons)
  • Companies should consult internal and/or external expertise such as data protection officers when implementing privacy policies
  • The policy should be outlined in a single document to avoid fragmentation of information
  • The policy should be concrete and specific, and therefore not merely repeat the provisions of the GDPR
  • The DPA recommends publishing the privacy policy so that data subjects are aware of how the company handles personal data
  • The DPA also suggests drafting a privacy policy even if it is not mandatory, to demonstrate that the company is willing to protect personal data

CNIL publishes model regulation on access control through biometric authentication at the workplace

9. April 2019

The French data protection authority CNIL has published a model regulation setting out the conditions under which devices for access control through biometric authentication may be introduced at the workplace.

Pursuant to Article 4 paragraph 14 of the General Data Protection Regulation (GDPR), biometric data are personal data relating to the physical, physiological or behavioural characteristics of a natural person, obtained by means of specific technical processes, which enable or confirm the unambiguous identification of that natural person. According to Article 9 paragraph 4 GDPR, the member states of the European Union may introduce or maintain additional conditions, including restrictions, as far as the processing of biometric data is concerned.

The basic requirement under the model regulation is that the controller proves that biometric data processing is necessary. To this end, the controller must explain why the use of other means of identification or organisational and technical safeguards is not appropriate to achieve the required level of security.

Moreover, the choice of biometric types must be specifically explained and documented by the employer. This also includes the justification for the choice of one biometric feature over another. Processing must be carried out for the purpose of controlling access to premises classified by the company as restricted or of controlling access to computer devices and applications.

Furthermore, the model regulation of the CNIL describes which types of personal data may be collected, which storage periods and conditions apply, and which specific technical and organisational measures must be taken to guarantee the security of personal data. In addition, CNIL states that before implementing the data processing, the controller must always carry out an impact assessment of the risks to the rights and freedoms of individuals. This assessment must be repeated every three years for updating purposes.

The data protection authority also points out that the model regulation does not exempt controllers from compliance with the GDPR, since it is not intended to replace the GDPR’s provisions but to supplement and specify them.

German Court’s Decision on the Right of Access

Just recently, a German Labour Court (LAG Baden-Württemberg) ruled on the scope of Article 15 of the European General Data Protection Regulation (GDPR) with regard to the information that must be handed over to the data subject when such a claim is made.

The decision literally reflects the wording of Art. 15 (1) GDPR which, amongst other things, requires information on

  • the purposes of data processing,
  • the categories of personal data concerned,
  • the recipients or categories of recipient to whom the personal data have been or will be disclosed,
  • where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period,
  • where the personal data are not collected from the data subject, any available information as to their source.

In contrast to the previous view of the local data protection authorities, which – in the context of information about recipients of personal data – deem it sufficient for the data controller to disclose recipient categories, the LAG Baden-Württemberg also obliged the data controller to provide the data subject with information about each individual recipient.

In addition, the LAG Baden-Württemberg ordered the data controller to make available to the data subject a copy of all his personal performance data. However, the court did not comment on the extent of the copies to be made. It is therefore questionable whether, in addition to information from the systems used in the company, copies of all e-mails containing personal data of the data subject must also be made available.

Since the court has admitted the appeal to the Federal Labour Court (BAG) regarding this issue, it remains to be seen whether such an approach will still be valid after a Federal Labour Court decision.

Dutch DPA publishes update on policy on administrative fines

The Dutch Data Protection Authority, Autoriteit Persoonsgegevens (Dutch DPA), announced an update on its policy regarding administrative fines.

In addition to the Dutch GDPR implementation law, the published policy provides insights into how the Dutch DPA will use its fining powers. According to the policy, the DPA distinguishes several categories of infringements; each category carries a basic fine and a specific penalty bandwidth.

The DPA calculates the fine in two steps: first the basic fine is applied; second, the basic fine is increased or decreased within the bandwidth of the applicable category. Various aspects are included in the calculation of the fine, such as:

  • the nature, the seriousness and duration of the violation,
  • the number of data subjects affected,
  • the extent of the damage and of the data compromised,
  • the intentional or negligent nature of the violation,
  • the measures adopted to mitigate the damages,
  • the measures that were implemented to ensure compliance with the GDPR, including information security measures,
  • prior violations,
  • the level of cooperation with the DPA,
  • the types of data involved,
  • how the DPA became aware of the violation, including whether (and if so, to what extent) the data controller or processor reported the violation,
  • adherence to approved codes of conduct and certification mechanisms,
  • any other applicable aggravating or mitigating factors.

The maximum amount under the policy is generally €1.000.000,00, but the fine can be higher if the Dutch DPA decides that the calculated maximum is inappropriate in the particular case.
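The two-step method described above can be sketched as follows. The category boundaries and amounts below are illustrative placeholders chosen for this sketch, not figures quoted from the Dutch policy itself:

```python
# Illustrative sketch of a two-step fine calculation with category
# bandwidths. All amounts are hypothetical, not the policy's real values.
BANDWIDTHS = {
    # category: (minimum, basic fine, maximum) in EUR
    1: (0, 100_000, 200_000),
    2: (120_000, 310_000, 500_000),
    3: (300_000, 525_000, 750_000),
    4: (450_000, 725_000, 1_000_000),
}

def calculate_fine(category, adjustment=0):
    """Step 1: start from the basic fine of the infringement category.
    Step 2: increase or decrease it for aggravating or mitigating
    factors, while staying within the category's bandwidth."""
    low, basic, high = BANDWIDTHS[category]
    fine = basic + adjustment
    return max(low, min(high, fine))  # clamp to the bandwidth

print(calculate_fine(2))                      # basic fine: 310000
print(calculate_fine(2, adjustment=250_000))  # capped at 500000
```

The clamp in the last line mirrors the policy's structure: aggravating or mitigating factors move the fine up or down, but only within the penalty bandwidth of the assigned category.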
