
EU-US Privacy Shield and SCCs facing legal challenges before the EU courts

3. July 2019

Privacy Shield, established between the European Union (EU) and the United States of America (US) as a replacement for the invalidated Safe Harbor agreement, has been under scrutiny from the moment it entered into effect. Building on Max Schrems’ original claims against Safe Harbor (C-362/14), the EU-US data transfer agreement has been challenged in two cases, one of which will be heard by the Court of Justice of the European Union (CJEU) in early July.

In this case, as in 2015, Mr. Schrems bases his claims essentially on the same principles. The contention is the unrestricted access of US agencies to Europeans’ personal data. Following hearings in 2017, the Irish High Court referred 11 questions on the adequacy of the level of protection to the CJEU. The hearing before the CJEU is scheduled for July 9th. The second case, originally scheduled to be heard on July 1st and 2nd, was brought before the General Court of the European Union by the French digital rights group La Quadrature du Net together with French Data Network and the Fédération FDN. Their concerns revolve around the inadequacy of the level of protection afforded by the Privacy Shield and its mechanisms.
This hearing, however, was cancelled by the General Court of the EU only days before its scheduled date, as La Quadrature du Net announced via a tweet.

Despite the criticism of the agreement, the European Commission noted improvements in the level of protection provided by the Privacy Shield in its second review of the agreement, dating from December 2018. On June 20th 2019, the US Senate confirmed Keith Krach as Under Secretary for Economic Growth, Energy and the Environment, whose duties include serving as the permanent ombudsperson for Privacy Shield and EU data protection matters.

As it stands, both cases are apt to worry companies that rely on their Privacy Shield certification or on the use of SCCs. Given the uncertainty these questions bring, DPOs will be looking for new ways to safeguard data flows between Europe and the US. The European Commission has stated that it wants to make it easier for companies to comply with the GDPR’s requirements for data transfers in the future. It plans to update the SCCs in line with the GDPR, providing a contractual mechanism for international transfers. Nonetheless, it is unclear when those updates will arrive, and they may themselves face legal challenge depending on the forthcoming Schrems ruling.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority, the Agencia Española de Protección de Datos (AEPD), has imposed a fine of EUR 250,000 on the organiser of the two Spanish professional football leagues for data protection infringements.

The organiser, the Liga Nacional de Fútbol Profesional (LFP), operates an app called “La Liga”, which aims to uncover unlicensed screenings of games broadcast on pay-TV. For this purpose, the app recorded samples of the ambient sound during match times to detect any live game transmissions and combined them with location data. Privacy-ticker already reported.

The AEPD criticised that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 (1) GDPR. Users must approve the use explicitly, and the authorisation for microphone access can also be revoked in the Android settings. However, the AEPD takes the view that La Liga has to warn the user anew of each data processing via the microphone. In its resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember, each time they use the La Liga application, what they did and did not agree to.

Furthermore, the AEPD takes the view that La Liga has violated Art. 7 (3) GDPR, under which the user must be able to withdraw his or her consent to the use of his or her personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary effort to understand how the technology works. It explains that the technology used is designed to produce only a particular acoustic fingerprint. This fingerprint contains only 0.75% of the information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that cannot be reversed to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of June 30.
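La Liga’s actual fingerprinting pipeline is not public, but the irreversibility claim rests on a standard idea: reducing a compact fingerprint to a one-way cryptographic digest. A minimal sketch, assuming a hypothetical `fingerprint_hash` helper and SHA-256 as the (illustrative) hash function:

```python
import hashlib

def fingerprint_hash(fingerprint_bytes: bytes) -> str:
    """Reduce a compact acoustic fingerprint to an irreversible digest.

    A cryptographic hash such as SHA-256 is a one-way function: neither
    the original audio nor the fingerprint itself can be reconstructed
    from the resulting alphanumeric code.
    """
    return hashlib.sha256(fingerprint_bytes).hexdigest()

# Identical fingerprints always map to the same code, so a server can
# match them against known broadcasts without ever receiving raw audio.
sample = bytes([12, 200, 7, 99, 3, 42])
print(fingerprint_hash(sample) == fingerprint_hash(bytes([12, 200, 7, 99, 3, 42])))  # True
```

The design choice matters for the legal argument: only the fixed-length code leaves the device, so the operator never holds data from which conversations could be recovered.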

Advocate General: No Valid Cookie Consent When Checkbox Is Pre-ticked

25. March 2019

On 21 March 2019, Maciej Szpunar, Advocate General of the European Court of Justice, delivered his Opinion in the case of Planet49 GmbH against the Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband e.V. (Federal Association of Consumer Organisations). In the Opinion, Szpunar explains how to obtain valid consent for the use of cookies.

In the case in question, Planet49 GmbH organised a promotional lottery on the internet. When registering to participate in the lottery, users were presented with two checkboxes. The first checkbox, which was not pre-ticked, concerned permission for sponsors and cooperation partners to contact the participant in order to inform him of their offers. The second checkbox, which was already pre-ticked, concerned consent to the setting of cookies to evaluate the user’s surfing and usage behaviour.

The Federal Association held that the clauses used infringed German law, in particular Article 307 of the BGB, Article 7(2), point 2, of the UWG and Article 12 et seq. of the TMG, and filed a lawsuit in 2014 after an unsuccessful warning.

Proceeding through the instances, the case reached the German Federal Court of Justice in 2017. The Federal Court of Justice considers that the outcome of the case depends on the interpretation of Articles 5(3) and 2(f) of Directive 2002/58, read in conjunction with Article 2(h) of Directive 95/46, and of Article 6(1)(a) of Regulation 2016/679. For that reason, it referred the following questions to the European Court of Justice for a preliminary ruling:

(1) Does consent given on the basis of a pre-ticked box meet the requirements for valid consent under the ePrivacy Directive, the EU Data Protection Directive and the EU General Data Protection Regulation (the GDPR)?

(2) What information does the service provider have to provide to the user and does this include the duration of the use of cookies and whether third parties have access to the cookies?

According to the Advocate General, there is no valid consent if the checkbox is pre-ticked. In that case, the user must remove the tick, i.e. become active, if he or she does not agree to the use of cookies. This contradicts the requirement of an active act of consent by the user: the user must explicitly consent to the use of cookies. For the same reason, it is not sufficient if a single checkbox covers both the use of cookies and participation in the lottery. Consent must be given separately; otherwise the user is not in a position to give separate consent freely.

In addition, Szpunar explains that the user must be provided with clear and comprehensive information that enables him or her to easily assess the consequences of consenting. This requires that the information provided is unambiguous and leaves no room for interpretation. For this purpose, the information must include details such as the duration of the cookies’ operation and whether third parties have access to the cookies.
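The Opinion’s core test, that consent must be an affirmative act rather than a failure to untick a box, can be sketched as a hypothetical server-side validation. The function name and parameters below are illustrative, not drawn from any real consent library:

```python
def cookie_consent_valid(default_checked: bool, submitted_checked: bool) -> bool:
    """Illustrative check of the Opinion's 'active act' requirement.

    Consent is valid only if the checkbox started unticked (the site did
    not pre-select it) and the user actively ticked it.
    """
    if default_checked:
        # A pre-ticked box means the user never performed an active act
        # of consent, so the submission cannot count as valid consent.
        return False
    return submitted_checked

print(cookie_consent_valid(default_checked=True, submitted_checked=True))   # False
print(cookie_consent_valid(default_checked=False, submitted_checked=True))  # True
```

Per the Opinion, this check would also have to apply to a checkbox dedicated solely to cookies, separate from any lottery-participation declaration.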

European Commission adopts adequacy decision on Japan

28. January 2019

The European Commission adopted an adequacy decision for Japan on the 23rd of January 2019, enabling data flows to take place freely and safely. The exchange of personal data is based on strong safeguards that Japan has put in place in advance of the adequacy decision to ensure that the transfer of data complies with EU standards.

The additional safeguards include:

– A set of rules (the Supplementary Rules), which bridge the differences between the two data protection systems. These strengthen the protection of sensitive data, the exercise of individual rights and the conditions under which EU data can be transferred onward to another third country. The additional rules are binding in particular on Japanese companies importing data from the EU and can be enforced by the independent Japanese data protection authority (PPC) as well as by the courts.

– Safeguards concerning access by Japanese authorities for law enforcement and national security purposes. In this regard, the Japanese Government has given the Commission assurances that the use of personal data is limited to what is necessary and proportionate and is subject to independent supervision and redress.

– A complaint handling mechanism to investigate and resolve complaints from Europeans regarding Japanese authorities’ access to their data. This new mechanism will be managed and monitored by Japan’s independent data protection authority.

The adequacy decision has been in force since 23 January 2019. The functioning of the framework will be reviewed for the first time after two years; subsequent reviews will take place at least every four years.

The adequacy decision also complements the EU-Japan Economic Partnership Agreement, which will enter into force in February 2019. European companies will benefit from free data flows as well as privileged access to the 127 million Japanese consumers.

 

EU Commission: Draft for adoption of adequacy decision for Japan

6. September 2018

The EU Commission has drafted the adequacy decision for Japan, including the next steps Japan has to take in order to ensure protection for the transfer of personal data from the EU to Japan. These include additional safeguards Japan should apply, as well as commitments regarding access to personal data by Japanese public authorities.

Japan has committed to implement several safeguards that are necessary for the protection of the transfer of personal data before the actual adoption of the adequacy decision. These include:

  • a set of rules providing additional safeguards for transferred personal data of EU individuals (addressing inter alia the topics protection of sensitive data and the further transfer of personal data from Japan to another third country),
  • safeguards concerning the access to personal data by Japanese public authorities for criminal law enforcement and national security purposes,
  • a complaint-handling mechanism for Europeans regarding the access of Japanese authorities to their personal data.

The Commissioner for Justice, Consumers and Gender Equality, Věra Jourová, said: “We are creating the world’s largest area of safe data flows. Personal data will be able to travel safely between the EU and Japan to the benefit of both our citizens and our economies. Our partnership will promote global standards for data protection and set an example for future partnerships in this key area.”

The next step in the adoption procedure of the adequacy decision is consultation of the European Data Protection Board (EDPB), which will be asked for its opinion.


European Commission: €110 million fine for Facebook

23. May 2017

According to a European Commission press release of 18 May 2017, Facebook was fined €110 million by the Commission for providing misleading information about its takeover of WhatsApp.

Facebook acquired WhatsApp in 2014. Back then, Facebook informed the European Commission that it would not be able to establish reliable automated matching between Facebook and WhatsApp users. Two years later, in August 2016, Facebook announced an update to its terms of service and privacy policy. The update included the possibility of linking the phone numbers of WhatsApp users with their respective Facebook accounts.

According to the press release, and contrary to the statement given by Facebook during the 2014 merger review, the Commission found that the possibility of automatically linking Facebook and WhatsApp users already existed in 2014.

Commissioner Margrethe Vestager, who is in charge of the competition policy, said: “Today’s decision sends a clear signal to companies that they must comply with all aspects of EU merger rules, including the obligation to provide correct information.”

It is the first time that the European Commission has imposed a fine on a company for the provision of misleading information since the Merger Regulation came into force in 2004.

EU Member States address issues on encryption in criminal investigations

30. November 2016

Recently, Italy, Latvia, Poland, Hungary and Croatia have proposed new legislation that would make it easier for police investigators to access encrypted information held by different entities, in effect helping them to break open encryption technology.

According to the Polish officials, “One of the most crucial aspects will be adopting new legislation that allows acquisition of data stored in EU countries in the cloud”.

European countries were asked by the Slovakian government (which holds the current presidency of the EU Council) to describe how their law enforcement authorities deal with technology that prevents the interception of communications when they are not otherwise authorised to obtain the information.

In response to a freedom of information request by the Dutch internet rights NGO Bits of Freedom, twelve countries, among them Finland, Italy, Sweden and Poland, stated that they frequently encounter encrypted data while carrying out criminal investigations. The UK and Latvia indicated that this happens ‘almost always’.

Ultimately, a dispute over prohibiting or creating backdoors in order to weaken encryption for digital and telecommunication services has arisen between Germany and the European Union.

Even though Germany has dismissed claims that the government is pushing companies to create encryption backdoors in their products, Angela Merkel has announced that investigators will pay more attention to tracing criminals who use the darknet and encryption, especially since the shooting in Munich in July.

So far, however, Europol, ENISA and the Commission’s vice president Andrus Ansip oppose creating backdoors that would weaken encryption.

Ten relevant practical consequences of the upcoming General Data Protection Regulation

22. January 2016

After several rounds of negotiations, the European Parliament, the European Council and the European Commission finally reached a consensus in December 2015 on the final version of the General Data Protection Regulation (GDPR), which is expected to be approved by the European Parliament in April 2016. The consolidated text of the GDPR involves the following practical consequences:

1) Age of data subject’s consent: although specific, freely given, informed and unambiguous consent was already required under the Data Protection Directive (95/46 EC), the GDPR sets the minimum age for giving legal consent to the processing of personal data at 16 years. Nevertheless, each EU Member State can set a different age of consent for the processing of personal data, which must not be below 13 years (Arts. 7 and 8 GDPR).

2) Appointment of a Data Protection Officer (DPO): the appointment of a DPO will be mandatory for public authorities and for data controllers whose main activity involves a regular monitoring of data subjects on a large scale or the processing of sensitive personal data (religion, health matters, origin, race, etc.). The DPO should have expert knowledge in data protection in order to ensure compliance, to be able to give advice and to cooperate with the DPA. In a group of subsidiaries, it will be possible to appoint a single DPO, if he/she is accessible from each establishment (Art. 35 ff. GDPR).

3) Cross-border data transfers: personal data transfers outside the EU may only take place if a Commission decision is in place, if the third country ensures an adequate level of protection and guarantees regarding the protection of personal data (for example by signing Standard Contractual Clauses) or if binding corporate rules have been approved by the respective Data Protection Authority (Art. 41 ff. GDPR).

4) Data security: the data controller should identify any existing risks regarding the processing of personal data and implement adequate technical and organizational security measures accordingly (Art. 23 GDPR). The GDPR imposes strict standards related to data security and the responsibility of both data controller and data processor. Security measures should be implemented according to the state of the art and the costs involved (Art. 30 GDPR). Examples of security measures include pseudonymization and encryption, confidentiality, data access controls, data availability, data integrity, etc.

5) Notification of personal data breaches: data breaches are defined and regulated for the first time in the GDPR (Arts. 31 and 32). If a data breach occurs, data controllers are obliged to notify the breach to the competent Data Protection Authority within 72 hours of becoming aware of it. In some cases, an additional notification to the affected data subjects may be mandatory, for example if sensitive data is involved.

6) One-stop-shop: if a company has several establishments across the EU, the competent Data Protection Authority will be the one where the controller’s or processor’s main establishment is located. If an issue affects only a certain establishment, the competent DPA is the one where that establishment is located.

7) Risk-based approach: several compliance obligations are only applicable to data processing activities that involve a risk for data subjects.

8) The role of the Data Protection Authorities (DPA): the role of the DPAs will be strengthened. They will be empowered to impose fines for non-compliance. Also, cooperation between the DPAs of the different Member States will be reinforced.

9) Right to be forgotten: following the ECJ’s judgment of May 2014, the right to be forgotten has been consolidated in Art. 17 GDPR. The data subject has the right to request that the data controller erase his/her personal data if certain requirements are fulfilled.

10) Data Protection Impact Assessment (PIA): this assessment should be conducted by the organization with the support of the DPO. Such an assessment should be part of every organization’s strategy. A PIA should be carried out before starting any data processing operations (Art. 33 GDPR).
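The 72-hour notification window from point 5 is simple deadline arithmetic; a minimal sketch (the function name is illustrative, not from any compliance tool) computes the latest permissible notification time from the moment of awareness:

```python
from datetime import datetime, timedelta, timezone

# The GDPR's breach notification window: 72 hours from awareness.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(awareness_time: datetime) -> datetime:
    """Latest time to notify the competent DPA under the 72-hour rule."""
    return awareness_time + BREACH_NOTIFICATION_WINDOW

# A controller becoming aware of a breach on 22 January at 09:30 UTC
# must notify by 25 January at 09:30 UTC.
aware = datetime(2016, 1, 22, 9, 30, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2016-01-25 09:30:00+00:00
```

Using timezone-aware timestamps avoids off-by-hours errors when the controller and the DPA sit in different time zones.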

 
