
EPRS publishes report on post-Brexit EU-UK Data Transfer Mechanisms

20. April 2021

On April 9th, 2021, the European Parliamentary Research Service (EPRS) published a report on data transfers in the private sector between the EU and the U.K. following Brexit.

The report reviews and assesses the trade arrangements, adequacy challenges and transfer instruments under the General Data Protection Regulation (GDPR). It is intended to inform regulatory and business decisions; in its press release, the European Parliament stated that “a clear understanding of the state of play and future prospects for EU-UK transfers of personal data is indispensable”.

The report provides an in-depth analysis of an adequacy decision for the UK as a viable long-term solution for data flows between the UK and the EU, and also considers possible transfer mechanisms in the potential absence of an adequacy decision, such as Standard Contractual Clauses, Binding Corporate Rules, codes of conduct, and certification mechanisms.

In this analysis, the EPRS also sheds light on adequacy concerns such as UK surveillance laws and practices, shortcomings in the implementation of the GDPR, weak enforcement of data protection laws, and a wavering commitment to EU data protection standards.

As part of its conclusion, the EPRS stated that the European Data Protection Board’s (‘EDPB’) opinion on the draft decision, which has just been published (please see our blog post here), will likely scrutinise the Commission’s approach and provide recommendations on next steps.

EDPB adopts opinions on draft UK adequacy decisions

16. April 2021

In accordance with its obligation under Article 70 (1) (s) GDPR, the European Data Protection Board (“EDPB”) adopted its opinions on the EU Commission’s (“EC”) draft UK adequacy decisions on April 13th, 2021 (please see our blog post). “Opinion 14/2021” is based on the GDPR and assesses both the general data protection aspects of the draft adequacy decision and public authority access to personal data transferred from the EEA for law enforcement and national security purposes, a topic the EC also discussed in detail. At the same time, the EDPB also issued “Opinion 15/2021” on the transfer of personal data under the Law Enforcement Directive (LED).

The EDPB notes that there is strong alignment between the EU and the UK data protection regimes, especially as regards the principles relating to the processing of personal data. It expressly welcomes the fact that the adequacy decision is to apply for a limited period only, as the EDPB also sees a danger that the UK could change its data protection laws. EDPB Chair Andrea Jelinek is quoted as saying:

“The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission’s decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

But the EDPB also highlights areas of concern that need to be further monitored by the EC.

1. The immigration exemption, which restricts the rights of those data subjects affected.

2. How the transfer of personal data from the EEA to the UK could undermine EU data protection rules, for example on the basis of future UK adequacy decisions.

3. Access to personal data by public authorities, which is given considerable attention in the opinion. For example, the Opinion analyses in detail the Investigatory Powers Act 2016 and related case law. The EDPB welcomes the numerous oversight and redress mechanisms in the UK but identifies a number of issues that need “further clarification and/or oversight”, namely bulk searches, independent assessment and oversight of the use of automated processing tools, and the safeguards provided under UK law for disclosure abroad, particularly with regard to the application of national security exemptions.

In summary, this EDPB opinion does not put any obstacles in the way of an adequacy decision and recognises that there are many areas where the UK and EU regimes converge. Nevertheless, it highlights very clearly that there are deficiencies, particularly in the UK’s national security oversight regime, which need to be addressed and kept under review.

As for the next steps, the draft UK adequacy decisions will now be assessed by representatives of the EU Member States under the “comitology procedure”. The Commission can then adopt the final adequacy decisions. The bridging period during which data may flow freely to the UK even without an adequacy decision ends in June 2021 (please see our blog post).

Thailand: Another delay of the Personal Data Protection Act

9. April 2021

On May 28th, 2019, the Personal Data Protection Act (“PDPA”) became law in Thailand. It is the country’s very first legislation governing data protection. Originally, a one-year grace period was granted for implementing the requirements so that companies could prepare for the prospective liabilities and become compliant with the PDPA. However, on May 21st, 2020, a Royal Decree postponed the entry into force of the PDPA’s key provisions for another year, until June 1st, 2021 (we reported). Currently, a further postponement of the PDPA’s enforcement date is being considered.

According to the new Digital Economy and Society (“DES”) Minister, consideration may be given to deferring or amending the PDPA if the public has negative views about it. The aim is to support small and medium-sized businesses affected by the legislation, since most of them are still unprepared for the new obligations and have not yet adjusted their internal processes. In addition, as the deputy permanent secretary at the DES Ministry stated, there is an unfortunate lack of willingness among the companies concerned. These shortcomings are reflected in the fact that some industry associations, including those of the travel and automotive industries, have already requested a deferral of the PDPA’s enforcement.

Contrary to what was initially planned, the appointment of members to the Personal Data Protection Committee is also expected to be delayed further. The Committee plays a decisive role in the approval of subsidiary legislation; the drafts in question concern consent procedures, the handling of complaints, and expert panels.

As things currently stand, the PDPA needs further adjustment and the necessary regulations still need to be drafted, as many issues have been raised for consultation with regard to the PDPA since its adoption. The main priorities on which the government intends to focus are as follows:

  • Supporting people’s access to innovation and technology
  • Creating an ecosystem conducive to a digital economy
  • Gearing up for digital infrastructure development, particularly 5G and smart city projects
  • Legal development and enforcement to create a trusted digital ecosystem, especially for the PDPA and issues related to electronic transactions and cybersecurity
  • Protecting the public from abuse on social media and the internet.

The DES Ministry expects that full enforcement of the PDPA will likely be delayed until the end of this year.

Facebook data leak affects more than 500 million users

7. April 2021

Confidential data of 533 million Facebook users has surfaced in a forum for cybercriminals. A Facebook spokesperson told Business Insider that the data came from a leak in 2019.

The leaked data includes Facebook usernames as well as full names, dates of birth, phone numbers, locations and biographical information, and in some cases the email addresses of the affected users. Business Insider has verified the leaked data through random sampling. Even though some of the data may be outdated, the leak poses risks if, for example, email addresses or phone numbers are used for hacking. The leak was made public by the IT security firm Hudson Rock, whose employees noticed that the data sets were being offered for money by a bot in a hacking forum. The data set was later offered publicly for free and thus made accessible to everyone.

The US magazine Wired points out that Facebook is doing more to confuse than to clarify. First, Facebook referred to an earlier security vulnerability from 2019, which we already reported on and which was patched in August 2019. Later, a blog post by a Facebook product manager confirmed that it was a major security breach; however, the data had not been accessed through hacking, but rather through the exploitation of a legitimate Facebook feature. In addition, the affected data was so old that the GDPR and U.S. privacy laws did not apply, he said. In the summer of 2019, Facebook reached an agreement with the U.S. Federal Trade Commission (FTC) to pay a $5 billion fine covering all data breaches before June 12, 2019. According to Wired, the current database is not congruent with the one at issue at the time, as the most recent Facebook ID in it dates from late May 2019.

Users can check whether they are affected by the data leak via the website Have I Been Pwned.
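
Have I Been Pwned also exposes a documented REST API (v3) for such lookups. Below is a minimal, hypothetical Python sketch of a query against the v3 “breachedaccount” endpoint; the API key, user-agent string and example address are placeholders, and a real key must be obtained from the service:

    import json
    import urllib.error
    import urllib.parse
    import urllib.request

    API_KEY = "your-hibp-api-key"  # placeholder: a real HIBP API key is required

    def breaches_for(account: str) -> list[str]:
        """Return the names of known breaches containing the given account,
        or an empty list if the account was not found in any breach."""
        url = ("https://haveibeenpwned.com/api/v3/breachedaccount/"
               + urllib.parse.quote(account))
        request = urllib.request.Request(url, headers={
            "hibp-api-key": API_KEY,            # API v3 requires a key ...
            "user-agent": "breach-check-demo",  # ... and a user-agent header
        })
        try:
            with urllib.request.urlopen(request) as response:
                return [entry["Name"] for entry in json.load(response)]
        except urllib.error.HTTPError as error:
            if error.code == 404:  # 404 means: not found in any known breach
                return []
            raise

    if __name__ == "__main__":
        for name in breaches_for("someone@example.com"):
            print("Found in breach:", name)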

CNIL plans to start enforcing its Ad Tracker Guideline

As of April 1st, 2021, the French supervisory authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), plans to begin enforcing its rules on ad tracker usage across the internet.

Following the publication of its Ad Tracker Guideline, the CNIL gave companies a time frame to adjust their ad tracker usage and ensure compliance with the Guideline as well as the GDPR. This adjustment period ended on March 31st, 2021.

The new rules on cookies and ad trackers mainly revolve around the user’s ability to give active, free and informed consent. User consent to advertising cookies must be granted by a “clear and positive act”, such as clicking an “I accept” button; consent can no longer be given by simply continuing to use the website.

In addition, cookie banners must not only offer the option to accept, but also the option to reject. Rejecting cookies has to be as simple and easy as accepting them. Referring users to “Cookie Options” is no longer a valid form of rejection, as it forces the user through an extra step that may dissuade them from rejecting cookies. Rejecting cookies by closing the cookie banner remains a valid option, but it has to be ensured that unless cookies are indeed accepted, none but the essential cookies are activated.
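
To illustrate the last point, the consent-gating logic can be sketched in a few lines of server-side code. This is a simplified, hypothetical example (written with Flask, using made-up cookie names), not code endorsed by the CNIL: non-essential cookies are set only after an explicit “accept”, and rejecting is exactly as simple as accepting.

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        # The user's choice is stored in a first-party "consent" cookie:
        # unset until the banner has been answered, then "accepted" or "rejected".
        choice = request.cookies.get("consent")
        response = make_response("page content here")

        # Strictly necessary cookies (e.g. the session) may always be set.
        response.set_cookie("session_id", "abc123", httponly=True)

        # Advertising cookies are set only after an explicit, positive act of
        # acceptance -- never by default, and never merely because the user
        # kept browsing or closed the banner.
        if choice == "accepted":
            response.set_cookie("ad_tracker", "tracking-id")
        return response

    @app.route("/consent/<choice>", methods=["POST"])
    def record_consent(choice: str):
        # Accepting and rejecting are single, equally simple actions,
        # as the CNIL guideline requires.
        response = make_response("", 204)
        if choice in ("accepted", "rejected"):
            response.set_cookie("consent", choice, max_age=60 * 60 * 24 * 180)
        return response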

Lastly, the cookie banner has to provide brief information on the use of the cookies. The CNIL’s Guideline allows more detailed information to be linked from the banner; however, companies should also include a short description in the banner itself in order to obtain “informed” consent.

At the beginning of March, the CNIL announced that “compliance with the rules applicable to cookies and other trackers” would be one of its three priorities for 2021, along with cybersecurity and the protection of health data. As a first step towards that goal, the CNIL will now begin to conduct checks to ensure websites comply with its ad tracker guidelines.

Companies that have not adjusted their cookie and ad tracker usage can expect fines commensurate with their level of non-compliance.

EU and South Korea complete adequacy talks

6. April 2021

On March 30th, 2021, EU Justice Commissioner Didier Reynders and the Chairperson of the Personal Information Protection Commission of the Republic of Korea, Yoon Jong In, announced the successful conclusion of adequacy talks between the EU and the Republic of Korea (“South Korea”). These adequacy discussions began in 2017, and there was a high level of convergence between the EU and South Korea on data protection issues from the outset, which has since been enhanced by additional safeguards that further strengthen the level of protection in South Korea. Recently, amendments to South Korea’s Personal Information Protection Act (“PIPA”) took effect, strengthening the investigative and enforcement powers of South Korea’s data protection authority, the Personal Information Protection Commission (“PIPC”).

This adequacy decision will be based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, that is, a level of protection for personal data substantially equivalent to the level of protection within the EU. Once that determination has been made, personal data can be transferred from the EU to that country without further requirements. South Korea will be the 13th country to which personal data may be transferred on the basis of an adequacy decision. An adequacy decision covering both commercial providers and the public sector will enable free and secure data flows between the EU and the Republic of Korea and will complement the EU-Republic of Korea Free Trade Agreement.

Before the free flow of data can occur, the EU Commission must initiate the procedure for adopting its adequacy finding. In this procedure, the European Data Protection Board will issue an opinion and a committee composed of representatives of the EU member states must agree. Only then may the EU Commission adopt the adequacy decision.

EDPB releases new guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVAs) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, like Siri on Apple and Google Assistant on Android devices, but can also be found in separate terminal devices, like Alexa on the Amazon Echo. With smart homes trending, VVAs are finding their way into many homes.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and businesses are increasingly using VVAs and questions of data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. It therefore published guidance on Virtual Voice Assistants earlier this month.

In its guidance, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid out by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests, for example, that controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands.

Moreover, the EDPB states that, in its opinion, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidance also gives further advice on complying with general data protection principles and is open for public consultation until April 23rd, 2021.

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data are considered a special category of personal data under Art. 9 GDPR and are therefore subject to special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in software used by test centres, through which more than 136,000 COVID-19 test results of more than 80,000 data subjects had apparently been accessible unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” by the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. More than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach, including public facilities in Munich, Berlin and Mannheim as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view other people’s test results unlawfully, one only needed to create an account for a COVID-19 test. The URL for the test result contained the sequential number of the test: by simply counting this number up or down, the “test certificates” of other people became freely accessible. In addition to the test result, a test certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned.
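
Technically, this is a textbook insecure direct object reference (IDOR): sequential, guessable identifiers in a URL without a per-request authorization check. The hypothetical Python sketch below shows the common mitigation of issuing long random tokens instead of sequential numbers, combined with an ownership check; all names and URLs are illustrative and not taken from the SafePlay software:

    import secrets

    # Vulnerable pattern: results addressed by a sequential test number,
    # e.g. .../results/10041, .../results/10042 -- trivially enumerable.

    def new_result_url(base_url: str) -> str:
        """Issue an unguessable URL for a freshly created test result."""
        token = secrets.token_urlsafe(32)  # roughly 256 bits of randomness
        return f"{base_url}/results/{token}"

    def may_view(result_owner_id: str, requesting_user_id: str) -> bool:
        """Random URLs alone are not sufficient: the server must still
        verify on every request that the logged-in account owns the result."""
        return result_owner_id == requesting_user_id

    print(new_result_url("https://testcentre.example"))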

It remains unresolved whether the vulnerabilities were exploited prior to their discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. However, IT experts and privacy-focused NGOs commented that Medicus AI had been irresponsible and grossly negligent with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.

French government seeks to disregard CJEU ruling on the retention of surveillance data

9. March 2021

On March 3rd, POLITICO reported that the French government is seeking to bypass the Court of Justice of the European Union’s (CJEU) ruling limiting member states’ surveillance of phone and internet data, which holds that governments may only retain mass amounts of data when facing a “serious threat to national security”.

According to POLITICO, the French government has asked the country’s highest administrative court, the Council of State, not to follow the CJEU’s ruling in the matter.

In October of last year, the CJEU ruled that several national data retention rules were not compliant with EU law. This ruling covered retention periods set forth by the French government in matters of national security.

The French case in question pits the government against the digital rights NGOs La Quadrature du Net and Privacy International. Following the CJEU’s ruling, it is now in the hands of the French Council of State, which will have to decide on the matter.

A hearing date has not yet been set; however, POLITICO’s sources state that the French government is trying to bypass the CJEU’s ruling by arguing that it goes against the country’s “constitutional identity”. This argument, first used back in 2006, is seldom invoked, but can be relied on in order to avoid applying EU law at the national level.

In addition, the French government accuses the CJEU of having ruled outside its competence, as matters of national security remain a solely national competence.

The French government declined to comment on the ongoing proceedings; however, it has a history of refusing to implement EU court rulings in national law.

AEPD issues its highest fine to date for GDPR violations

5. March 2021

The Spanish Data Protection Authority, the Agencia Española de Protección de Datos (AEPD), imposed a fine of EUR 6,000,000 on CaixaBank, Spain’s leading retail bank, for unlawfully processing customers’ personal data and for not providing sufficient information regarding the processing of their personal data. It is the largest financial penalty ever issued by the AEPD under the GDPR, surpassing the EUR 5,000,000 fine imposed on BBVA in December 2020 for information and consent failures.

In the opinion of the AEPD, CaixaBank violated Art. 6 GDPR in several respects. The bank had not provided sufficient justification of the legal basis for its processing activities, in particular with regard to those based on the company’s legitimate interest. Furthermore, deficiencies had been identified in the processes for obtaining customers’ consent to the processing of their personal data, and the bank had failed to comply with the requirements for obtaining valid consent as a specific, unequivocal and informed expression of intention. Moreover, the AEPD considered the transfer of personal data to companies within the CaixaBank Group an unauthorized disclosure. Pursuant to Art. 83 (5) lit. a GDPR, an administrative fine of EUR 4,000,000 was issued.

Additionally, the AEPD found that CaixaBank violated Arts. 13 and 14 GDPR. The bank had not complied with its information obligations: the information on the categories of personal data concerned had been insufficient, and the information on the purposes of and legal basis for the processing had been missing entirely. What is more, the information provided across different documents and channels had not been consistent. The discrepancies concerned data subjects’ rights, the possibility of lodging a complaint with the AEPD, the existence of a data protection officer and their contact details, as well as data retention periods. The AEPD also disapproved of the use of imprecise terminology in the privacy policy. Pursuant to Art. 83 (5) lit. b GDPR, a fine of EUR 2,000,000 was imposed.

In conclusion, the AEPD ordered CaixaBank to bring its data processing operations into compliance with the aforementioned legal requirements within six months.
