Category: General Data Protection Regulation

Facebook data leak affects more than 500 million users

7. April 2021

Confidential data of 533 million Facebook users has surfaced in a forum for cybercriminals. A Facebook spokesperson told Business Insider that the data came from a leak in 2019.

The leaked data includes Facebook usernames and full names, dates of birth, phone numbers, locations and biographical information, and in some cases the email addresses of the affected users. Business Insider has verified the leaked data through random sampling. Even though some of the data may be outdated, the leak poses risks if, for example, email addresses or phone numbers are used for hacking attempts. The leak was made public by the IT security firm Hudson Rock, whose employees noticed that the data sets were being offered for money by a bot in a hacking forum. The data set was later offered publicly for free and thus made accessible to everyone.

The US magazine Wired points out that Facebook is doing more to confuse than to clarify. First, Facebook referred to an earlier security vulnerability from 2019, which we already reported on and which was patched in August 2019. Later, a blog post by a Facebook product manager confirmed that it was a major security breach. However, the data had not been accessed through hacking, but through the exploitation of a legitimate Facebook feature. In addition, the affected data was so old that the GDPR and U.S. privacy laws did not apply, he said. In the summer of 2019, Facebook reached an agreement with the U.S. Federal Trade Commission (FTC) to pay a $5 billion fine covering all data breaches before June 12, 2019. According to Wired, the current database is not identical to the one at issue at the time, as the most recent Facebook ID in it dates from late May 2019.

Users can check whether they are affected by the data leak via the website HaveIBeenPwned.
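For readers who want to script this check, here is a minimal sketch against HaveIBeenPwned's public v3 API. Note that the endpoint requires a personal API key, and the key and email address below are placeholders, not working values.

```python
# Minimal sketch: query the Have I Been Pwned v3 API for breaches
# affecting an email address. Requires an HIBP API key; both the
# key and the address below are placeholders.
import requests

API_KEY = "YOUR_HIBP_API_KEY"  # placeholder; obtainable from haveibeenpwned.com
EMAIL = "user@example.com"     # placeholder address to check

resp = requests.get(
    f"https://haveibeenpwned.com/api/v3/breachedaccount/{EMAIL}",
    headers={"hibp-api-key": API_KEY, "user-agent": "breach-check-script"},
    params={"truncateResponse": "false"},  # return full breach details
    timeout=10,
)

if resp.status_code == 404:
    # HIBP returns 404 when the address appears in no known breach.
    print("No known breach for this address.")
elif resp.ok:
    for breach in resp.json():
        print(f"{breach['Name']}: {breach['BreachDate']}")
else:
    resp.raise_for_status()
```

For the Facebook leak specifically, HaveIBeenPwned also made the data searchable by phone number via its website.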

EU and South Korea complete adequacy talks

6. April 2021

On March 30th, 2021, EU Justice Commissioner Didier Reynders and the Chairperson of the Personal Information Protection Commission of the Republic of Korea, Yoon Jong In, announced the successful conclusion of adequacy talks between the EU and the Republic of Korea (“South Korea”). These adequacy discussions began in 2017. There was a high level of convergence between the EU and the Republic of Korea on data protection issues from the outset, which has since been enhanced by additional safeguards that further strengthen the level of protection in South Korea. Recently, amendments to South Korea’s Personal Information Protection Act (“PIPA”) took effect, and the investigative and enforcement powers of South Korea’s data protection authority, the Personal Information Protection Commission (“PIPC”), were strengthened.

This adequacy decision will be based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, i.e. a level of protection for personal data that is substantially equivalent to the level of protection within the EU. Once a non-EU country has been found to provide an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. South Korea will be the 13th country to which personal data may be transferred on the basis of an adequacy decision. The adequacy decision, covering both commercial providers and the public sector, will enable free and secure data flows between the EU and the Republic of Korea and will complement the EU-Republic of Korea Free Trade Agreement.

Before the free flow of data can occur, the EU Commission must complete the procedure for adopting its adequacy finding. In this procedure, the European Data Protection Board issues an opinion and a committee composed of representatives of the EU member states must agree. The EU Commission may then adopt the adequacy decision.

EDPB releases new guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVAs) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, like Siri on Apple or Google Assistant on Android mobile devices, but can also be found in separate terminal devices, like Alexa on the Amazon Echo. With Smart Homes trending, VVAs are finding their way into many homes.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and corporate businesses increasingly use VVAs and questions on data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. It therefore published guidance on Virtual Voice Assistants earlier this month.

In its guidance, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid out by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests, for example, that controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands, as sketched below.
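To illustrate the latter suggestion, here is a minimal, purely hypothetical sketch of how a VVA backend might route recognized voice commands to data subject request handlers. The intent phrases, function names and responses are our own illustrative assumptions, not part of the EDPB guidelines or of any real VVA platform's API.

```python
# Hypothetical sketch: mapping recognized voice intents to GDPR data
# subject request handlers in a VVA backend. All names are illustrative
# assumptions, not any vendor's actual API.
from typing import Callable

def handle_access_request(user_id: str) -> str:
    # Art. 15 GDPR: trigger an export of the user's stored data.
    return "Your data export has been started. You will be notified when it is ready."

def handle_erasure_request(user_id: str) -> str:
    # Art. 17 GDPR: trigger deletion of the user's stored data.
    return "Your deletion request has been registered."

# Easy-to-follow voice commands mapped to the corresponding handlers.
INTENT_HANDLERS: dict[str, Callable[[str], str]] = {
    "what data do you have about me": handle_access_request,
    "delete my data": handle_erasure_request,
}

def on_voice_intent(utterance: str, user_id: str) -> str:
    """Dispatch a transcribed utterance; the reply is spoken back via TTS."""
    handler = INTENT_HANDLERS.get(utterance.strip().lower())
    if handler is None:
        return "Sorry, I did not understand that request."
    return handler(user_id)
```

A real implementation would of course verify the speaker's identity before acting on such a request; the sketch only shows the routing idea.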

Moreover, the EDPB states that, in its opinion, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidance also gives further advice on complying with general data protection principles and is open for public consultation until 23 April 2021.

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data are considered a special category of personal data under Art. 9 GDPR and are therefore given special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in software used by test centres, through which more than 136,000 COVID-19 test results of more than 80,000 data subjects were apparently left unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” of the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. In total, more than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach, including public facilities in Munich, Berlin and Mannheim, as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view other people’s test results, one only needed to create an account for a COVID-19 test. The URL for the test result contained the sequential number of the test; by simply counting this number up or down, the “test certificates” of other people became freely accessible. In addition to the test result, a test certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned. Technically, this is a textbook insecure direct object reference, as illustrated below.
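In an insecure direct object reference (IDOR), a guessable identifier is the only barrier to the resource. A common mitigation, sketched below under assumed names (none taken from the SafePlay software), is to address such documents by a cryptographically random token and to enforce an authorization check on every request.

```python
# Minimal sketch of the mitigation: reference a test result by an
# unguessable random token instead of a sequential number.
# All names here are hypothetical, not taken from the SafePlay software.
import secrets

def issue_result_token(results: dict, certificate: bytes) -> str:
    """Store a test certificate under a random, non-enumerable token."""
    token = secrets.token_urlsafe(32)  # ~256 bits; cannot be "counted up"
    results[token] = certificate
    return token  # e.g. embedded in a URL such as /results/<token>

def fetch_result(results: dict, token: str, authorized: bool) -> bytes:
    """Even unguessable URLs are no substitute for an authorization check."""
    if not authorized:
        raise PermissionError("login required")
    return results[token]
```

The second function makes the key point: random tokens stop enumeration, but access to health data should additionally be tied to an authenticated, authorized account.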

It remains unclear whether the vulnerabilities were exploited before their discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. However, IT experts and privacy-focused NGOs commented that Medicus AI had acted irresponsibly and grossly negligently with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.

European Commission publishes draft UK adequacy decisions

25. February 2021

On February 19th, 2021, the European Commission (EC) published two draft adequacy decisions for the transfer of personal data to the United Kingdom (UK), one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive. If approved, the decisions would confer adequacy status on the UK and ensure that personal data from the EU can continue to flow freely to the UK. In the EC’s announcement launching the process to adopt the draft adequacy decisions, Didier Reynders, Commissioner for Justice, is quoted:

We have thoroughly checked the privacy system that applies in the UK after it has left the EU. Now European Data Protection Authorities will thoroughly examine the draft texts. EU citizens’ fundamental right to data protection must never be compromised when personal data travel across the Channel. The adequacy decisions, once adopted, would ensure just that.

This adequacy decision is based on Art. 45 GDPR. Article 45(3) GDPR empowers the EU Commission to adopt an implementing act determining that a non-EU country ensures an “adequate level of protection”, i.e. a level of protection for personal data that is substantially equivalent to the level of protection within the EU. Once a non-EU country has been found to provide an “adequate level of protection”, transfers of personal data from the EU to that country can take place without further requirements. In the UK, the processing of personal data is governed by the “UK GDPR” and the Data Protection Act 2018, which are based on the EU GDPR. The UK is, and has committed to remain, a party to the European Convention on Human Rights and to “Convention 108” of the Council of Europe. “Convention 108” is a binding treaty under international law that protects individuals from abuses in the electronic processing of personal data and, in particular, provides for restrictions on cross-border data flows where data is to be transferred to states without comparable protection.

The draft GDPR adequacy decision addresses several areas of concern. One of these is the power of the intelligence services in the UK. In this respect, the draft focuses on the legal bases, restrictions and safeguards for the collection of information for national security purposes. It also details the oversight structure over the intelligence services and the remedies available to those affected. Another aspect discussed is the limitation of data subjects’ rights in the context of UK immigration law. The EC concludes that interference with individuals’ fundamental rights is limited to what is strictly necessary to achieve a legitimate purpose and that there is effective legal protection against such interference. As the UK GDPR is based on the GDPR, UK privacy law should currently provide an adequate level of protection for data subjects; the main risks for EU data subjects therefore lie not in the current state of these laws but in possible future changes to them. For this reason, the EU Commission has built a fixed period of validity into the draft adequacy decision. If adopted, the decision would be valid for four years, and the adequacy finding could be extended for a further four years if the level of protection in the UK remains adequate. This extension would not be automatic, however, but subject to a thorough review. The draft marks the first time the EU has imposed a time limit on an adequacy decision; other adequacy decisions are subject to monitoring and regular review but are not time-limited by default.

The UK government welcomed the EC’s drafts in a statement, while also calling on the EU to “swiftly complete” the process for adopting and formalizing the adequacy decisions, as the “bridging mechanism” will only remain in force until June 30th. Under the EU-UK Trade and Cooperation Agreement, the EU and the UK agreed on a transition period of up to six months from January 1st, 2021, during which the UK is treated as an adequate jurisdiction (please see our blog post). The draft adequacy decisions address the flow of data from the EU to the UK; the flow of data from the UK to the EU is governed by UK legislation that has applied since January 1st, 2021. The UK has decided that the EU ensures an adequate level of protection and that data can therefore flow freely from the UK to the EU.

Next, the non-binding opinion of the European Data Protection Board will be sought (Art. 70 GDPR). After hearing this opinion, the representatives of the member states must confirm the draft in the so-called comitology procedure. This procedure is used when the EC is given the power to adopt implementing acts that lay down conditions for the uniform application of a law; a series of procedural safeguards ensures that EU countries have a say in the implementing act. After the comitology procedure, the EC is free to adopt the drafts.

University fined for failure to notify a data breach

4. February 2021

The President of the Personal Data Protection Office in Poland (UODO) imposed a fine of PLN 25,000 (approx. EUR 5,600) on the Medical University of Silesia. The university had suffered a data breach of which it should have notified the supervisory authority and the data subjects in accordance with Articles 33 and 34 GDPR, but failed to do so.

First indications of the data breach reached UODO in early June 2020. The breach was related to exams held at the end of May 2020 by videoconference on an e-learning platform, which were also recorded. Before the exam, students were identified by their ID cards or student cards, so a large amount of their personal data was documented in the recordings. After the exam was completed, the recordings were made available on the platform. However, not only the examinees had access to the platform, but also a wider group of people, about which the students had not been informed. In addition, anyone with a direct link could access the recordings and thus the data of the examinees. Many students, fearing that the videos would be deleted to cover up the incident, saved the files or took photographs of their computer screens to preserve evidence. Eventually, the university’s chancellor, as the decision-making body, took the position that an incident in which 200 people viewed the IDs of some 100-150 other people could not be considered a personal data breach.

The controller, which was requested by UODO to clarify the situation, did not dispute the data breach. Normally, the virtual room of the platform is only available to the exam group, and only those people have access to the recordings; the violation occurred because one of the employees did not close access to the virtual room after the exam. However, the controller stated that no notification was required, as in its opinion the risk to the rights or freedoms of the data subjects was low. Moreover, after the incident, the system was modified to prevent students from downloading the exam files. The controller also indicated that it had identified the individuals who had downloaded the files and informed them about their criminal liability for disseminating the data.

Despite several letters from UODO, the university still failed to report the data breach and to notify the data subjects. Therefore, administrative proceedings were initiated. UODO found that the controller had failed to comply with its obligations to notify both the supervisory authority and the affected data subjects, and had improperly assessed the risk involved.

When imposing the fine, the President of UODO took into account the duration of the infringement (several months), the intentional action of the controller and its unsatisfactory cooperation with the supervisory authority. The fine serves not only a repressive but also a preventive function, as it shows that the obligations arising in connection with data breaches cannot be ignored, all the more so because an inappropriate approach to the obligations imposed by the GDPR may lead to negative consequences for those affected by the breaches.

Clubhouse Data Protection issues

28. January 2021

Clubhouse is a new social networking app by the US company Alpha Exploration Co., available for iOS devices. Registered users can open rooms for others to talk about various topics; rooms can be public or closed groups. The moderators speak live in the rooms and listeners can join the virtual room. Participants are initially muted and can be unmuted by the moderators to talk; the moderators can also mute participants or exclude them from the respective room. As of now, new users need to be invited by existing users. The app’s popularity started to rise in autumn 2020, when US celebrities started to use it. With its popularity also increasing in the EU, Clubhouse has come under criticism from a data protection perspective.

As mentioned, Clubhouse can only be used upon invitation. To use the option to invite friends, users must share their address book with Clubhouse. In this way, Alpha Exploration can collect personal data of contacts who have not consented to the processing of their data and who do not use the app. Not only Alpha Exploration, but also users may be acting unlawfully when they give the app access to their contacts, as the user may also be responsible for the data processing associated with the sharing of address books. It is therefore not only the responsibility of Alpha Exploration, but also of the user, to ensure that consent has been obtained from the contacts whose personal data is being processed. From a data protection perspective, it is advisable not to grant the Clubhouse app access to this data unless the consent of the respective data subjects has been obtained and, ideally, documented. Currently, this data is transferred to US servers without the consent of the data subjects in the shared address books. Furthermore, it is not apparent in what form and for what purposes the collected contact and account information of third parties is processed in the USA.

Under Clubhouse’s Terms of Service, and in many cases under national law, users are prohibited from recording or otherwise storing conversations without the consent of all parties involved. Nevertheless, the same Terms of Service include the sentence “By using the service, you consent to having your audio temporarily recorded when you speak in a room.” According to Clubhouse’s Privacy Policy, these recordings are used to sanction violations of the Terms of Service, the Community Guidelines and legal regulations; the data is said to be deleted when the room in question is closed without any violations having been reported. Again, consent to data processing should be treated as the general rule, and this consent must be so-called informed consent. Given that the scope and purpose of the storage are not apparent and are vaguely formulated, there are doubts about this. Checking one’s own platform for legal violations is in principle, if not a legal obligation in individual cases, at least a so-called legitimate interest (Art. 6 (1) (f) GDPR) of the platform operator; as long as recordings are limited to this, they are compliant with the GDPR. The platform operator who records the conversations is primarily responsible for this data processing. However, users who use Clubhouse for conversations with third parties may be jointly responsible, even though they do not record anything themselves. This is unlikely to play a major role in the private sphere, but it matters all the more if the app is used in a business context.

It is suspected that Clubhouse creates shadow profiles in its own network: profiles of people who appear in the address books of Clubhouse users but are not themselves registered with Clubhouse. Because frequently saved numbers appear in many address books, Clubhouse even considers numbers like voicemail (“Mobile-Box”) numbers to be well-connected potential users. So far, there is no easy way to object to Clubhouse’s creation of shadow profiles that include a person’s name, number and potential contacts.

Clubhouse’s Terms of Use and Privacy Policy do not mention the GDPR. There is also no address for data protection information requests in the EU, although this is mandatory, as personal data of EU citizens is also processed. In addition, according to Art. 14 GDPR, EU data subjects must be informed about how their data is processed. This information must be provided to data subjects before their personal data is processed, i.e. before a data subject is invited via Clubhouse and their personal data is thereby stored on Alpha Exploration’s servers. This information is not provided. There must also be a simple opt-out option; it is questionable whether one exists. Furthermore, under the GDPR, companies that process data of European citizens must designate responsible persons for this in Europe. So far, it is not apparent that Clubhouse has any such persons in Europe.

The German “Verbraucherzentrale Bundesverband” (“VZBV”), the German federal consumer organisation, has issued a written warning (in German) to Alpha Exploration, complaining that Clubhouse is operated without the required imprint and that the terms of use and privacy policy are only available in English, not in German as required. The warning includes a penalty-backed cease-and-desist declaration relating to Alpha Exploration’s claim to a right of extensive use of the uploaded contact information. Official responses from European data protection authorities regarding Clubhouse are not yet available. The main data protection authority in this case is the Irish Data Protection Commissioner.

So far, it appears that Clubhouse’s data protection is based solely on the Californian CCPA and not on the GDPR. Business use of Clubhouse within the scope of the GDPR should therefore be approached with extreme caution, if at all.

Norwegian DPA intends to fine Grindr

26. January 2021

The Norwegian Data Protection Authority “Datatilsynet” (in the following, “DPA”) recently announced that it intends to impose an administrative fine of € 9.6 million (NOK 100 million) on the online dating provider “Grindr LLC” (in the following, “Grindr”) for violations of the GDPR.

Grindr is a popular and widely used dating app for gay, bi, trans and queer people that uses location-based technology to connect its users. Grindr thus processes, in addition to general personal data, sensitive data such as the sexual orientation of its users. Such data are subject to a high level of protection under the GDPR.

The DPA came to the conclusion that Grindr had transferred personal data of its users to third parties for marketing purposes without a legal basis for doing so. In particular, Grindr had neither informed the data subjects in accordance with the GDPR nor obtained consent from the data subjects concerned. Datatilsynet considers this a serious case, because the users were not able to exercise real and effective control over the sharing of their data.

Datatilsynet has set Grindr a deadline of February 15th, 2021 to submit its comments on the case and will then make its final decision.

CJEU Advocate General’s opinion on GDPR’s One-Stop-Shop mechanism

On January 13, 2021, the Advocate General (“AG”) of the Court of Justice of the European Union (“CJEU”) published his opinion in the case of Facebook Ireland Limited, Facebook Inc. and Facebook Belgium BVBA v the Belgian Data Protection Authority “Gegevensbeschermingsautoriteit” (“Belgian DPA”), addressing the General Data Protection Regulation’s (“GDPR”) One-Stop-Shop mechanism.

In 2015, the Belgian DPA initiated several legal proceedings against Facebook group companies in local courts, alleging that Facebook placed cookies on the devices of Belgian users without their consent and thereby collected data in an excessive manner. Facebook argued that with the GDPR becoming applicable in 2018, the Belgian DPA lost its competence to continue the legal proceedings, as Facebook’s lead supervisory authority under the GDPR is the Irish Data Protection Commission. The Belgian Court of Appeal referred several questions to the CJEU, including whether the GDPR’s One-Stop-Shop regime prevents national DPAs from initiating proceedings in their national courts when they are not the lead DPA.

The AG responded that, in his opinion, the lead DPA has general jurisdiction over cross-border data processing, while a national DPA may only exceptionally bring proceedings before its own national courts. The national DPA’s right is subject to the One-Stop-Shop regime and the cooperation and consistency mechanisms of the GDPR. Thus, while each national DPA has the competence to initiate proceedings against possible infringements affecting its territory, the significant regulatory role of the lead DPA limits this competence with respect to cross-border data processing.

One of the concerns expressed by the Belgian DPA was the risk of insufficient enforcement if only lead DPAs may act against organizations that do not comply with the GDPR. In this regard, the AG emphasizes that Art. 61 GDPR specifically provides appropriate mechanisms to address such concerns: national DPAs can ask the lead DPA for assistance in investigations, and if such assistance is not provided, the national DPA concerned may take action itself.

In certain circumstances, the AG sees the possibility for national DPAs not acting as lead DPA to initiate proceedings before their national courts, namely if

  • the DPA is acting outside of the material scope of the GDPR; e.g., because the processing does not involve personal data;
  • cross-border data processing is carried out by public authorities, in the public interest, or to comply with legal obligations;
  • the controller is not established in the EU;
  • there is an urgent need to act to protect the rights and freedoms of data subjects (Art. 66 GDPR);
  • the lead DPA has decided not to process a case.

With regard to data subjects, the AG notes that they can bring an action against any controller or processor before the courts of their Member State and may file a complaint with their Member State’s DPA, regardless of which Member State’s DPA is the lead DPA.

The AG’s opinion is not legally binding on the CJEU, although the CJEU will take it into account. A final judgment of the CJEU is expected in the coming months. Thereafter, the Belgian Court of Appeal will have to decide its case in accordance with the CJEU’s judgment. The CJEU’s decision will most likely have a lasting impact on the division of roles between lead DPAs and other national DPAs, as well as on the ability of national DPAs to take enforcement into their own hands.

Swedish court confirms Google’s violations of the GDPR

16. December 2020

The Administrative Court of Stockholm announced on November 23rd, 2020 that it had rejected Google LLC’s appeal against the decision of the Swedish Data Protection Authority (Datainspektionen) finding Google in violation of the GDPR. Google, as a search engine operator, had not fulfilled its obligations regarding the right to be forgotten (RTBF). However, the court reduced the fine from a total of SEK 75 million (approx. € 7,344,000) to SEK 52 million (approx. € 5,091,000).

The background to the case was the Swedish DPA’s 2017 audit of Google’s handling of delisting requests, i.e. requests for the removal of certain results from a search engine. The DPA concluded the inspection by ordering Google to delist certain individuals’ names due to inaccurate, irrelevant and superfluous information. In 2018, the DPA initiated a follow-up audit because of indications that Google had not fully complied with the previously issued order. That audit resulted in an administrative fine of SEK 75 million in March 2020.

The DPA drew attention to the fact that the GDPR increases the obligations of data controllers and data processors and strengthens the rights of individuals, which include the right to have search results delisted. However, Google had not fully complied with its obligations, as it had not properly removed two of the search result listings that the DPA had ordered it to delete. In one case, Google interpreted too narrowly which web addresses had to be removed; in the other, Google failed to remove the listing without undue delay.

Moreover, the DPA criticized Google’s procedure for managing delisting requests and found that it undermines data subjects’ rights. Following the removal of a search result listing, Google notifies the website to which the link is directed. The delisting request form, directed at the data subject raising the request, states that information on the removed web addresses can be provided to the webmaster. This information must be seen as misleading, since the data subject is led to understand that their consent to the notification is required in order to process the request. Such a practice might therefore result in individuals refraining from exercising their right to request delisting, in violation of Art. 5 (1) lit. a) GDPR. Furthermore, in the DPA’s opinion, the delisting notifications to webmasters are covered neither by a legal obligation pursuant to Art. 6 (1) lit. c), 17 (2) GDPR nor by legitimate interests pursuant to Art. 6 (1) lit. f) GDPR. Google’s routine of regularly sending information to webmasters also constitutes processing of personal data that is incompatible with the purpose for which the data was originally collected, infringing Art. 5 (1) lit. b), 6 (4) GDPR.

Google appealed the DPA’s decision. However, the Administrative Court of Stockholm reaffirmed the DPA’s position and confirmed Google’s violations of the GDPR.

The court stated that the process for handling delisting requests must make it easy for individuals to exercise their rights; any process that restricts those rights may violate Art. 15 through 22 GDPR. The court also explained why the personal data had been processed beyond its original purpose: since the notifications are only sent after Google has removed a search result, the purpose of the processing has already expired by the time the notification is sent. The notification can therefore not be considered effective in achieving the purpose specified by Google.

Google must now delist the specific search results and cease informing webmasters of delisting requests. Google must also adapt its data subject rights procedure within eight weeks of the court’s judgment gaining legal force.
