Category: Personal Data

Norwegian DPA intends to fine Grindr

26. January 2021

The Norwegian Data Protection Authority “Datatilsynet” (in the following “DPA”) recently announced that it intends to impose an administrative fine of € 9.6 million (NOK 100 million) on the online dating provider “Grindr LLC” (in the following “Grindr”) for violations of the GDPR.

Grindr is a popular and widely used dating app for gay, bi, trans and queer people that uses location-based technology to connect its users. Grindr thus processes not only personal data but also sensitive data, such as the sexual orientation of its users. The latter is subject to a high level of protection under the GDPR.

The DPA came to the conclusion that Grindr transferred personal data of its users to third parties for marketing purposes without a legal basis for doing so. In particular, Grindr neither informed the data subjects in accordance with the GDPR nor obtained consent from the data subjects concerned. Datatilsynet considers this a serious case, because the users were not able to exercise real and effective control over the sharing of their data.

Datatilsynet has set a deadline of February 15th, 2021 for Grindr to submit its comments on the case and will afterwards make its final decision.

WhatsApp’s privacy policy update halted

22. January 2021

Already at the beginning of December 2020, first indications emerged that WhatsApp would change its terms of service and privacy policy. Earlier this year, users received the update notice when launching the app on their device. It stated that the new terms concern additional information on how WhatsApp processes user data and how businesses can use Facebook-hosted services to store and manage their WhatsApp chats. The terms had to be accepted by February 8th, 2021, in order to continue using the chat service. Otherwise, users were advised to delete their account, as it would not be possible to use WhatsApp without accepting the changes. The notice caused all sorts of confusion and criticism, because it mistakenly led many users to believe that the agreement allows WhatsApp to share all collected user data with its parent company Facebook, which has faced repeated privacy controversies in the past.

Users’ fears in this regard are not entirely unfounded. As a matter of fact, outside the EU, WhatsApp user data has already been flowing to Facebook since 2016 – for advertising purposes, among other things. However, different rules apply in the EU and the United Kingdom, where no such data transfer takes place.

The negative coverage and user reactions prompted WhatsApp to hastily clarify that the changes explicitly do not affect EU users. Niamh Sweeney, director of policy at WhatsApp, said via Twitter that it remained the case that WhatsApp did not share European user data with Facebook for the purpose of using this data to improve Facebook’s products or ads.

However, since the topic continues to stir emotions, WhatsApp felt compelled to provide clarification with a tweet and a FAQ. The statements make it clear once again that the changes relate to optional business features and provide further transparency about how the company collects and uses data. The end-to-end encryption, through which chat content is visible only to the participating users, will not be changed. Moreover, the new update does not expand WhatsApp’s ability to share data with Facebook.

Nevertheless, despite all efforts, WhatsApp has not managed to explain the changes in an understandable way and has even had to accept considerable user churn in recent days. Interest in messenger alternatives has increased enormously. Eventually, the public backlash led to an official announcement that the controversial update will be delayed until May 15th, 2021. In light of the misinformation and concern, users are to be given more time to review the policy on their own and understand WhatsApp’s privacy and security principles.

EU-UK Trade Deal in light of Data Protection

4. January 2021

Almost fit to be called a Christmas miracle, the European Union (EU) and the United Kingdom (UK) came to an agreement on December 24th, 2020. The agreement, in full the “EU-UK Trade and Cooperation Agreement”, sets out new rules applying from the date of the UK’s exit from the EU, January 1st, 2021.

President of the European Commission, Ursula von der Leyen, claimed it was a deal worth fighting for, “because we now have a fair and balanced agreement with the UK, which will protect our European interests, ensure fair competition, and provide much needed predictability for our fishing communities. Finally, we can leave Brexit behind us and look to the future. Europe is now moving on.”

In light of data protection, however, the new Trade Deal has not given much certainty about what is to come next.

Both sides are aware that an adequacy decision by the EU Commission is very important with regard to data protection and cross-border data flows. Accordingly, the EU has agreed to allow a period of four months, extendable by a further two months, during which data can be transferred between EU Member States and the UK without additional safeguards. This period was granted to give the Commission enough time to make an adequacy decision. Accordingly, data transfers can continue as before until possibly mid-2021. However, this arrangement is only valid if the UK does not change its data protection laws in the meantime.

With regard to direct marketing, the situation has not changed either: for individuals, active consent must be given unless there was a prior contractual relationship and the advertising relates to products similar to those of the prior contract. Furthermore, the advertising must be clearly recognisable as such, and the possibility of revoking consent must be offered in every advertising email.

However, much else has yet to be clarified. Questions such as the competence of the UK Data Protection Authority, the Information Commissioner’s Office (ICO), as well as the fate of its ongoing investigations, have not yet been answered. As of now, companies whose EU headquarters are located in the UK will have to designate a new Lead Supervisory Authority (Art. 56 GDPR) for their business in the EU.

The coming months will show whether questions of high relevance to businesses’ day-to-day practice can be answered reassuringly.

Swedish court confirms Google’s violations of the GDPR

16. December 2020

The Administrative Court of Stockholm announced on November 23rd, 2020, that it had rejected Google LLC’s appeal against the decision of the Swedish Data Protection Authority (Datainspektionen) determining Google’s violations of the GDPR. Google as a search engine operator had not fulfilled its obligations regarding the right to be forgotten (RTBF). However, the court reduced the fine from a total of SEK 75 million (approx. € 7,344,000) to SEK 52 million (approx. € 5,091,000).

The background to the case was the Swedish DPA’s 2017 audit of Google’s handling of delisting requests, i.e. requests for the removal of certain results from a search engine. The DPA concluded the inspection by ordering Google to delist certain individuals’ names due to inaccuracy, irrelevance and superfluous information. In 2018, the DPA initiated a follow-up audit because of indications that Google had not fully complied with the previously issued order. This resulted in an administrative fine of SEK 75 million in March 2020.

The DPA drew attention to the fact that the GDPR increases the obligations of data controllers and data processors and strengthens the rights of individuals, which include the right to have search results delisted. However, Google did not fully comply with its obligations, as it had not properly removed two of the search result listings that the DPA had ordered it to delete. In one case, Google interpreted too narrowly which web addresses had to be removed; in the other, Google failed to remove the listing without undue delay.

Moreover, the DPA criticized Google’s procedure for managing delisting requests and found it to be undermining data subjects’ rights. Following the removal of a search result listing, Google notifies the website to which the link is directed. The delisting request form, directed at the data subject raising the request, states that information on the removed web addresses can be provided to the webmaster. This information must be seen as misleading, since the data subject is led to understand that its consent to the notification is required in order to process the request. Such a practice might therefore result in individuals refraining from exercising their right to request delisting, which violates Art. 5 (1) lit. a) GDPR. What’s more, in the opinion of the DPA, the delisting notifications to the webmasters are covered neither by a legal obligation according to Art. 6 (1) lit. c), 17 (2) GDPR nor by legitimate interests pursuant to Art. 6 (1) lit. f) GDPR. Also, Google’s routine of regularly sending information to webmasters constitutes processing of personal data that is incompatible with the purpose for which the data was originally collected. This practice infringes Art. 5 (1) lit. b), 6 (4) GDPR.

Google appealed the DPA’s decision. However, the Administrative Court of Stockholm reaffirmed the DPA’s opinion and confirmed Google’s violations of the GDPR.

The court stated that the process for handling delisting requests must make it easier for individuals to exercise their rights. This means that any process restricting individuals’ rights may violate Art. 15 through 22 GDPR. The court also specified why the personal data had been processed beyond its original purpose: since the notifications are only sent after Google has removed a search result, the purpose of the processing has already expired by the time the notification is sent. Thus, the notification cannot be considered effective in achieving the purpose specified by Google.

Google must now delist the specific search results and cease informing webmasters of requests. Google must also adapt its data subject rights procedure within eight weeks after the court’s judgment gains legal force.

16 million Brazilian COVID-19 patients’ personal data exposed online

7. December 2020

In November 2020, personal and sensitive health data of about 16 million Brazilian COVID-19 patients was leaked on the online platform GitHub. The cause was a hospital employee who uploaded a spreadsheet with usernames, passwords, and access keys to sensitive government systems to the online platform. Those affected included the Brazilian President Jair Bolsonaro and his family, as well as seven ministers and 17 provincial governors.

Among the exposed systems were two government databases used to store information on COVID-19 patients. The first, “E-SUS-VE”, was used for recording COVID-19 patients with mild symptoms, while the second, “Sivep-Gripe”, was used to keep track of hospitalized cases across the country.

Both systems contained highly sensitive personal information such as patient names, addresses, telephone numbers and individual taxpayer ID information, but also healthcare records such as medical histories and medication regimes.

The leak was discovered after a GitHub user spotted the spreadsheet containing the password information on the personal GitHub account of an employee of the Albert Einstein Hospital in São Paulo. The user informed the Brazilian newspaper Estadao, which analysed the information shared on the platform before notifying the hospital and the Brazilian health ministry.

The spreadsheet was ultimately removed from GitHub, while government officials changed passwords and revoked access keys to secure their systems after the leak.

However, Estadao reporters confirmed that the leaked data included personal data of Brazilians across all 27 states.

New Zealand’s Privacy Act 2020 comes into force

4. December 2020

New Zealand’s Office of the Privacy Commissioner announced that the Privacy Act 2020 has taken effect. Certain aspects of the Privacy Act came into force on July 1st, 2020, with most operative provisions commencing from December 1st, 2020. The new law affords better privacy protections and imposes greater obligations on organisations and businesses when handling personal information. It also gives the Privacy Commissioner greater powers to ensure that agencies comply with the Privacy Act.

Notably, the updated legislation features new breach reporting obligations, criminal penalties and provisions on international data transfers.

Part 6 of the Privacy Act 2020 covers notifiable privacy breaches and compliance notices and introduces a new mandatory reporting requirement. When an agency becomes aware of a privacy breach that it is reasonable to believe has caused, or is likely to cause, serious harm to an affected individual or individuals (unless a specific limited exception applies), the agency must notify the Privacy Commissioner and the affected individuals as soon as practicable. In addition, the Privacy Commissioner may issue a compliance notice to an agency requiring it to do, or stop doing, something in order to comply with the Privacy Act. For the sake of completeness, it should be mentioned that the Act makes no distinction between a data controller and a data processor; the term “agencies” refers to all data processing bodies.

Furthermore, new criminal offences have been incorporated into Part 9 of the Privacy Act (Section 212). It is now an offence to mislead an agency for the purpose of obtaining access to someone else’s personal information – for example, by impersonating an individual or falsely pretending to be an individual or to be acting under the authority of an individual. The Privacy Act also creates a new offence of destroying any document containing personal information, knowing that a request has been made in respect of that information. The penalty for these offences is a fine of up to $10,000.

Moreover, in accordance with Part 5 of the Privacy Act (Section 92), the Privacy Commissioner may direct an agency to confirm whether it holds any specified personal information about an individual and to provide the individual access to that information in any manner that the Privacy Commissioner considers appropriate.

What’s more, a new Information Privacy Principle (IPP) has been added to Part 3 of the Privacy Act (Section 22), which governs the disclosure of personal information outside New Zealand. Under IPP 12, an agency may disclose personal information to a foreign person or entity only if the receiving agency is subject to privacy laws that, overall, provide comparable safeguards to those in the Privacy Act.

Apart from that, pursuant to Part 1 of the Privacy Act (Section 4), the privacy obligations also apply to overseas agencies within the meaning of Section 9 that are “carrying on business” in New Zealand, even if they do not have a physical presence there. This will affect businesses located offshore.

Privacy Commissioner John Edwards welcomes the Privacy Act, noting that the new law reflects the changes in New Zealand’s wider economy and society as well as a modernised approach to privacy:

The new Act brings with it a wider range of enforcement tools to encourage best practice, which means we are now able to take a different approach to the way we work as a regulator.

Since the Privacy Act 2020 replaces the Privacy Act 1993, which remains relevant to privacy complaints about actions that occurred before December 1st, guidance has been issued on which act applies when. The Office of the Privacy Commissioner has also published a comparison chart to help navigate between the two acts.

Admonition for revealing a list of people quarantined in Poland

27. November 2020

The President of the Personal Data Protection Office in Poland (UODO) imposed an admonition on a waste management company liable for a data breach and ordered it to notify the data subjects concerned. The admonition is based on a violation of the personal data of data subjects under medical quarantine: the company provided the city name, street name, building/flat number and the fact of remaining under quarantine of the affected data subjects to unauthorized recipients. The various recipients were required to verify whether, in a given period, waste was to be collected from the places determined in the above-mentioned list.

The incident occurred back in April 2020. At that time, a list of data subjects was made public, containing information on who had been quarantined by administrative decision of the District Sanitary-Epidemiological Station (PPIS) in Gniezno, as well as information on data subjects quarantined in connection with crossing the country border and on data subjects in home isolation due to a confirmed SARS-CoV-2 infection. After becoming aware of the disclosure, the Director of PPIS notified the relevant authorities – the District Prosecutor’s Office and the President of UODO – about the incident.

PPIS informed them that its explanatory activities had shown that PPIS was not the source of the disclosure. The data had been provided to the District Police Headquarters, the Head of the Polish Post Office, Social Welfare Centres and the Headquarters of the State Fire Service. Given that the data had been processed by these various parties, it was necessary to establish with which of them the breach may have occurred.

UODO took steps to clarify the situation. In the course of the proceedings, it requested information from a waste management company that was one of the recipients of the personal data. The company, acting as the data controller, had to explain whether, when establishing the procedures related to the processing of personal data, it had carried out an assessment of the impact of the envisaged processing operations on the protection of personal data according to Art. 35 GDPR. Such an assessment consists of an analysis of the distribution method, in electronic and paper form, with regard to the risks of a loss of confidentiality. Furthermore, the data controller had to inform UODO about the result of this analysis.

The data controller stated that it had conducted an analysis considering circumstances related to non-compliance with the procedures in force by data processors and circumstances related to theft or removal of data. Moreover, the data controller took the view that the list received from the District Police Headquarters only included administrative (police) addresses and did not contain names, surnames or other data allowing the identification of a natural person; thus, the GDPR would not apply, because the data had to be regarded as anonymized. However, the list also revealed that the residents of those buildings/flats had been placed in quarantine, which made it possible to identify them. It turned out that the confidentiality of the processed data had been violated in the course of the performance of the employee duties of the data processor, who had left the printed list on a desk without proper supervision. During this time, another employee had recorded the list in the form of a photo and shared it with another person.

Following a review of the entirety of the material collected in this case, UODO considered that the information regarding the city name, street name, building/flat number and the placing of a data subject in medical quarantine constitutes personal data within the meaning of Art. 4 (1) GDPR, while the latter constitutes a special category of personal data concerning health according to Art. 9 (1) GDPR. Based on the above, it is possible to identify the data subjects, and the data controller is therefore bound by the obligations arising from the GDPR.

In UODO’s opinion, the protective measures indicated in the risk analysis are general formulations that do not refer to specific activities undertaken by authorized employees. The measures are insufficient and inadequate to the risks of processing special categories of data. In addition, the data controller should have considered factors such as the recklessness and carelessness of employees and a lack of due diligence.

According to Art. 33 (1) GDPR, the data controller shall, without undue delay and, where feasible, not later than 72 hours after having become aware of the data breach, notify it to the competent supervisory authority. Moreover, in a situation of high risk to the rights and freedoms of the data subjects resulting from the data breach (which undoubtedly arose from the disclosure), the data controller is obliged to inform the data subjects without undue delay in accordance with Art. 34 (1) GDPR. Despite this, the company reported the infringement neither to the President of UODO nor to the data subjects concerned.

China issued new Draft for Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime that appears to draw inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection regime will consist of China’s Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation addresses issues raised by new technologies and applications in around 70 articles. The fines proposed in the draft for non-compliance are substantial and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, the obligations of data processors, the authority in charge of personal information, and legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalents under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China, and to report relevant information about their domestic organization or representative to Chinese regulators.

Compared to the GDPR, the draft PIPL extends the term “sensitive data” to also include nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may cause damage to personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information, which shall be possible if the transfer is certified by recognized institutions, or if the data processor executes a cross-border transfer agreement with the recipient located outside of China to ensure that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator, or the volume of data it processes exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.

It should further be kept in mind that the draft PIPL expands the range of penalties beyond those provided in the Cybersecurity Law, which will put much greater liability pressure on controllers operating in China.

The period for open comments on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet clear when the draft legislation will come into full effect.

California Voters approve new Privacy Legislation CPRA

20. November 2020

On November 3rd, 2020, Californian citizens were able to vote on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot (we reported). As polls leading up to the vote had already suggested, California voters approved the new privacy legislation, also known as “Prop 24”, with 56.2% voting Yes and 43.8% voting No. Most provisions of the CPRA will enter into force on January 1st, 2021, and will become applicable to businesses on January 1st, 2023. By and large, it will only apply to information collected from January 1st, 2022, onwards.

The CPRA will complement and expand privacy rights of California citizens considerably. Among others, the amendments will include:

  • Broadening the term “sale” of personal information to “sale or share” of private information,
  • Adding new requirements to qualify as a “service provider” and defining the term “contractor” anew,
  • Defining the term “consent”,
  • Introducing the category of “Sensitive Information”, including a consumer’s Right to limit the use of “Sensitive Information”,
  • Introducing the concept of “Profiling” and granting consumers the Right to Opt-out of the use of the personal information for Automated Decision-Making,
  • Granting consumers the Right to correct inaccurate information,
  • Granting consumers the Right to Data Portability, and
  • Establishing the California Privacy Protection Agency (CalPPA) with a broad scope of responsibilities and enforcement powers.

Ensuring compliance with the CPRA will require proper preparation. Affected businesses will have to review existing processes or implement new processes in order to guarantee the newly added consumer rights, meet the contractual requirements with service providers/contractors, and show compliance with the new legislation as a whole.

In an interview after the passage of the CPRA, Alastair Mactaggart, the initiator of the CCPA and the CPRA, commented that

Privacy legislation is here to stay.

He hopes that California Privacy legislation will be a model for other states or even the U.S. Congress to follow, in order to offer consumers in other parts of the country the same Privacy rights as there are in California now.

Canadian Government proposes new federal privacy law

18. November 2020

On November 17th, Navdeep Bains, the Canadian Minister of Innovation, Science and Economic Development, introduced Bill C-11, which is intended to modernize and reshape the Canadian privacy framework and align it with EU and U.S. legislation. Its short title is the Digital Charter Implementation Act, 2020 (DCIA). A fact sheet accompanying the DCIA states:

“… If passed, the DCIA would significantly increase protections to Canadians’ personal information by giving Canadians more control and greater transparency when companies handle their personal information. The DCIA would also provide significant new consequences for non-compliance with the law, including steep fines for violations. …”

Part one of the DCIA is the Consumer Privacy Protection Act (CPPA), which is intended to establish a new privacy law for the Canadian private sector. New consent rules are to be adopted, data portability is introduced as a requirement, and data subjects’ access to their personal data is enhanced, as is their right to have personal data erased. Data subjects further have the right to ask businesses to explain how a prediction, recommendation or decision made by an automated decision-making system was reached. Furthermore, they have the right to know how their personal data is being used, as well as the right to review and challenge the amount of personal data collected by a company or government. On demand, a privacy management program must be provided to the Canadian Office of the Privacy Commissioner (OPC). For non-compliance, companies face possible fines of up to 5% of global revenue or C$25 million, whichever is higher. According to Bains, these are the highest fines among all G7 nations. Businesses can ask the OPC to approve their codes of practice and certification systems and, in socially beneficial cases, disclose de-identified data to public entities.

Bill C-11 further contains the “Personal Information and Privacy Protection Tribunal Act”, which is supposed to make the enforcement of privacy rights faster and more efficient. For that purpose, more resources are committed to the OPC. The OPC can now issue orders, which have the same effect as Federal Court orders, and may force companies to comply or order them to stop collecting and using personal data. The newly formed Data Protection Tribunal can impose penalties and hear appeals of orders issued by the OPC.

Lastly, a private right of action is also included in the bill. It allows individuals to sue companies within two years after the Commissioner issues a finding of a privacy violation that is upheld by the Tribunal.
