Happy New Year 2021!

1. January 2021

Dear readers,

We, the team of privacy-ticker.com, wish you a happy new year.

As in the extraordinary and challenging year of 2020, we are delighted to keep you updated in the new year on judicial and supervisory decisions on international data protection law, as well as other news in the areas of data protection and data security.

We look forward to 2021 with excitement and hope and wish you only the best for the new year. Stay safe and healthy!

Your team of privacy-ticker.com

Category: General

European Commission proposes draft “Digital Services Act” and “Digital Markets Act”

21. December 2020

On December 15th, the European Commission published drafts of the “Digital Services Act” (“DSA”) and the “Digital Markets Act” (“DMA”), which are intended to restrict large online platforms and stimulate competition.

The DSA is intended to rework the 20-year-old e-Commerce Directive and introduce a paradigm shift in accountability. Under the DSA, platforms would have to prove that they acted in a timely manner in removing or blocking access to illegal content, or that they had no actual knowledge of such content. Violators would face fines of up to 6% of annual revenue. Authorities could order providers to take action against specific illegal content, after which they must provide immediate feedback on what action was taken and when. Providing false, incomplete or misleading information as part of the reporting requirement, or failing to submit to an on-site inspection, could result in fines of up to 1% of annual revenue. The scope of illegal content is to include, for example, criminal hate comments, discriminatory content, depictions of child sexual abuse, non-consensual sharing of private images, unauthorized use of copyrighted works, and terrorist content. Hosting providers would be required to establish efficient notice-and-action mechanisms that allow individuals to report posts they deem illegal and have action taken against them. Platforms would not only be required to remove illegal content, but also to explain to users why the content was blocked and to give them the opportunity to complain.

Any advertising on ad-supported platforms would be required to be clearly identifiable as advertising and clearly state who sponsored it. Exceptions are to apply to smaller journalistic portals and bloggers, while even stricter rules would apply to large platforms. For example, platforms with more than 45 million active users in the EU could be forced to grant comprehensive access to stored data, provided that trade secrets are not affected, and to set up archives that make it possible to identify disinformation and illegal advertising.

Social network operators would have to conduct annual risk assessments and review how they deal with systemic threats, such as the spread of illegal content. They would also be required to provide clear, easy-to-understand and detailed reports at least once a year on the content moderation they have carried out during that period.

Newly appointed “Digital Services Coordinators” in each EU Member State are supposed to enforce the regulation, for example by ordering platforms to provide researchers with key data so that they can investigate the platforms’ relevant activities, while a new European committee is to ensure that the DSA is applied uniformly across the EU.

The DMA includes a list of competition requirements for large platforms, so-called “gatekeepers”, that have a monopoly-like status. The regulations aim to strengthen smaller competitors and prevent the large gatekeepers from using their dominance to impose practices perceived as unfair. Gatekeepers would neither be allowed to exclusively pre-install their own applications, nor to force other operating system developers or hardware manufacturers to pre-install programs exclusively from the gatekeeper’s company. In addition, preventing users from uninstalling included applications would be prohibited, as would other common measures of self-preferencing. For example, gatekeepers would no longer be allowed to use data generated by their services for their own commercial activities without also making the information available to other commercial users. If a provider wanted to merge data generated by different portals, it would have to obtain explicit consent from users to do so.

The publication of the DSA and the DMA is the next step in the European Commission’s 2020 European strategy for data, following the proposal of the Data Governance Act in November. Like the Data Governance Act, the DSA and DMA aim to push back the dominance of tech giants, particularly those from the U.S. and China, while promoting competition.

EDPS considers Privacy Shield replacement unlikely for a while

18. December 2020

The data transfer agreements between the EU and the USA, namely Safe Harbor and its successor Privacy Shield, have suffered a hard fate over the years. Both were declared invalid by the European Court of Justice (CJEU) in proceedings initiated by the Austrian lawyer and privacy activist Max Schrems against Facebook. In both cases, the court concluded that the agreements did not guarantee data protection standards equivalent to those in the EU and thus violated Europeans’ fundamental rights, due to the transfer of data to US law enforcement agencies enabled by US surveillance laws.

The judgement marking the end of the EU-US Privacy Shield (“Schrems II”) has a huge impact on EU companies doing business with the USA, which are now expected to rely on Standard Contractual Clauses (SCCs). However, the CJEU also tightened the requirements for SCCs: when using them, companies must determine whether there is an adequate level of data protection in the third country. In particular cases, additional measures may therefore need to be taken to ensure a level of protection that is essentially equivalent to that in the EU.

Despite this, companies were hoping for a new transatlantic data transfer pact. However, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski expressed doubts about an agreement in the near future:

“I don’t expect a new solution instead of Privacy Shield in the space of weeks, and probably not even months, and so we have to be ready that the system without a Privacy Shield like solution will last for a while.”

He justified his skepticism with the incoming Biden administration, which may have other priorities than possible changes to American national security laws. An agreement on a new data transfer mechanism would, however, depend on aligning US national security laws with EU fundamental rights.

The EU, meanwhile, does not remain inactive and is trying to devise other ways to maintain its data transfers with the rest of the world. In this regard, the EDPS welcomed the European Commission’s proposed revisions to the SCCs, which take into consideration the provisions laid down in the CJEU’s “Schrems II” judgement:

“The proposed Standard Contractual Clauses look very promising and they are already introducing many thoughts given by the data protection authorities.”

Swedish court confirms Google’s violations of the GDPR

16. December 2020

The Administrative Court of Stockholm announced on November 23rd, 2020, that it had rejected Google LLC’s appeal against the decision of the Swedish Data Protection Authority (Datainspektionen), which had found Google in violation of the GDPR. Google, as a search engine operator, had not fulfilled its obligations regarding the right to be forgotten (RTBF). However, the court reduced the fine from a total of SEK 75 million (approx. € 7,344,000) to SEK 52 million (approx. € 5,091,000).

The background to the case was the Swedish DPA’s 2017 audit of Google’s handling of delisting requests, i.e. requests for the removal of certain results from a search engine. The DPA concluded the inspection by ordering Google to delist certain individuals’ names due to inaccurate, irrelevant and superfluous information. In 2018, the DPA initiated a follow-up audit because of indications that Google had not fully complied with the previously issued order. This resulted in an administrative fine of SEK 75 million in March 2020.

The DPA drew attention to the fact that the GDPR increases the obligations of data controllers and data processors and strengthens the rights of individuals, which include the right to have search results delisted. However, Google had not fully complied with its obligations, as it had not properly removed two of the search result listings that the DPA had ordered it to delete. In one case, Google interpreted too narrowly which web addresses were to be removed; in the other, it failed to remove the listing without undue delay.

Moreover, the DPA criticized Google’s procedure for managing delisting requests and found it to undermine data subjects’ rights. Following the removal of a search result listing, Google notifies the website to which the link is directed. The delisting request form, directed at the data subject raising the request, states that information on the removed web addresses can be provided to the webmaster. This information must be seen as misleading, since the data subject is led to believe that its consent to the notification is required in order to process the request. Such a practice might therefore result in individuals refraining from exercising their right to request delisting, which violates Art. 5 (1) lit. a) GDPR. What’s more, in the opinion of the DPA, the delisting notifications to the webmasters are covered neither by a legal obligation pursuant to Art. 6 (1) lit. c), 17 (2) GDPR, nor by legitimate interests pursuant to Art. 6 (1) lit. f) GDPR. In addition, Google’s routine of regularly sending information to webmasters constitutes processing of personal data that is incompatible with the purpose for which the data was originally collected. This practice infringes Art. 5 (1) lit. b), 6 (4) GDPR.

Google appealed the DPA’s decision. However, the Administrative Court of Stockholm reaffirmed the DPA’s opinion and confirmed Google’s violations of the GDPR.

The court stated that the process for handling delisting requests must make it easy for individuals to exercise their rights; any process that restricts those rights may violate Art. 15 through 22 GDPR. The court also explained why the personal data had been processed beyond its original purpose: since the notifications are only sent after Google has removed a search result, the purpose of the processing has already expired by the time the notification is sent. The notification therefore cannot be considered effective in achieving the purpose specified by Google.

Google must now delist the specific search results and cease informing webmasters of delisting requests. It must also adapt its data subject rights procedure within eight weeks after the court’s judgment has gained legal force.

CNIL fines Google and Amazon

10. December 2020

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), announced that it has fined the big tech companies Google and Amazon for violations of the GDPR and the French Data Protection Act.

Regarding Google, CNIL announced financial penalties in a combined record-breaking amount of € 100 million: € 60 million against Google LLC, the US-based parent company, and € 40 million against Google Ireland Limited, the Irish subsidiary. According to CNIL’s statement, the fines are based on violations of the cookie requirements on the website google.fr. Based on an online investigation conducted on March 16th, 2020, CNIL considers it proven that Google “placed advertising cookies on the computers of users of the search engine google.fr, without obtaining prior consent and without providing adequate information”.

Besides the findings on cookies, CNIL also criticized a lack of information on the personal data processed and a partial failure of the objection mechanism.

The high financial penalties are justified by the seriousness of the violations, the large number of data subjects concerned, and the significant profits the companies derive from the advertisements.

CNIL also took into account the fact that this procedure has no longer been in place since an update in September 2020. However, the newly implemented banner still does not allow users to understand the purposes for which the cookies are used, nor does it inform them that they can refuse the cookies.

This is already the second financial penalty CNIL has imposed on Google.

CNIL also fined Amazon Europe Core € 35 million, likewise for violations in connection with cookies. The accusation is the same as in Google’s case and is based on several investigations conducted between December 12th, 2019 and May 19th, 2020. CNIL found that when a user visited the website, cookies were automatically placed on his or her computer without any action required on the user’s part, and that several of these cookies were used for advertising purposes. A lack of information provided to users was also found.


Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published “Recommendations for a way forward on the topic of encryption” on December 1st, in which it stresses the importance of carefully balancing the protection of fundamental rights with ensuring the investigative powers of law enforcement.

The EU Council sees a dilemma between the need for strong encryption to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime on the other. It further notes:

“We acknowledge this dilemma and are determined to find ways that will not compromise either one, upholding the principle of security through encryption and security despite encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is to be the joint implementation of standardized, high-quality training programs for law enforcement officers, tailored to an increasingly skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety“, is proclaimed as a further intention.

Next, the technology industry, civil society and academia are acknowledged as important partners with whom EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named as key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and exactly what measures the EU Council intends to implement. The list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal; instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more a statement of intent than implementable recommendations on encryption.

Belgian DPA planning to suspend websites that infringe GDPR

The Belgian Data Protection Authority (DPA) signed a Cooperation Agreement on November 26, 2020, with DNS Belgium, the organization managing the “.be” country-code domain name. The aim is to allow DNS Belgium to suspend “.be” websites that are infringing the GDPR. The Agreement establishes a two-tier cooperation system, which aims at identifying infringements and suspending the websites if no action is taken.

The first step is a cooperative investigation, for which DNS Belgium has to support the Belgian DPA by providing all information necessary for the investigation.

The second step is the “Notice and Action” procedure. If the Belgian DPA’s Investigation Service considers that a data processing activity conducted via a website with a “.be” domain name infringes one of the data protection principles under the GDPR, and the responsible data controller or data processor does not comply with the DPA’s order to suspend, limit, freeze or end the data processing activity, the Investigation Service is authorized to send a “Notice and Action” notification to DNS Belgium. Once DNS Belgium receives the notification, it will inform the website owner about the infringement and redirect the relevant domain name to a warning page of the Belgian DPA.

The website owner then has 14 days to take remedial measures and can notify the Belgian DPA accordingly. If the Belgian DPA does not contest the measures taken, the relevant domain name will be restored. However, if the infringement is not remedied within the 14-day period, the website will continue to be redirected to the Belgian DPA’s warning page for a period of six months. After this time, the website will be cancelled and placed in quarantine for 40 days before becoming available for registration once again.

Due to the severity of this penalty when a controller takes no action to remedy the infringement, the Belgian DPA will take this action only in cases of infringements that cause very serious harm and are committed by natural or legal persons who deliberately infringe the law, or who continue a data processing activity despite a prior order by the Investigation Service of the Belgian DPA to suspend, limit, freeze or end the processing activity.

It should be noted that the Inspector General of the Belgian DPA can, at his or her discretion, grant a website owner extra time to comply with the relevant data protection requirements. However, this will be decided on a case-by-case basis and will depend on the cooperation of the website owner.

16 million Brazilian COVID-19 patients’ personal data exposed online

7. December 2020

In November 2020, personal and sensitive health data of about 16 million Brazilian COVID-19 patients was leaked on the online platform GitHub. The cause was a hospital employee who uploaded a spreadsheet with usernames, passwords, and access keys to sensitive government systems to the platform. Among those affected were the Brazilian President Jair Bolsonaro and his family, as well as seven ministers and 17 provincial governors.

Among the exposed systems were two government databases used to store information on COVID-19 patients. The first, “E-SUS-VE”, was used for recording COVID-19 patients with mild symptoms, while the second, “Sivep-Gripe”, was used to keep track of hospitalized cases across the country.

Both systems contained highly sensitive personal information such as patient names, addresses, telephone numbers and individual taxpayer ID information, but also healthcare records such as medical history and medication regimes.

The leak was discovered after a GitHub user spotted the spreadsheet containing the password information on the personal GitHub account of an employee of the Albert Einstein Hospital in São Paulo. The user informed the Brazilian newspaper Estadão, which analysed the information shared on the platform before notifying the hospital and the health ministry of Brazil.

The spreadsheet was ultimately removed from GitHub, while government officials changed passwords and revoked access keys to secure their systems after the leak.

However, Estadão reporters confirmed that the leaked data included personal data of Brazilians across all 27 states.

New Zealand’s Privacy Act 2020 comes into force

4. December 2020

New Zealand’s Office of the Privacy Commissioner announced that the Privacy Act 2020 has taken effect. Certain aspects of the Privacy Act came into force on July 1st, 2020, with most operative provisions commencing on December 1st, 2020. The new law affords better privacy protections and greater obligations for organisations and businesses when handling personal information. It also gives the Privacy Commissioner greater powers to ensure that agencies comply with the Privacy Act.

Notably, the updated legislation features new breach reporting obligations, criminal penalties and provisions on international data transfers.

Part 6 of the Privacy Act 2020 covers notifiable privacy breaches and compliance notices and introduces a new mandatory reporting requirement. When an agency becomes aware of a privacy breach that it is reasonable to believe has caused serious harm to an affected individual or individuals, or is likely to do so, the agency must notify the Privacy Commissioner and the affected individuals as soon as practicable (unless a specific limited exception applies). In addition, the Privacy Commissioner may issue a compliance notice requiring an agency to do something, or to stop doing something, in order to comply with the Privacy Act. For the sake of completeness, it should be mentioned that the Act does not distinguish between a data controller and a data processor; the term “agency” refers to all data processing bodies.

Furthermore, new criminal offences have been incorporated into Part 9 of the Privacy Act (Section 212). It is now an offence to mislead an agency for the purpose of obtaining access to someone else’s personal information – for example, by impersonating an individual or falsely pretending to be an individual or to be acting under the authority of an individual. The Privacy Act also creates a new offence of destroying any document containing personal information, knowing that a request has been made in respect of that information. The penalty for these offences is a fine of up to $10,000.

Moreover, in accordance with Part 5 of the Privacy Act (Section 92), the Privacy Commissioner may direct an agency to confirm whether it holds any specified personal information about an individual and to provide the individual access to that information in any manner that the Privacy Commissioner considers appropriate.

What’s more, a new Information Privacy Principle (IPP) has been added to Part 3 of the Privacy Act (Section 22), which governs the disclosure of personal information outside New Zealand. Under IPP 12, an agency may disclose personal information to a foreign person or entity only if the receiving agency is subject to privacy laws that, overall, provide comparable safeguards to those in the Privacy Act.

Apart from that, pursuant to Part 1 of the Privacy Act (Section 4), the privacy obligations also apply to overseas agencies within the meaning of Section 9 that are “carrying on business” in New Zealand, even if they have no physical presence there. This will also affect businesses located offshore.

Privacy Commissioner John Edwards welcomes the Privacy Act, noting that the new law reflects the changes in New Zealand’s wider economy and society as well as a modernised approach to privacy:

The new Act brings with it a wider range of enforcement tools to encourage best practice, which means we are now able to take a different approach to the way we work as a regulator.

Since the Privacy Act 2020 replaces the Privacy Act 1993, which will still be relevant to privacy complaints about actions that occurred before December 1st, guidance has been issued on which act applies and when. The Office of the Privacy Commissioner has also published a comparison chart to help navigate between the two acts.

EU offers new alliance with the USA on data protection

The European Commission and the High Representative of the Union for Foreign Affairs and Security Policy outlined a new EU-US agenda for global change, which was published on December 2nd, 2020. It constitutes a proposal for a new, forward-looking transatlantic cooperation covering a variety of matters, including data protection.

The draft plan states the following guiding principles:

  • Advancing global common goods, providing a solid base for stronger multilateral action and institutions that all like-minded partners are invited to join.
  • Pursuing common interests and leveraging collective strength to deliver results on strategic priorities.
  • Looking for solutions that respect common values of fairness, openness and competition – including where there are bilateral differences.

As stated in the draft plan, it is a “once-in-a-generation” opportunity to forge a new global alliance. It includes an appeal for the EU and US to bury the hatchet on persistent sources of transatlantic tension and join forces to shape the digital regulatory environment. The proposal aims to create a shared approach to enforcing data protection law and combatting cybersecurity threats, which could also include possible restrictive measures against attributed attackers from third countries. Moreover, a transatlantic agreement concerning Artificial Intelligence forms part of the recommendation, the purpose being to set a blueprint for regional and global standards. The EU also wants to openly discuss diverging views on data governance and facilitate free data flow with trust on the basis of high safeguards. Furthermore, the creation of a specific dialogue with the US on the responsibility of online platforms and Big Tech is included in the proposal, as is the development of a common approach to protecting critical technologies.

The draft plan is expected to be submitted for endorsement by the European Council at a meeting on December 10-11th, 2020. It suggests an EU-US Summit in the first half of 2021 as the moment to launch the new transatlantic agenda.
