Category: EU Commission

EDPB and EDPS criticise the Commission’s proposal to combat child sexual abuse online

15. August 2022

In May 2022, the European Commission published its proposal on combating child sexual abuse material. The Commission justified the need for this proposal with the alleged insufficiency of voluntary detection carried out by companies. Recently, the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) have issued a joint statement criticizing the proposal on privacy grounds.

According to the proposal, hosting services and communication services would be obliged to identify, remove and report online child pornography. This, in turn, requires that encrypted messages can be screened. In other words, the actual text messages are to be read in order to detect grooming.

In their joint criticism, the EDPB and EDPS highlight that such an AI-based system will most likely result in errors and false positives.

The European Data Protection Supervisor, Wojciech Wiewiórowski, said: “Measures allowing public authorities to have access to the content of communications, on a generalised basis, affect the essence of the right to private life. Even if the technology used is limited to the use of indicators, the negative impact of monitoring the text and audio communications of individuals on a generalised basis is so severe that it cannot be justified under the EU Charter of Fundamental Rights. The proposed measures related to the detection of solicitation of children in interpersonal communication services are extremely concerning.”

Individual brought an action against the European Commission before the Court of Justice of the European Union

27. July 2022

A German citizen brought an action against the European Commission (the Commission) before the Court of Justice of the European Union claiming that the Commission is involved in illegal international data transfers to the US.

The subject-matter of the action, which was recently admitted by the Court, relates to data processing carried out in the context of the website “future.europa.eu”, a platform intended to increase citizens’ engagement with the EU.

In his complaint, which was drafted by EuGD, a German data protection organization, he alleges, amongst other things, that upon accessing said website and enabling a Facebook login, personal data such as users’ IP addresses are transferred to US cloud and web hosting providers. According to the organization’s press release, the action’s allegations of illegal transfers are also grounded in the Schrems II judgment.

It should be noted that the processing of personal data by EU institutions and bodies does not fall under the scope of the GDPR; it is instead governed by Regulation (EU) 2018/1725 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data.

Even though the GDPR does not apply to the Commission, Regulation 2018/1725 does refer to the GDPR in the context of international data transfers to third countries (e.g. recital 65), and it is not far-fetched to hold the view that the ruling in Schrems II will indeed extend to this regulation.

One should also remember Recital 5 of Regulation 2018/1725 that reads the following:

Whenever the provisions of this Regulation follow the same principles as the provisions of Regulation (EU) 2016/679, those two sets of provisions should, under the case law of the Court of Justice of the European Union (the ‘Court of Justice’), be interpreted homogeneously, in particular because the scheme of this Regulation should be understood as equivalent to the scheme of Regulation (EU) 2016/679.

The claimant also alleges that the Commission did not duly respond to his access request in which he requested information on the data processed and about the safeguards in place. He specifically alleges that one request was not answered properly and that the other one was left unanswered at first.

The action, which questions the legality of European web pages that use US web hosts and enable Facebook log-ins, comes at an interesting moment in time. Not long ago, the compatibility of Facebook/Meta’s data transfers with the GDPR was challenged by the Irish Data Protection Commission (DPC), which moved to halt EU-US transfers for Meta’s products for failing to comply with the GDPR.

The founder of the organization that is assisting the legal action told EURACTIV “that if a restaurant or a bakery has to figure out a way to comply with the ban on data transfers to the United States, so does the European Commission, as there cannot be double standards.”

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to address privacy within the antitrust and competition law framework. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data that had been acquired through third-party sources (above all, cookies). The proceedings on the matter are still ongoing. More recently, the Irish Data Protection Authority also took a position against Facebook, moving to prevent the American tech giant from transferring user data to the United States over data protection concerns. That matter, too, is still in dispute.

In 2014, Facebook famously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe did not give much thought to the potential consequences of this merger. The operation was nevertheless the subject of an opinion of the European Commission: in the Commission’s view, the two companies’ privacy policies differed considerably, and the fact that Facebook now had control over all of the data collected by WhatsApp did not sit well with the European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the European Commission’s mistake in not considering the enormous significance personal data hold for these tech companies: because personal data are a form of so-called “non-price competition”, they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data it collects. This argument was also taken up by the UK’s competition authority, which stated that, given the enormous importance personal data have gained in the digital market, companies such as Facebook do not face strong competition in their business.

These arguments, together with growing unrest among DPAs around the globe, led in 2020 to the widely publicised investigation of Facebook by the Federal Trade Commission (FTC) of the United States. The FTC accused Meta Inc. (in particular Facebook) of stifling its competition in order to retain its monopoly of the digital market. An American court dismissed the claims, but at the same time the high risks connected with such an enormous collection of data were highlighted. Under Section 2 of the Sherman Act, the State has:

  • to prove that a company is in fact a monopoly, and
  • to prove that its conduct harms consumers.

This does not apply directly to the case at hand, but the FTC argued that the harm to consumers is to be seen in Meta Inc.’s lowering of privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the following months, policymakers and enforcers in both the United States and the European Union have struggled to establish new sets of rules to better regulate mergers between companies whose business models rely on the collection of personal data, and above all have called for more cooperation between privacy and antitrust agencies.

EU: Commission publishes Q&A on SCCs

30. May 2022

On 25 May 2022, the European Commission published guidance outlining questions and answers (‘Q&A’) on the two sets of Standard Contractual Clauses (‘SCCs’) adopted by the European Commission on 4 June 2021: one for the relationship between controllers and processors (‘the Controller-Processor SCCs’) and one for third-country data transfers (‘the Data Transfer SCCs’). The Q&A are intended to provide practical guidance on the use of the SCCs. They are based on feedback from various stakeholders on their experiences using the new SCCs in the months following their adoption.

Specifically, 44 questions are addressed, including those related to contracting, amendments, the relationship to other contract clauses, and the operation of the so-called docking clause.  In addition, the Q&A contains a specific section dedicated to each set of SCCs. Notably, in the section on the Data Transfer SCCs, the Commission addresses the scope of data transfers for which the Data Transfer SCCs may be used, highlighting that they may not be used for data transfers to controllers or processors whose processing operations are directly subject to the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) by virtue of Article 3 of the GDPR. Further to this point, the Q&A highlights that the Commission is in the process of developing an additional set of SCCs for this scenario, which will consider the requirements that already apply directly to those controllers and processors under the GDPR. 

In addition, the Q&A includes a section with questions on the obligations of data importers and exporters, specifically addressing the SCC liability scheme. In particular, the Q&A states that other provisions in the broader (commercial) contract (e.g., specific rules for allocation of liability, caps on liability between the parties) may not contradict or undermine the liability schemes of the SCCs.

Additionally, with respect to the Court of Justice of the European Union’s judgment in Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) (‘the Schrems II Case’), the Q&A includes a set of questions on local laws and government access aimed at clarifying contracting parties’ obligations under Clause 14 of the Data Transfer SCCs. 

In this regard, the Q&A highlights that Clause 14 of the Data Transfer SCCs should not be read in isolation but used together with the European Data Protection Board’s Recommendations 01/2020 on measures that supplement transfer tools. 

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

29. March 2022

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

The new Trans-Atlantic Data Privacy Framework has been long in the making and reflects more than a year of detailed negotiations between the US and the EU, led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and will underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will form part of the new two-tier redress system that will allow EU citizens to raise complaints about access to their data by US intelligence authorities, with the aim of investigating and resolving those complaints.

The US’ commitments will be implemented by an Executive Order, which will form the basis of the adequacy decision by the European Commission to put the new framework in place. While this represents a quicker route to the goal, it also means that the framework could easily be repealed by a future US administration, since Executive Orders can be revoked. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

European Commission adopts South Korea Adequacy Decision

30. December 2021

On December 17th, 2021, the European Commission (Commission) announced in a statement it had adopted an adequacy decision for the transfer of personal data from the European Union (EU) to the Republic of Korea (South Korea) under the General Data Protection Regulation (GDPR).

An adequacy decision is one of the instruments available under the GDPR to transfer personal data from the EU to third countries that ensure a level of protection for personal data comparable to that of the EU. It is a Commission decision under which personal data can flow freely and securely from the EU to the third country in question without any further conditions or authorizations being required. In other words, transfers of data to the third country in question can be handled in the same way as transfers of data within the EU.

This adequacy decision allows for the free flow of personal data between the EU and South Korea without the need for any further authorization or transfer instrument, and it also applies to the transfer of personal data between public sector bodies. It complements the Free Trade Agreement (FTA) between the EU and South Korea, which entered into force in July 2011. The trade agreement has led to a significant increase in bilateral trade in goods and services and, inevitably, in the exchange of personal data.

Unlike the adequacy decision regarding the United Kingdom, this adequacy decision is not time-limited.

The Commission’s statement reads:

The adequacy decision will complement the EU – Republic of Korea Free Trade Agreement with respect to personal data flows. As such, it shows that, in the digital era, promoting high privacy and personal data protection standards and facilitating international trade can go hand in hand.

In South Korea, the processing of personal data is governed by the Personal Information Protection Act (PIPA), which provides similar principles, safeguards, individual rights and obligations as the ones under EU law.

An important step in the adequacy talks was the reform of PIPA, which took effect in August 2020 and strengthened the investigative and enforcement powers of the Personal Information Protection Commission (PIPC), the independent data protection authority of South Korea. As part of the adequacy talks, both sides also agreed on several additional safeguards that will improve the protection of personal data processed in South Korea, for example regarding transparency and onward transfers.

These safeguards provide stronger protections: for example, South Korean data importers will be required to inform Europeans about the processing of their data, and onward transfers to third countries must ensure that the data continue to enjoy the same level of protection. These rules are binding and can be enforced by the PIPC and South Korean courts.

The Commission has also published a Q&A on the adequacy decision.

EU Commission working on allowing automated searches of the content of private and encrypted communications

25. November 2021

The EU Commission is working on a legislative package to combat child abuse, which will also regulate the exchange of child pornography on the internet. The scope of these rules is expected to include automated searches of private, encrypted communications sent via messaging apps.

When questioned, Olivier Onidi, Deputy Director General of the Directorate-General Migration and Home Affairs at the European Commission, said the proposal aims to “cover all forms of communication, including private communication”.

The EU Commissioner of Home Affairs, Ylva Johansson, declared the fight against child sexual abuse to be her top priority. The current Slovenian EU Council Presidency has also declared the fight against child abuse to be one of its main priorities and intends to focus on the “digital dimension”.

In May 2021, the EU Commission, the Council and the European Parliament reached a provisional agreement on an exemption to the ePrivacy Directive that would allow web-based email and messaging services to detect, remove, and report child sexual abuse material. Previously, the European Electronic Communications Code (EECC) had extended the legal protection of the ePrivacy Directive to private communications related to electronic messaging services. Unlike the General Data Protection Regulation, the ePrivacy Directive does not contain a legal basis for the voluntary processing of content or traffic data for the purpose of detecting child sexual abuse. For this reason, such an exception was necessary.

Critics see this form of preventive mass surveillance as a threat to privacy, IT security, freedom of expression and democracy. One critic of the agreement stated:

This unprecedented deal means all of our private e-mails and messages will be subjected to privatized real-time mass surveillance using error-prone incrimination machines inflicting devastating collateral damage on users, children and victims alike.

However, the new legislative initiative goes even further. Instead of allowing providers of such services to search for such content on a voluntary basis, all providers would be required to search the services they offer for such content.

How exactly such a law would be implemented from a technical perspective will probably not be clear from the text of the law and is likely to be left up to the providers.

One possibility would be for software to check the hash of an attachment before it is sent and compare it with a database of hashes that have already been identified as belonging to illegal content. Such software is offered by Microsoft, for example, and such a database is operated by the National Center for Missing & Exploited Children in the United States. A hash is a kind of digital fingerprint of a file (a simplified sketch of this approach follows below).

Another possibility would be the monitoring technology known as “client-side scanning” (CSS). This involves scanning messages on the user’s device before they are encrypted. However, this technology has been heavily criticized by numerous IT security researchers and encryption software manufacturers in a joint study. They describe CSS as a threat to privacy, IT security, freedom of expression and democracy, among other things because the technology creates security loopholes and thus opens up gateways for state actors and hackers.
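
To make the hash-matching approach more concrete, the following is a minimal sketch in Python. It is illustrative only: it uses an ordinary SHA-256 digest and a small, hypothetical local set of known hashes, whereas real systems such as Microsoft’s PhotoDNA use perceptual hashes (which also match re-encoded or slightly altered images) and query databases maintained by organisations such as NCMEC.

    import hashlib

    # Hypothetical stand-in for a database of hashes already identified as
    # belonging to illegal content; the value below is a dummy digest.
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def attachment_is_flagged(path: str) -> bool:
        """Hash the file in chunks and check it against the known-hash set."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest() in KNOWN_BAD_HASHES

    # A client could run this check before sending an attachment, e.g.:
    # if attachment_is_flagged("photo.jpg"):
    #     ...block the upload and/or file a report...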

The consequence of this law would be a significant intrusion into the privacy of all EU citizens, as every message would be checked automatically and without any suspicion. The introduction of such a law would also have massive consequences for the providers of encrypted messaging services, as they would have to fundamentally change their software and introduce corresponding control mechanisms, without jeopardizing the security of users against, for example, criminal hackers.

There is another danger that must be considered: The introduction of such legally mandated automated control of systems for one area of application can always lead to a lowering of the inhibition threshold to use such systems for other purposes as well. This is because the same powers that are introduced in the name of combating child abuse could, of course, also be introduced for investigations in other areas.

It remains to be seen when the relevant legislation will be introduced and when and how it will be implemented. Originally, the bill was scheduled to be presented on December 1st, 2021, but this item has since been removed from the Commission’s calendar.

European Commission pursues legal action against Belgium over independence of Data Protection Authority

16. November 2021

In its October Infringements Package, the European Commission stated that it is pursuing legal action against Belgium over concerns that its Data Protection Authority (DPA) is not operating independently, as it should under the General Data Protection Regulation (GDPR).

The Commission stated that it “considers that Belgium violates Article 52 of the GDPR, which states that the data protection supervisory authority shall perform its tasks and exercise its powers independently. The independence of data protection authorities requires that their members are free from any external influence or incompatible occupation.”

According to the European Commission, however, some members of the Belgian DPA cannot be regarded as free from external influence, as they either report to a management committee that depends on the Belgian government, have taken part in governmental projects on COVID-19 contact tracing, or are members of the Information Security Committee.

On June 9th, 2021, the Commission sent a letter of formal notice to Belgium, giving the member state two months to take corrective measures. Belgium’s response to the Commission’s letter did not address the issues raised and the members concerned have so far remained in their posts. The European Commission is now giving Belgium two months to take relevant action. If this fails, the Commission may decide to refer the case to the Court of Justice of the European Union.

US court unsuccessfully demanded extensive information about a user of the messenger app Signal

On October 27th, 2021, Signal published a search warrant for user data issued by a court in Santa Clara, California. The court ordered Signal to provide a variety of information, including a user’s name, address, correspondence, contacts, groups, and call records from the years 2019 and 2020. Signal was only able to provide two sets of data: the timestamp of when the account was created and the date of the last connection to the Signal server, as Signal does not store any other information about its users.

The warrant also included a confidentiality order that was extended four times. Signal stated:

Though the judge approved four consecutive non-disclosure orders, the court never acknowledged receipt of our motion to partially unseal, nor scheduled a hearing, and would not return counsel’s phone calls seeking to schedule a hearing.

A similar case was made public by Signal in 2016, when a court in Virginia requested the release of user data and ordered that the request not be made public. Signal fought the non-publication order in court and eventually won.

Signal is a messenger app that is highly regarded among privacy experts like Edward Snowden. That is because Signal has used end-to-end encryption by default from the start, does not ask its users for personal information or store personal data on its servers, and is open source. The messenger is therefore considered particularly secure and trustworthy. Moreover, no security vulnerabilities have become known so far, which cannot be said of numerous competing products.

Since 2018, Signal has been operated by the non-profit Signal Technology Foundation and Signal Messenger LLC. At that time, WhatsApp co-founder Brian Acton, among others, joined the organization and invested $50 million. Signal founder Moxie Marlinspike is also still on board.

The EU Commission is planning a legislative package to fight the spread of child abuse material on the internet. The law would also include automated searches of the content of private and encrypted communications, for example via messenger apps. This would undermine the core functions of Signal in Europe. Critics call this form of preventive mass surveillance a threat to privacy, IT security, freedom of expression and democracy.

New EU SCC must be used as of now

29. September 2021

In June 2021, the European Commission published the long-awaited new Standard Contractual Clauses (SCC) for transfers of personal data to so-called third countries under the General Data Protection Regulation (GDPR) (please see our blog post). These new SCC modules replace the three 10-year-old SCC sets that were adopted under the EU Data Protection Directive 95/46/EC and thus could meet neither the requirements of the GDPR for data transfers to third countries nor those of the significant Schrems II ruling of July 16th, 2020 (please see our blog post). Data transfers to third countries have been problematic, and a focus of supervisory authorities, for some time now.

As of Monday, September 27th, 2021, these new SCC must be used for new contracts entered into after September 26th, 2021, and for new processing activities that begin after September 26th, if the contract or processing activity involves the transfer of personal data to so-called inadequate third countries. These are countries outside of the European Economic Area (EEA) not deemed to have an adequate level of data protection by an adequacy decision of the European Commission.

Contracts signed before September 27th, 2021, based on the old SCC will still be considered adequate until December 27th, 2022. For these contracts, the old SCCs already signed can be maintained in the meantime as long as the processing of personal data that is the subject of the contract in question does not change. The SCC used for these contracts must be updated to the new SCC, or other data transfer mechanisms in accordance with the GDPR, by December 27th, 2022. As of that date, all SCC used as safeguards for data transfers to inadequate third countries must be the new SCC.
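
As a rough, non-authoritative sketch of the transition timeline described above, the dates could be encoded in Python as follows; the function name and parameters are hypothetical, and such a check is of course no substitute for a legal assessment.

    from datetime import date

    OLD_SCC_LAST_SIGNING_DATE = date(2021, 9, 26)   # old SCCs could not be newly agreed after this date
    OLD_SCC_GRACE_PERIOD_END = date(2022, 12, 27)   # from this date, only the new SCCs are valid

    def old_sccs_still_usable(signed_on: date, today: date, processing_unchanged: bool) -> bool:
        """Rough check whether an existing contract may still rely on the old SCCs."""
        return (
            signed_on <= OLD_SCC_LAST_SIGNING_DATE
            and today < OLD_SCC_GRACE_PERIOD_END
            and processing_unchanged
        )

    print(old_sccs_still_usable(date(2021, 5, 1), date(2022, 6, 1), True))   # True: within the grace period
    print(old_sccs_still_usable(date(2021, 5, 1), date(2023, 1, 1), True))   # False: grace period over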
