Tag: EU Commission

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to introduce privacy considerations into the antitrust and competition law framework. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data acquired through third-party installations (above all, cookies). The proceedings on the matter are still ongoing. Recently, the Irish Data Protection Authority also took a position against Facebook, moving to prevent the American tech giant from transferring user data to the United States over data safety concerns. In this matter, too, the parties are still in dispute.

In 2014 Facebook notoriously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe gave little thought to the potential consequences of this merger. The operation was the subject of an opinion of the European Commission: in the Commission's view, the two companies' privacy policies differed markedly, and the fact that Facebook now had control over all of the data collected by WhatsApp did not sit well with the European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the mistake the European Commission had made in not considering the enormous significance personal data hold for these tech companies: because personal data are considered a form of so-called "non-price competition", they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data collected. This argument was advanced by the U.K.'s competition authority, which stated that, given the enormous importance personal data have gained in the digital market, companies such as Facebook do not face strong competition in their business.

These arguments, and the growing unrest among DPAs around the globe, led in 2020 to the well-known investigation of Facebook by the United States Federal Trade Commission. The FTC accused Meta Inc. (in particular Facebook) of stifling its competition in order to retain its monopoly over the digital market. An American court dismissed the claims, but the high risks connected with such enormous data collection were nonetheless highlighted. In particular, under Section 2 of the Sherman Act, the government must:

  • Prove that a company is in fact a monopoly, and
  • Prove that the monopoly harms consumers

These requirements do not map neatly onto the case, but the FTC argued that the harm to consumers lies in Meta Inc.'s lowered privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the following months, policymakers and enforcers in both the United States and the European Union have struggled to establish new sets of rules to better regulate mergers between companies whose business models rely on the collection of personal data and, above all, have called for more cooperation between privacy and antitrust agencies.

EU Commission publishes Draft Adequacy Decision for South Korea

25. June 2021

On 16 June 2021, the European Commission published the draft adequacy decision for South Korea and transmitted it to the European Data Protection Board (EDPB) for consultation. The Commission thereby launched the formal procedure towards the adoption of the adequacy decision. In 2017, the Commission announced that it would prioritise discussions on possible adequacy decisions with important trading partners in East and South-East Asia, starting with Japan and South Korea. The adequacy decision for Japan was adopted in 2019.

In the past, the Commission diligently reviewed South Korea's law and practices with regard to data protection. In the course of ongoing negotiations with South Korea, the investigative and enforcement powers of the Korean data protection supervisory authority, the "PIPC", were strengthened, among other things. After the EDPB has given its opinion, the adequacy decision will need to be approved by a committee composed of representatives of the EU Member States.

A decision by the Commission that a third country ensures an adequate level of protection pursuant to Art. 45 of the General Data Protection Regulation (GDPR) is one way to transfer personal data from the EU to a third country in a GDPR-compliant manner. The adequacy decision will serve as an important complement to the free trade agreement and strengthen cooperation between the EU and South Korea. Věra Jourová, the Commission's Vice-President for Values and Transparency, commented after the launch of the formal procedure:

“This agreement with the Republic of Korea will improve the protection of personal data for our citizens and support business in dynamic trade relations. It is also a sign of an increasing convergence of data protection legislation around the world. In the digitalised economy, free and safe data flows are not a luxury, but a necessity.”

Especially in light of the Schrems II decision of the Court of Justice of the European Union, the adequacy decision for South Korea will be an invaluable asset for European and South Korean companies conducting business with each other.

EDPB adopts opinion on draft UK adequacy decisions

16. April 2021

In accordance with its obligation under Article 70 (1) (s) of the General Data Protection Regulation (GDPR), on April 13th, 2021, the European Data Protection Board ("EDPB") adopted its opinions on the EU Commission's ("EC") draft UK adequacy decision (please see our blog post). "Opinion 14/2021" is based on the GDPR and assesses both general data protection aspects and public authority access to personal data transferred from the EEA for law enforcement and national security purposes, as addressed in the draft adequacy decision, a topic the EC also discussed in detail. At the same time, the EDPB issued "Opinion 15/2021" on the transfer of personal data under the Law Enforcement Directive (LED).

The EDPB notes that there is a strong alignment between the EU and the UK data protection regimes, especially in the principles relating to the processing of personal data. It expressly praises the fact that the adequacy decision is to apply for a limited period, as the EDPB also sees the danger that the UK could change its data protection laws. Andrea Jelinek, EDPB Chair, is quoted as saying:

“The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission’s decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

But the EDPB also highlights areas of concern that need to be further monitored by the EC:

1. The immigration exemption, which restricts the rights of those data subjects affected.

2. How the transfer of personal data from the EEA to the UK could undermine EU data protection rules, for example on the basis of future UK adequacy decisions.

3. Access to personal data by public authorities is given a lot of space in the opinion. For example, the Opinion analyses in detail the Investigatory Powers Act 2016 and related case law. The EDPB welcomes the numerous oversight and redress mechanisms in the UK but identifies a number of issues that need “further clarification and/or oversight”, namely bulk searches, independent assessment and oversight of the use of automated processing tools, and the safeguards provided under UK law when it comes to disclosure abroad, particularly with regard to the application of national security exemptions.

In summary, this EDPB opinion does not put any obstacles in the way of an adequacy decision and recognises that there are many areas where the UK and EU regimes converge. Nevertheless, it highlights very clearly that there are deficiencies, particularly in the UK’s system for monitoring national security, which need to be reviewed and kept under observation.

As for the next steps, the draft UK adequacy decisions will now be assessed by representatives of the EU Member States under the "comitology procedure". The Commission can then adopt the draft UK adequacy decisions. A bridging period, during which free data transfer to the UK is permitted even without an adequacy decision, ends in June 2021 (please see our blog post).

European Commission proposes draft "Digital Services Act" and "Digital Markets Act"

21. December 2020

On December 15th, 2020, the European Commission published drafts of the "Digital Services Act" ("DSA") and the "Digital Markets Act" ("DMA"), which are intended to rein in large online platforms and stimulate competition.

The DSA is intended to rework the 20-year-old e-Commerce Directive and introduce a paradigm shift in accountability. Under the DSA, platforms would have to prove that they acted in a timely manner in removing or blocking access to illegal content, or that they had no actual knowledge of such content. Violators would face fines of up to 6% of annual revenue. Authorities could order providers to take action against specific illegal content, after which they must provide immediate feedback on what action was taken and when. Providing false, incomplete or misleading information as part of the reporting requirement, or failing to cooperate with an on-site inspection, could result in fines of up to 1% of annual revenue. The scope of said illegal content is to include, for example, criminal hate speech, discriminatory content, depictions of child sexual abuse, non-consensual sharing of private images, unauthorized use of copyrighted works, and terrorist content. Hosting providers would be required to establish efficient notice-and-action mechanisms that allow individuals to report and take action against posts they deem illegal. Platforms would not only be required to remove illegal content, but also to explain to users why the content was blocked and give them the opportunity to complain.

Any advertising on ad-supported platforms would be required to be clearly identifiable as advertising and clearly state who sponsored it. Exceptions are to apply to smaller journalistic portals and bloggers, while even stricter rules would apply to large platforms. For example, platforms with more than 45 million active users in the EU could be forced to grant comprehensive access to stored data, provided that trade secrets are not affected, and to set up archives that make it possible to identify disinformation and illegal advertising.

Social network operators would have to conduct annual risk assessments and review how they deal with systemic threats, such as the spread of illegal content. They would also be required to provide clear, easy-to-understand and detailed reports at least once a year on the content moderation they have carried out during that period.

Newly appointed "Digital Services Coordinators" in each EU Member State are supposed to enforce the regulation, for example by ordering platforms to share key data with researchers investigating the platforms' relevant activities, while a new European committee is to ensure that the DSA is applied uniformly across the EU.

The DMA includes a list of competition requirements for large platforms with a monopoly-like status, so-called "gatekeepers". The regulations aim to strengthen smaller competitors and prevent the large gatekeepers from using their dominance to impose practices perceived as unfair. Gatekeepers would neither be allowed to exclusively pre-install their own applications, nor to force other operating system developers or hardware manufacturers to pre-install programs exclusively from the gatekeeper's company. In addition, preventing users from uninstalling included applications would be prohibited, as would other common measures of self-preference. For example, gatekeepers would no longer be allowed to use data generated by their services for their own commercial activities without also making the information available to other commercial users. If a provider wanted to merge data generated by different portals, it would have to obtain explicit consent from users to do so.

The publication of the DSA and the DMA is the next step in the European Commission’s 2020 European strategy for data, following the proposal of the Data Governance Act in November. Like the Data Governance Act, the DSA and DMA aim to push back the dominance of tech giants, particularly those from the U.S. and China, while promoting competition.

EU Commission proposes “Data Governance Act”

27. November 2020

The European Commission ("EC") aims for an ecosystem of cheap, versatile, and secure EU-internal data transfers, so that data transfers to non-EU regions are less needed. To this end, the EC proposed the "Data Governance Act" on November 25th as part of its "2020 European strategy for data". The Act is intended to open up new ways of sharing data that is collected by companies and the public sector, or freely shared by individuals, while increasing public trust in data sharing through several measures, such as establishing "data sharing intermediaries". Combined with the Gaia-X project and several measures to follow, the Data Governance Act lays the basis for a domestic data market that offers businesses more efficient data transfers while also ensuring that GDPR standards are preserved. Key industries in the focus of this agenda are the agricultural, environmental, energy, finance, healthcare and mobility sectors as well as public administration.

During her speech presenting the Data Governance Act, Margrethe Vestager, Executive Vice-President of the European Commission for A Europe Fit for the Digital Age, said that huge amounts of data are produced every day but not put to any productive use. As examples she named road traffic data from GPS, healthcare data that enables better and faster diagnoses, and data tracking heat usage from house sensors. The amount of data produced is only going to increase exponentially in the years to come. Vestager sees great potential in this unused data and states that the industry has an interest in using it but lacks the tools to harness it.

EU-based neutral data sharing intermediaries, serving as safe organizers of data sharing, are a key factor in this project. Their role is supposed to boost the willingness to share personal data whilst preserving the initial owner's control. Intermediaries are therefore not allowed to use the data for themselves, but function as neutral third parties, transferring data between the data holder and the data user. Furthermore, intermediaries are to organize and combine different data in a neutral way, so that no company secrets can be abused and the data is used only for the agreed purpose. Before they start operating, intermediaries are required to notify the competent authority of their intention to provide data-sharing services.

New laws are to ensure that sensitive and confidential data – such as intellectual property rights – can be shared and reused while a legitimate level of protection is maintained. The same applies to data shared by individuals voluntarily. Individuals will be able to share personal data voluntarily in so-called "personal data spaces". Once businesses get access to these, they can draw on large amounts of data at low cost, with little effort and on short notice. Vestager gave the example of an individual suffering from a rare illness who could contribute data from their medical tests to such a personal data space, so that businesses can use this data to work on treatments. Further examples are improvements in the management of climate change and the development of more precise farming tools.

To ensure the trust of potential participants, each EU Member State is to designate new competent authorities tasked with implementing and enforcing the Data Governance Act. A new EU institution, the "European Data Innovation Board", will be established and tasked with informing the EC about new data innovations and drawing up guidelines on how to put these innovations into practice.

A more fluid exchange between different kinds of technical expertise is the hoped-for outcome of these changes, and a means to diminish the influence of big tech companies from the U.S. and China.

The Data Governance Act now needs to go through the regular legislative process. A timetable for when it is supposed to come into effect has not yet been set.

EDPS publishes opinion on future EU-UK partnership

3. March 2020

On 24 February 2020, the European Data Protection Supervisor (EDPS) published an opinion on the opening of negotiations for the future partnership between the EU and the UK with regard to personal data protection.

In his opinion, the EDPS points out the importance of commitments to fully respect fundamental rights in the envisaged future comprehensive partnership. Especially with regard to the protection of personal data, the partnership should uphold the high level of protection of the EU's personal data rules.

With respect to the transfer of personal data, the EDPS further expresses support for the EU Commission's recommendation to work towards the adoption of adequacy decisions for the UK if the relevant conditions are met. However, the Commission must ensure that the UK does not lower its data protection standards below the EU standard after the Brexit transition period. Lastly, the EDPS recommends that the EU institutions also prepare for a potential scenario in which no adequacy decisions exist by the end of the transition period on 31 December 2020.

EU Commission: Using Personal Data In Political Campaigns

29. August 2018

Following the Facebook-Cambridge Analytica case, the EU Commission intends to prohibit the misuse of collected voter data to influence elections. As the Irish Times reports, the EU Commission is drafting an amendment to existing party funding rules that would prohibit parties from profiting from data collection of the kind alleged against Cambridge Analytica.

Cambridge Analytica has been accused of obtaining information on millions of Facebook users without the data subjects' consent by using a personality-analysis app during Donald Trump's presidential campaign.

It is expected that sanctions will amount to approximately 5 percent of a political party's annual budget. An official said it is "meant to ensure that something like Cambridge Analytica can never happen in the EU".

In view of the upcoming election of the European Parliament in May 2019, the EU Commission is to recommend or impose various measures to be followed by the member states in order to prevent the misuse of voters' personal data and the online manipulation of voters. While it intends to recommend that governments monitor and clamp down on groups sending personalized political messages to social media users without their consent, the member states are also to tighten transparency requirements for political advertising at national level by amending national law.

Last month, Věra Jourová, EU justice commissioner, said: "voters and citizens should always understand – when something is an online campaign – who runs the campaign, who pays for it and what they want to achieve."

However, she also made clear that the EU will respect free expression and that the EU is not going to regulate online activities of political parties. “The internet is a zone for free expression. Everybody can be a journalist or an influencer, and these are the things that we don’t want to touch”, she stated.