Tag: European Union
7. July 2022
Over the past decades, AI has developed impressively across various fields. At the same time, with each step forward, the new machines and the new processes they are programmed to perform need to collect far more data than before in order to function properly.
One of the first questions that comes to mind is how the rise of AI can be reconciled with the principle of data minimization as contained in Art. 5 para. 1 lit. c) GDPR. At first glance it seems contradictory that there may be a way: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the wide scope conceded by the norm (referring to the exceptions contained in the article), this issue could be addressed by measures like pseudonymization. This means that the data collected by the AI is stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
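To make the idea concrete, here is a minimal sketch of what pseudonymization can look like in practice: direct identifiers in a record are replaced with keyed hashes, while the key (the "additional information" needed for re-identification) is stored separately. The function and field names are illustrative assumptions, not part of any specific system described above.

```python
import hmac
import hashlib

def pseudonymize(record, secret_key, id_fields=("name", "email")):
    """Return a copy of the record with direct identifiers replaced by
    keyed hashes. Re-identification would require the secret key, which
    is kept separately from the pseudonymized data set."""
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(secret_key, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

# Hypothetical example: the key would live in a separate, secured store.
key = b"kept-in-a-separate-secure-store"
record = {"name": "Jane Doe", "email": "jane@example.org", "age": 34}
pseudo = pseudonymize(record, key)
# Analytical attributes like "age" remain usable, while "name" and
# "email" no longer identify the individual without the key.
```

Because the same input always produces the same hash under a given key, records belonging to one individual can still be linked for analysis without revealing who that individual is.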
The main issue with the European Union's current legal framework for personal data protection is that certain parts have been left vague, which also causes uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act (“AIA”), aiming to create a common and more “approachable” legal framework.
One of the main features of this Act is that it divides applications of artificial intelligence into three main risk categories:
- Unacceptable risk: such AI systems are prohibited (e.g. systems that violate fundamental rights).
- High risk: subject to specific regulation.
- Low or minimal risk: no further regulation.
Regarding high-risk AI systems, the AIA foresees the creation of post-market monitoring obligations. If a system violates any part of the AIA, the regulator can force its withdrawal from the market.
This approach has been welcomed in the Joint Opinion of the EDPB and the EDPS, although the two bodies stated that the draft still needs to be aligned more closely with the GDPR.
Although the Commission’s draft contains a precise description of the first two categories, these will likely change over the coming years as the proposal undergoes the EU’s legislative process.
The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Some amendments have already been formulated, and the draft is still under review by the Parliament. Once the Act has passed this scrutiny, it will be subject to a two-year implementation period.
Finally, a question remains to be answered: who shall oversee and control the Act’s implementation? It is foreseen that a national supervisory authority shall be established in each EU member state. Furthermore, the AIA aims at establishing a special European AI Board made up of representatives of both the member states and the European Commission, which will also chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.
30. December 2021
On December 17th, 2021, the European Commission (Commission) announced in a statement it had adopted an adequacy decision for the transfer of personal data from the European Union (EU) to the Republic of Korea (South Korea) under the General Data Protection Regulation (GDPR).
An adequacy decision is one of the instruments available under the GDPR for transferring personal data from the EU to third countries that ensure a level of protection for personal data comparable to that of the EU. It is a Commission decision under which personal data can flow freely and securely from the EU to the third country in question without any further conditions or authorizations being required. In other words, transfers of data to that third country can be handled in the same way as transfers of data within the EU.
This adequacy decision allows for the free flow of personal data between the EU and South Korea without the need for any further authorization or transfer instrument, and it also applies to the transfer of personal data between public sector bodies. It complements the Free Trade Agreement (FTA) between the EU and South Korea, which entered into force in July 2011. The trade agreement has led to a significant increase in bilateral trade in goods and services and, inevitably, in the exchange of personal data.
Unlike the adequacy decision regarding the United Kingdom, this adequacy decision is not time-limited.
The Commission’s statement reads:
The adequacy decision will complement the EU – Republic of Korea Free Trade Agreement with respect to personal data flows. As such, it shows that, in the digital era, promoting high privacy and personal data protection standards and facilitating international trade can go hand in hand.
In South Korea, the processing of personal data is governed by the Personal Information Protection Act (PIPA), which provides principles, safeguards, individual rights and obligations similar to those under EU law.
An important step in the adequacy talks was the reform of PIPA, which took effect in August 2020 and strengthened the investigative and enforcement powers of the Personal Information Protection Commission (PIPC), the independent data protection authority of South Korea. As part of the adequacy talks, both sides also agreed on several additional safeguards that will improve the protection of personal data processed in South Korea, such as in the areas of transparency and onward transfers.
These safeguards provide stronger protections: for example, South Korean data importers will be required to inform Europeans about the processing of their data, and onward transfers to third countries must ensure that the data continues to enjoy the same level of protection. These rules are binding and can be enforced by the PIPC and South Korean courts.
The Commission has also published a Q&A on the adequacy decision.
22. January 2021
As early as the beginning of December 2020, first indications emerged that WhatsApp would change its terms of service and privacy policy. Earlier this year, users received the update notice when launching the app on their device. It stated that the new terms concern additional information on how WhatsApp processes user data and how businesses can use Facebook-hosted services to store and manage their WhatsApp chats. Users had to accept the terms by February 8th, 2021, to continue using the chat service; otherwise, deletion of the account was suggested, because it would not be possible to use WhatsApp without accepting the changes. The notice caused all sorts of confusion and criticism, because it mistakenly led many users to believe that the agreement allows WhatsApp to share all collected user data with its parent company Facebook, which had faced repeated privacy controversies in the past.
Users’ fears in this regard are not entirely unfounded. As a matter of fact, outside the EU, WhatsApp user data has already been flowing to Facebook since 2016 – for advertising purposes, among other things. However, different rules apply to the EU and the United Kingdom, where no such data transfer takes place.
The negative coverage and user reactions caused WhatsApp to hastily note that the changes explicitly do not affect EU users. Niamh Sweeney, director of policy at WhatsApp, said via Twitter that it remained the case that WhatsApp did not share European user data with Facebook for the purpose of using this data to improve Facebook’s products or ads.
However, since the topic continued to stir emotions, WhatsApp felt compelled to provide clarification in a tweet and a FAQ. The statements make it clear once again that the changes relate to optional business features and provide further transparency about how the company collects and uses data. The end-to-end encryption, under which chat content is only visible to the participating users, will not be changed. Moreover, the new update does not expand WhatsApp’s ability to share data with Facebook.
Nevertheless, despite all efforts, WhatsApp has not managed to explain the changes in an understandable way, and it has even had to accept huge user churn in recent days; interest in messenger alternatives has increased enormously. Eventually, the public backlash led to an official announcement that the controversial update will be delayed until May 15th, 2021. In light of the misinformation and concern, users shall be given more time to review the policy on their own in order to understand WhatsApp’s privacy and security principles.
21. December 2020
On December 15th, the European Commission published drafts of the “Digital Services Act” (“DSA”) and the “Digital Markets Act” (“DMA”), which are intended to rein in large online platforms and stimulate competition.
The DSA is intended to rework the 20-year-old e-Commerce Directive and introduce a paradigm shift in accountability. Under the DSA, platforms would have to prove that they acted in a timely manner in removing or blocking access to illegal content, or that they had no actual knowledge of such content. Violators would face fines of up to 6% of annual revenue. Authorities could order providers to take action against specific illegal content, after which they must provide immediate feedback on what action was taken and when. Providing false, incomplete or misleading information as part of the reporting requirement, or failing to allow an on-site inspection, could result in fines of up to 1% of annual revenue. The scope of said illegal content is to include, for example, criminal hate comments, discriminatory content, depictions of child sexual abuse, non-consensual sharing of private images, unauthorized use of copyrighted works, and terrorist content. Hosting providers would be required to establish efficient notice-and-action mechanisms that allow individuals to report and take action against posts they deem illegal. Platforms would not only be required to remove illegal content, but also to explain to users why the content was blocked and give them the opportunity to complain.
Any advertising on ad-supported platforms would be required to be clearly identifiable as advertising and clearly state who sponsored it. Exceptions are to apply to smaller journalistic portals and bloggers, while even stricter rules would apply to large platforms. For example, platforms with more than 45 million active users in the EU could be forced to grant comprehensive access to stored data, provided that trade secrets are not affected, and to set up archives that make it possible to identify disinformation and illegal advertising.
Social network operators would have to conduct annual risk assessments and review how they deal with systemic threats, such as the spread of illegal content. They would also be required to provide clear, easy-to-understand and detailed reports at least once a year on the content moderation they have carried out during that period.
Newly appointed “Digital Service Coordinators” in each EU member state are supposed to enforce the regulation; on their demand, platforms would have to provide researchers with key data so that they can investigate the platforms’ relevant activities. A new European committee is to ensure that the DSA is applied uniformly across the EU.
The DMA includes a list of competition requirements for large platforms, so-called “gatekeepers”, that have a monopoly-like status. The regulations aim to strengthen smaller competitors and prevent the large gatekeepers from using their dominance to impose practices perceived as unfair. They would neither be allowed to exclusively pre-install their own applications, nor to force other operating system developers or hardware manufacturers to have programs pre-installed exclusively by the gatekeeper’s company. In addition, preventing users from uninstalling included applications would be prohibited, as would other common measures of self-preference. For example, gatekeepers would no longer be allowed to use data generated by their services for their own commercial activities without also making the information available to other commercial users. If a provider wanted to merge data generated by different portals, it would have to obtain explicit consent from users to do so.
The publication of the DSA and the DMA is the next step in the European Commission’s 2020 European strategy for data, following the proposal of the Data Governance Act in November. Like the Data Governance Act, the DSA and DMA aim to push back the dominance of tech giants, particularly those from the U.S. and China, while promoting competition.
24. September 2020
In an interview with The Financial Times on Sunday, EU-Commissioner Thierry Breton stated that the European Union is considering plans to increase its enforcement powers regarding tech giants.
This empowerment is supposed to include punitive measures such as forcing tech firms to break up and sell their EU operations if their market dominance becomes too large. It is further being considered to enable the EU to bar tech companies from the EU single market entirely. Breton stated these measures would of course only be used in extreme circumstances, but did not elaborate on what would qualify as extreme.
“There is a feeling from end-users of these platforms that they are too big to care,” Thierry Breton told The Financial Times. In the interview, he compared tech giants’ market power to the big banks before the financial crisis. “We need better supervision for these big platforms, as we had again in the banking system,” he stated.
In addition, the European Union is considering a rating system in which companies would be given scores in different categories such as tax compliance, taking action against illegal content, etc. However, Breton said that the intent is not to make companies liable for their users’ content.
Breton further said that the first drafts of the new law will be ready by the end of the year.
Once the final draft is in place, it will require approval by both the European Parliament and the Council of the European Union before it can be enacted.