Personal data risks in the aftermath of the overturning of Roe v. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe v. Wade, thereby ending federal constitutional protection of abortion rights. The decision caused worldwide outrage, but a further concerning situation now presents itself: the population's massive use of social media and the Internet could result in serious privacy violations by the authorities. For example, tech giants such as Apple, Google and Meta Inc. could share users' data if law enforcement authorities suspect a felony is being committed. This could especially be the case in those States that chose to make abortion illegal after the Supreme Court's ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, such personal data can be made the object of a subpoena, compelling its production in court. In such a scenario, tech companies would have no choice but to provide the consumer's data. This clearly poses a high risk to consumers' privacy.

In particular, location data could show whether a person visited an abortion clinic. Many women use dedicated apps to track their periods, fertility and pregnancies. All of this data could be put under surveillance and seized by law enforcement in order to investigate and prosecute abortion-related cases.

In some States this has already happened. In 2018, a woman in Mississippi was charged with second-degree murder after seeking health care for a pregnancy loss that occurred at home. Prosecutors produced her Internet browser history as evidence. Two years later she was acquitted of the charges.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse or analyze it, and sell it to the highest bidder. These companies could also be used by law enforcement agencies to arbitrarily investigate people who may be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States. For example, there is no principle of data minimization comparable to the one found in the GDPR. The Supreme Court's ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. In addition, new laws and bills could be introduced to limit the access law enforcement agencies have to personal data.

noyb lodges 226 complaints with 18 different supervisory authorities against websites using “OneTrust” cookie banner software

22. August 2022

On August 9, 2022, the Austrian NGO noyb announced on its website that it had lodged over 200 complaints with 18 supervisory authorities against several websites that use the cookie banner software "OneTrust".

noyb claims that those banners are designed in a way that nudges the user into clicking the accept button.

According to noyb’s legal analysis, websites that use these cookie banners are neither in conformity with the ePrivacy Directive nor with the GDPR. Further noyb argues: “Deceptive cookie banner designs try to force a user’s agreement by making it insanely burdensome to decline cookies. The GDPR actually requires a fair yes/no choice, not crazy click-marathons.”

It is important to highlight that the complaints were only lodged against companies that host these websites, use possibly unlawful cookie banners, and did not respond to noyb's emails. Interestingly, even companies that had not been contacted by noyb have in the meantime updated their cookie banners in accordance with a guidance document provided by noyb.

In response to noyb’s multiple complaints in relation to cookie banners, the EDPB decided to establish a task force in September 2021.


European Data Protection Board adopts a dispute resolution decision in the context of Instagram

17. August 2022

In early December 2021, the Irish Data Protection Commission (DPC), in its capacity as lead supervisory authority responsible for overseeing Instagram (Meta), sent a draft decision to other European supervisory authorities in line with Art. 60(3) GDPR. In this draft decision, the DPC expressed its concerns regarding Instagram's compliance with several GDPR provisions, notably Art. 5(1)(a) and (c), 6(1), 12(1), 13, 24, 25 and 35 GDPR.

The lead supervisory authority specifically raised the issue of the public disclosure of children's personal data, such as e-mail addresses and phone numbers, resulting from their use of Instagram's business account feature.

Other supervisory authorities, however, did not fully agree with the draft decision and issued objections in accordance with Art. 60(4) GDPR. As no common ground could be found on some of the objections, Art. 65(1)(a) GDPR, which lays down the dispute resolution procedure, became applicable. Consequently, the DPC, as lead supervisory authority, was required to ask the European Data Protection Board (EDPB) to adopt a binding decision.

On July 29, 2022, the EDPB announced that it had adopted a dispute resolution decision following these objections. It is now up to the DPC to adopt its final decision and communicate it to the controller. The DPC has one month to issue its final decision, which must be based on the EDPB's decision.

EDPB and EDPS criticise the Commission's Proposal to combat child sexual abuse online

15. August 2022

In May 2022, the European Commission published its proposal on combating child sexual abuse material. The Commission justified the need for this proposal with the alleged insufficiency of voluntary detection carried out by companies. Recently, the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) have issued a joint statement criticizing the proposal on privacy grounds.

According to the proposal, hosting and communication services would be obliged to identify, remove and report child sexual abuse material online. This, in turn, requires that encrypted messages can be screened; in other words, the actual text of messages would have to be read in order to detect grooming.

In their joint criticism, the EDPB and EDPS highlight that such an AI-based system would most likely result in errors and false positives.

The European Data Protection Supervisor, Wojciech Wiewiórowski, said: "Measures allowing public authorities to have access to the content of communications, on a generalised basis, affect the essence of the right to private life. Even if the technology used is limited to the use of indicators, the negative impact of monitoring the text and audio communications of individuals on a generalised basis is so severe that it cannot be justified under the EU Charter of Fundamental Rights. The proposed measures related to the detection of solicitation of children in interpersonal communication services are extremely concerning."

Individual brings an action against the European Commission before the Court of Justice of the European Union

27. July 2022

A German citizen brought an action against the European Commission (the Commission) before the Court of Justice of the European Union, claiming that the Commission is involved in illegal international data transfers to the US.

The subject matter of the action, which the Court recently admitted, relates to data processing carried out in the context of the web page "future.europa.eu", a platform that intends to increase citizens' engagement with the EU.

In his complaint, drafted by EuGD, a German data protection organization, he alleges, amongst other things, that upon accessing said website and enabling a Facebook login, personal data, such as users' IP addresses, are transferred to US cloud and web hosting providers. According to the organization's press release, the allegations of illegal transfers are also grounded in the Schrems II judgment.

It should be noted that the processing of personal data by EU institutions does not fall under the scope of the GDPR; it is instead governed by Regulation (EU) 2018/1725 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data.

Even though the GDPR does not apply to the Commission, Regulation 2018/1725 does mention the GDPR in the context of international data transfers to third countries (e.g. recital 65), and it is not far-fetched to hold the view that the ruling in Schrems II will indeed extend to this regulation.

One should also remember Recital 5 of Regulation 2018/1725 that reads the following:

Whenever the provisions of this Regulation follow the same principles as the provisions of Regulation (EU) 2016/679, those two sets of provisions should, under the case law of the Court of Justice of the European Union (the ‘Court of Justice’), be interpreted homogeneously, in particular because the scheme of this Regulation should be understood as equivalent to the scheme of Regulation (EU) 2016/679.

The claimant also alleges that the Commission did not duly respond to his access request, in which he asked for information on the data processed and on the safeguards in place. He specifically alleges that one request was not answered properly and that the other was initially left unanswered.

The action, which questions the legality of European web pages that use US web hosts and enable Facebook log-ins, comes at an interesting moment in time. Not long ago, the compatibility of Facebook/Meta data transfers with the GDPR was challenged by the DPC when it moved to halt EU-US transfers for Meta products for failing to comply with the GDPR.

The founder of the organization that is assisting the legal action told EURACTIV “that if a restaurant or a bakery has to figure out a way to comply with the ban on data transfers to the United States, so does the European Commission, as there cannot be double standards.”

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to introduce privacy considerations into the antitrust and competition law framework. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data that had been acquired through third-party sources (most notably cookies). The proceedings on the matter are still ongoing. Recently, the Irish Data Protection Authority also took a position against Facebook by moving to prevent the American tech giant from transferring user data to the United States over data protection concerns. This matter, too, is still being debated by the parties.

In 2014, Facebook notoriously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe did not give much thought to the potential consequences of this merger. The operation was the object of an opinion of the European Commission: in the Commission's view, the two companies' privacy policies differed considerably, and the prospect of Facebook controlling all of the data collected by WhatsApp did not sit well with the European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the mistake the European Commission had made in not considering the enormous significance personal data holds for these tech companies: because personal data is considered a form of so-called "non-price competition", it plays a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data collected. This argument was also brought forward by the UK's competition authority, which stated that, given the enormous importance personal data has gained in the digital market, companies such as Facebook face little effective competition in their business.

These arguments, and the growing unrest among various DPAs around the globe, led in 2020 to the well-known investigation of Facebook by the United States Federal Trade Commission (FTC). In particular, the FTC accused Meta Inc. (specifically Facebook) of stifling its competition in order to retain its monopoly over the digital market. A US court dismissed the claims, but the high risks connected with large-scale data collection were nonetheless highlighted. Under Section 2 of the Sherman Act, the government has to prove:

  • that a company does in fact hold a monopoly, and
  • that this monopoly harms consumers.

This does not apply directly to the case, but the FTC argued that the harm to consumers is to be seen in Meta Inc.'s lowered privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the following months, policymakers and enforcers in both the United States and the European Union have struggled to establish new sets of rules to better regulate mergers between companies whose business models rely on the collection of personal data, and above all they have called for more cooperation between privacy and antitrust agencies.

The Commission’s Proposal for the European Health Data Space raises data protection concerns

On May 3, 2022, the European Commission (EC) published its proposal for the creation of the European Health Data Space (EHDS). If adopted, the proposal would create an EU-wide infrastructure that allows health data sets to be linked for practitioners, researchers, and industry. In its communication, the EC points to the need to promote "the free, cross-border flows of personal data" with the aim of creating an "internal market for personal health data and digital health products and services".

Doctors in Germany, for example, would then be able to access the medical file of a Spanish patient currently undergoing medical treatment in Germany. In this context, it is worth noting that not all Member States maintain electronic patient records, so the proposal would require some of them to take steps towards digitalization. With regard to researchers and industry, the underlying incentive of the proposal is to enable them to draw on available health data to develop new solutions and drive innovation.

Nevertheless, health data is sensitive data within the meaning of the GDPR, which means that access to it is possible only in exceptional cases. This raises the question of whether, and how, the access to personal health data that the proposal intends to enable can be reconciled with the GDPR. Recently, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on the legislative initiative, expressing several data protection concerns.

Take the example of health data processed while accessing healthcare: the legal ground of Art. 9(2)(h) GDPR, namely medical diagnosis or the provision of health care, would be applicable. Further processing for any other purpose, however, would then require the data subject's consent.

In the words of EDPB Chair Andrea Jelinek: “The EU Health Data Space will involve the processing of large quantities of data which are of a highly sensitive nature. Therefore, it is of the utmost importance that the rights of the European Economic Area’s (EEA) individuals are by no means undermined by this Proposal. The description of the rights in the Proposal is not consistent with the GDPR and there is a substantial risk of legal uncertainty for individuals who may not be able to distinguish between the two types of rights. We strongly urge the Commission to clarify the interplay of the different rights between the Proposal and the GDPR.”

Diving into the details of the joint opinion, the EDPB and EDPS strongly recommend that any secondary use of personal data stemming from wellness applications, such as wellness and behavioral data, be made subject to the prior consent of the data subject, in case such data, contrary to the EDPB and EDPS' recommendation, is not excluded from the scope of the proposal altogether.

That would not only be in line with the GDPR, but would also make it possible to differentiate between health data generated by wellness applications, on the one hand, and health data generated by medical devices, on the other.

The fundamental difference between the two lies in their degree of quality and in the fact that wellness applications also process, for instance, dietary habits, which allows conclusions to be drawn about data subjects' daily activities, habits, and practices.

DPC sends draft decision on Meta’s EU-US data transfers to other European DPAs

14. July 2022

On July 7, 2022, it became known that the Irish Data Protection Commission (DPC) had forwarded a draft decision concerning Meta's EU-US data transfers to other European DPAs for consultation. The European DPAs have four weeks to comment on the draft or formulate objections to it. In the latter event, the DPC would be given an additional month to respond to the objections raised (Art. 60 GDPR).

According to information available to Politico, the DPC intends to halt Meta's EU-US transfers. The DPC is said to have concluded in its own-volition draft decision that Meta can no longer rely on the standard contractual clauses (SCCs) when it transfers its users' personal data to US-based servers. In other words, even though Meta has implemented the EU's SCCs, it cannot be ruled out that US intelligence services may gain access to the personal data of data subjects using Facebook, Instagram and other Meta products.

Following the striking down of both the Safe Harbour Agreement in 2015 and the EU-US Privacy Shield in 2020 by the Court of Justice of the European Union, this draft decision seems to question the legality of EU-US data transfers, and their compatibility with the GDPR, for a third time.

In this context, it is worth considering a statement Meta made in its annual report to the United States Securities and Exchange Commission (SEC):

“If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe, which would materially and adversely affect our business, financial condition, and results of operations.”

Despite the possibility of a halt to Meta's EU-US data transfers, there is reason to believe that this DPC-initiated procedure will continue and extend beyond the aforementioned four-week timeline. "We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC's draft. This will lead to another draft and then a vote", says noyb's Max Schrems, who filed the original complaint with the DPC. Hence, it seems rather unlikely that an immediate stop of Meta's EU-US transfers will occur. Instead, we can rather expect Art. 65 GDPR to be triggered, meaning that the EDPB would be required to issue a final decision, including a vote, on the matter.

With no concrete EU-US transfer agreement in sight, and ongoing uncertainty as to whether the DPC will eventually succeed with its draft decision, this matter continues to be of great interest.

European Parliament adopts Digital Services Act and Digital Markets Act

7. July 2022

On July 5, 2022, the EU Parliament voted in favor of the long-awaited Digital Markets Act (DMA) and Digital Services Act (DSA) following trilogue talks and agreements held between Parliament, Council, and European Commission earlier this year.

While the DSA, which amends the e-Commerce Directive, strictly prohibits specific forms of targeted advertising and misleading practices, the DMA can be viewed as the competition-law component of the Commission's Digital Services Package, setting out stricter obligations for large online platforms.

Upon entry into force, advertisements targeting children, advertisements based on sensitive data, and dark patterns will no longer be permitted. Further, online platforms must give their users the choice not to receive recommendations based on profiling. The DSA also seeks to strengthen platforms' accountability and transparency: platforms will have to provide authorities and vetted researchers with access to information on their content moderation rules as well as on the algorithms used by their recommender systems.

The spread of illegal content, such as hate speech, is also addressed by this legislation, which obliges large platforms to respond quickly while paying due regard to the other fundamental rights implicated.

Online platforms and other service providers that do not respect the new obligations may be fined up to 10% of their total annual turnover for violations of the DMA and up to 6% for violations of the DSA.
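To put these ceilings into perspective, the following minimal sketch computes the maximum fines for a hypothetical platform; the turnover figure is invented for the example, and only the two percentage caps come from the legislation.

```python
# Fine ceilings under the Digital Services Package (as described above).
DMA_CAP = 0.10  # up to 10% of total annual turnover for DMA violations
DSA_CAP = 0.06  # up to 6% of total annual turnover for DSA violations

# Hypothetical annual turnover of a large online platform, in euros.
annual_turnover_eur = 5_000_000_000

max_dma_fine = annual_turnover_eur * DMA_CAP  # EUR 500,000,000
max_dsa_fine = annual_turnover_eur * DSA_CAP  # EUR 300,000,000

print(f"Maximum DMA fine: EUR {max_dma_fine:,.0f}")
print(f"Maximum DSA fine: EUR {max_dsa_fine:,.0f}")
```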

Artificial Intelligence and Personal Data: a hard co-existence. A new perspective for the EU

Over the last decades, AI has developed impressively across various fields. At the same time, with each step forward, the new systems and the processes they are programmed to perform need to collect far more data than before in order to function properly.

One of the first questions that comes to mind is: how can the rise of AI be reconciled with the principle of data minimization contained in Art. 5(1)(c) GDPR? At first glance a reconciliation seems contradictory: after all, the GDPR clearly states that the amount of personal data collected should be as small as possible. A study carried out by the Panel for the Future of Science and Technology of the European Union suggests that, given the leeway the provision concedes (notably through its exceptions), the issue could be addressed by measures such as pseudonymization. This means that the data collected by the AI is stripped of any information that could link it to a specific individual without additional information, thus lowering the risks for individuals.
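The idea of pseudonymization can be illustrated with a minimal sketch: direct identifiers are replaced with keyed hashes, so records can still be joined for analysis, while re-identification requires a secret kept under separate, strict access control. The field names and the record are invented for the example; this is one common pseudonymization technique, not the one mandated by the GDPR.

```python
import hashlib
import hmac

# Hypothetical secret kept separately from the data set. Whoever holds it
# can reproduce the tokens (and thus re-identify records), so it must be
# stored under strict access control.
SECRET_KEY = b"stored-separately-under-access-control"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Without the key, the token cannot be linked back to the person,
    but the same input always yields the same token, so records can
    still be joined across data sets for analysis or model training.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# A hypothetical raw record mixing identifiers and useful measurements.
record = {"name": "Jane Doe", "email": "jane@example.com", "heart_rate": 72}

# Drop the direct identifiers, keep only a pseudonym and the payload.
pseudonymized = {
    "subject_id": pseudonymize(record["email"]),
    "heart_rate": record["heart_rate"],
}
```

Because the same email always maps to the same `subject_id`, repeated measurements for one person remain linkable, which is exactly what makes pseudonymized (rather than anonymized) data still personal data under the GDPR.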

The main issue with the European Union's current legal framework on personal data protection is that certain parts have been left vague, which also creates uncertainty in the regulation of artificial intelligence. To address this problem, the EU has put forward a proposal for a new Artificial Intelligence Act ("AIA"), aiming to create a common and more "approachable" legal framework.

One of the main features of the Act is that it divides applications of artificial intelligence into three main risk categories:

  1. Unacceptable risk: prohibited AI systems (e.g. systems that violate fundamental rights).
  2. High risk: subject to specific regulation.
  3. Low or minimal risk: no further regulation.
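The tiered logic above can be sketched as a simple decision function; the boolean attributes are hypothetical simplifications for illustration, and only the three tiers themselves come from the proposal.

```python
def classify_ai_system(violates_fundamental_rights: bool,
                       high_risk_use_case: bool) -> str:
    """Toy sketch of the AIA's three-tier risk classification.

    The real proposal defines the tiers through detailed annexes and
    use-case lists; here they are collapsed into two boolean flags.
    """
    if violates_fundamental_rights:
        return "unacceptable risk: prohibited"
    if high_risk_use_case:
        return "high risk: subject to specific regulation"
    return "low or minimal risk: no further regulation"

# Example: a hypothetical CV-screening tool used in hiring would fall
# into the high-risk tier under the proposal's annexes.
print(classify_ai_system(violates_fundamental_rights=False,
                         high_risk_use_case=True))
```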

Regarding high-risk AI systems, the AIA foresees the creation of post-market monitoring obligations. If an AI system violates any part of the AIA, the regulator can forcibly withdraw it from the market.

This approach has been welcomed in the joint opinion of the EDPB and EDPS, although the two bodies stated that the draft still needs to be better aligned with the GDPR.

Although the Commission's draft contains a precise description of the first two categories, these will likely change over the coming years as the proposal undergoes the EU's legislative process.

The draft was published by the European Commission in April 2021 and must still undergo scrutiny by the European Parliament and the Council of the European Union. Currently, some amendments have been formulated and the draft is still under review by the Parliament. Once the Act has passed scrutiny, it will be subject to a two-year implementation period.

Finally, a question remains: who shall oversee and control the Act's implementation? It is foreseen that national supervisory authorities will be established in each EU Member State. Furthermore, the AIA aims at establishing a European AI Board made up of representatives of both the Member States and the European Commission, which will chair it. Similar to the EDPB, this Board shall have the power to issue opinions and recommendations and to ensure the consistent application of the regulation throughout the EU.
