G7 Data Protection Authorities discuss flow of data across borders

27. September 2022

From September 6th to September 9th, 2022, a meeting between representatives of the G7’s Data Protection Authorities was held in Bonn, Germany, to discuss current regulatory and technological issues concerning the concept of Data Free Flow with Trust (DFFT), a proposed guiding principle for international cooperation on data flows.

The DFFT aims to answer several questions in order to create a safe global digital environment in which the free and protected flow of data is guaranteed. The most important of these is: how can the existing barriers to data flows be overcome? Harmonization may seem difficult between countries with completely different approaches and regulations regarding personal data protection. To answer this question, a bottom-up approach was adopted for the implementation of the DFFT: high-level intergovernmental discussions resulting in pragmatic rule-making are foreseen, running in parallel with public-private dialogue on the resolution of individual issues.

Scholars and experts suggest that RegTech could prove very useful for the implementation of the DFFT. To tackle some of the issues identified in the various discussions and in the underlying research, the World Economic Forum issued a white paper identifying seven common success factors for the effective deployment of RegTech.

The DFFT concept, first proposed by Japan’s late Prime Minister Shinzo Abe in 2019, is now moving into the implementation phase, mainly in connection with trade agreements covering e-commerce. A milestone on this topic will probably be the next G7 summit, to be held in Japan in 2023. Kishida Fumio, the current Japanese Prime Minister, has reaffirmed his country’s role as initiator of the project and pledged his commitment to the continuous development of the DFFT.

EDPS takes legal action against Europol’s new regulation

On June 28th, 2022, two new provisions of the amended Europol Regulation came into force. The European Data Protection Supervisor (EDPS) considers these changes worrying, as they directly affect the processing of data of individuals in the European Union: the new provisions allow Europol to retroactively process large volumes of data, even data of individuals with no links to criminal activity.

Specifically, before these provisions were passed, individuals could expect that data gathered about them by Europol would be processed within six months to establish whether they were involved in illicit activities and that, if they were not, the data relating to them would be deleted. Under the amendments, Europol is allowed to store and process these data even if the individual is found not to be involved in any wrongdoing.

In an effort to prevent these changes from taking effect, the EDPS had issued an order on January 3rd, 2022, to amend the new provisions by including a precisely determined deletion period for data relating to individuals not connected to unlawful activities. As the order was ignored, on September 16th the EDPS requested that the European Court of Justice (ECJ) annul the two provisions. The authority stated that this conduct constitutes a clear violation of individuals’ fundamental rights.

Furthermore, overriding a direct order of the European data protection watchdog by introducing such amendments undermines the independent oversight powers of the supervisory authority. This could set a dangerous precedent in which supervisory authorities in the European Union must anticipate that the legislature, depending on political will, may override their supervisory activities. That would amount to a clear violation of the Charter of Fundamental Rights of the European Union, since there would be a concrete risk of undermining the independence of a supervisory authority by subjecting it to undue political pressure or interference.

noyb files complaints against Google with CNIL in the context of direct marketing emails

30. August 2022

On August 24th, 2022, the Austrian NGO noyb announced that it had filed a complaint against Google with CNIL, the French supervisory authority, in the context of direct marketing emails.

According to noyb, several Google users, on whose behalf noyb filed the complaint, received advertising emails to which they had not consented. This would contravene Art. 13(1) of the ePrivacy Directive, which reads: “the use […] of electronic mail for the purposes of direct marketing may only be allowed in respect of subscribers who have given their prior consent.”

The issue of “inbox advertising” has also received the attention of the Court of Justice of the European Union (CJEU). In a judgment from 2021, the CJEU ruled on the lawfulness of this advertising practice, holding that emails sent to users’ inboxes for the purpose of direct marketing require consent.

noyb highlights in its announcement that “[s]pam is a commercial email sent without consent. And it is illegal. Spam does not become legal just because it is generated by the email provider.”

It remains to be seen whether this complaint will lead to the imposition of a fine by the CNIL.

Danish watchdog bans Google Chromebooks and Google Workspace in municipality

26. August 2022

In July 2022, after an investigation into a data breach carried out by the Danish Data Protection Authority (Datatilsynet), Google Chromebooks and Google Workspace were banned in schools in the municipality of Helsingør. The DPA ruled that the risk assessment carried out by city officials shows that Google’s processing of personal data does not meet GDPR requirements. Data transfers in particular were targeted by the authority: although the data are primarily stored in Google’s European facilities, the data processing agreement allows transfers to third countries for analytical and statistical support.

This decision comes at a moment of tension between Europe and the United States over personal data: other notable cases (some still ongoing) include the Irish Data Protection Authority’s proceedings against Facebook (now part of Meta Inc.) and the German Federal Cartel Office’s case against Facebook. European watchdogs have found that, in many cases, the policies of the American tech giants do not meet the requirements established by the GDPR. This can be traced back to the lack of a comprehensive legal framework for privacy and personal data protection in the United States, where these companies are based.

The decision was taken in the aftermath of the Schrems II ruling of the European Court of Justice, which held that the pre-existing agreement on data transfers between Europe and the US (the so-called Privacy Shield) was not compatible with the GDPR. A new deal is on the table, but it has not yet been approved or taken effect.

Google has become the target of various investigations by European data watchdogs, above all because of its tool Google Analytics. In January, the Austrian Data Protection Authority published a decision stating that companies using Google Analytics inadvertently transferred customers’ personal data, such as IP addresses, to the United States in breach of the GDPR. Italy’s Garante per la Protezione dei Dati Personali reached a similar conclusion a few weeks later, stating that “the current methods adopted by Google do not guarantee an adequate level of protection of personal data”.

Personal data risks in the aftermath of the overturning of Roe vs. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe vs. Wade, effectively ending federal abortion rights. The decision caused worldwide outrage, but it also raises a concerning prospect: the population’s massive use of social media and the Internet could result in serious privacy violations by the authorities. For example, tech giants such as Apple, Google and Meta Inc. could be made to share users’ data if law enforcement authorities suspect a felony is being committed. This could especially be the case in those States that chose to make abortion illegal after the Supreme Court’s ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, such personal data can be made the object of a subpoena, forcing the recipient to produce them in court. In such a scenario, tech companies would have no choice but to hand over their consumers’ data, posing a clear and serious risk to consumer privacy.

Location data, in particular, could show whether a person visited an abortion clinic. Many women use specific apps to track their periods, fertility and pregnancies. All these data could be put under surveillance and seized by law enforcement in order to investigate and prosecute abortion-related cases.

In some States this has already happened. In 2018, a woman in Mississippi was charged with second-degree murder after seeking health care for a pregnancy loss that happened at home. Prosecutors produced her Internet browser history as evidence. She was acquitted of the charges two years later.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse or analyze it, and sell it to the highest bidder. These companies could also be used by law enforcement agencies to arbitrarily investigate people who might be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States; there is, for example, no principle of data minimization comparable to the GDPR’s. The Supreme Court’s ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. New laws and bills could also be introduced to limit the access law enforcement agencies have to personal data.

noyb lodges 226 complaints with 18 different supervisory authorities against websites using “OneTrust” cookie banner software

22. August 2022

On August 9th, 2022, the Austrian NGO noyb announced on its website that it had lodged over 200 complaints with 18 supervisory authorities against several websites that have the cookie banner software “OneTrust” in use.

noyb claims that those banners are designed in a way that nudges the user into clicking the accept button.

According to noyb’s legal analysis, websites that use these cookie banners are neither in conformity with the ePrivacy Directive nor with the GDPR. Further noyb argues: “Deceptive cookie banner designs try to force a user’s agreement by making it insanely burdensome to decline cookies. The GDPR actually requires a fair yes/no choice, not crazy click-marathons.”

It is important to highlight that complaints were lodged only against those companies operating websites with possibly unlawful cookie banners that did not respond to noyb’s emails. Interestingly, even companies that had not been contacted by noyb have, in the meantime, updated their cookie banners in accordance with a guidance document provided by noyb.

In response to noyb’s multiple complaints in relation to cookie banners, the EDPB decided to establish a task force in September 2021.


European Data Protection Board adopts a dispute resolution decision in the context of Instagram

17. August 2022

In early December 2021, the Irish Data Protection Commission (DPC), in its capacity as lead supervisory authority responsible for overseeing Instagram (Meta), sent a draft decision to the other European supervisory authorities in line with Art. 60(3) GDPR. In this draft decision, the DPC expressed its concern with Instagram’s compliance with several GDPR provisions, notably Art. 5(1)(a) and (c), 6(1), 12(1), 13, 24, 25 and 35 GDPR.

The lead supervisory authority specifically raised the issue of the public disclosure of children’s personal data, such as e-mail addresses and phone numbers, resulting from their use of the Instagram business account feature.

The concerned supervisory authorities, however, did not fully agree with the draft decision and raised objections in accordance with Art. 60(4) GDPR. As no common ground could be found on some of the objections, Art. 65(1)(a) GDPR, which lays down the dispute resolution procedure, became applicable. Consequently, the DPC, as lead supervisory authority, was required to ask the European Data Protection Board (EDPB) to adopt a binding decision.

On July 29, 2022, the EDPB announced that it had adopted a dispute resolution decision following these objections. It is now up to the DPC to adopt its final decision, which must be based on the EDPB’s decision, and to communicate it to the controller within one month.

EDPB and EDPS criticise the Commission’s Proposal to combat child sexual abuse online

15. August 2022

In May 2022, the European Commission published its proposal on combating child sexual abuse material. The Commission justified the need for this proposal with the alleged insufficiency of voluntary detection carried out by companies. Recently, the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) have issued a joint statement criticizing the proposal on privacy grounds.

According to the proposal, hosting services and communication services would be obliged to identify, remove and report child sexual abuse material online. This, in turn, requires that encrypted messages can be screened: the actual text of messages would have to be read in order to detect grooming.

In their joint criticism, the EDPB and EDPS highlight that such an AI-based system will most likely result in errors and false positives.
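The scale of the false-positive concern can be sketched with simple base-rate arithmetic. The figures below are illustrative assumptions, not numbers from the joint statement: even a detector that is rarely wrong flags mostly innocent messages when the targeted content is extremely rare.

```python
def flag_precision(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Share of flagged messages that are true positives, via Bayes' rule."""
    true_positives = prevalence * sensitivity            # abusive and flagged
    false_positives = (1 - prevalence) * (1 - specificity)  # innocent but flagged
    return true_positives / (true_positives + false_positives)

# Assumed for illustration: 1 in 100,000 messages is abusive, and the
# detector is 99% sensitive and 99.9% specific.
p = flag_precision(prevalence=1e-5, sensitivity=0.99, specificity=0.999)
print(f"{p:.1%} of flagged messages would be true positives")  # prints: 1.0% of flagged messages would be true positives
```

Under these assumed rates, roughly 99 out of 100 flagged messages would belong to innocent users, which is the kind of outcome the regulators warn about when screening communications at scale.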

EDPS Supervisor, Wojciech Wiewiórowski, said: “Measures allowing public authorities to have access to the content of communications, on a generalised basis, affect the essence of the right to private life. Even if the technology used is limited to the use of indicators, the negative impact of monitoring the text and audio communications of individuals on a generalised basis is so severe that it cannot be justified under the EU Charter of Fundamental Rights. The proposed measures related to the detection of solicitation of children in interpersonal communication services are extremely concerning.”

Individual brings action against the European Commission before the Court of Justice of the European Union

27. July 2022

A German citizen brought an action against the European Commission (the Commission) before the Court of Justice of the European Union claiming that the Commission is involved in illegal international data transfers to the US.

The subject-matter of the action, which was recently admitted by the Court, relates to data processing carried out in the context of the website “futureu.europa.eu”, a platform intended to increase citizens’ engagement with the EU.

In his complaint, which was drafted by EuGD, a German data protection organization, the claimant alleges, among other things, that upon accessing said website and enabling a Facebook login, personal data such as users’ IP addresses are transferred to US cloud and web hosting providers. According to the organization’s press release, the action’s allegations of illegal transfers are also grounded on the Schrems II judgment.

It should be noted that the processing of personal data by EU institutions and bodies does not fall within the scope of the GDPR; it is instead governed by Regulation (EU) 2018/1725 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data.

Even though the GDPR does not apply to the Commission, Regulation 2018/1725 refers to the GDPR in the context of international data transfers to third countries (e.g. in recital 65), and it is not far-fetched to hold the view that the ruling in Schrems II will indeed extend to this regulation.

One should also remember Recital 5 of Regulation 2018/1725 that reads the following:

Whenever the provisions of this Regulation follow the same principles as the provisions of Regulation (EU) 2016/679, those two sets of provisions should, under the case law of the Court of Justice of the European Union (the ‘Court of Justice’), be interpreted homogeneously, in particular because the scheme of this Regulation should be understood as equivalent to the scheme of Regulation (EU) 2016/679.

The claimant also alleges that the Commission did not duly respond to his access requests, in which he asked for information on the data processed and on the safeguards in place. He specifically alleges that one request was not answered properly and that the other was initially left unanswered.

The action, which questions the legality of European webpages that use US web hosts and enable Facebook log-ins, comes at an interesting moment. Not long ago, the compatibility of Facebook/Meta’s data transfers with the GDPR was challenged by the DPC, which recommended halting EU-US transfers of Meta products for failing to comply with the GDPR.

The founder of the organization that is assisting the legal action told EURACTIV “that if a restaurant or a bakery has to figure out a way to comply with the ban on data transfers to the United States, so does the European Commission, as there cannot be double standards.”

Privacy issues in the antitrust legal framework: “the Facebook case”

21. July 2022

European countries were among the first to introduce privacy considerations into the antitrust and competition law framework. As a result, in 2019 the German Federal Cartel Office took action to stop Facebook (now part of Meta Inc.) from further processing personal data acquired through third-party integrations (above all, cookies). The proceedings on the matter are still ongoing. More recently, the Irish Data Protection Authority also took a position against Facebook (in the meantime renamed Meta Inc.), seeking to prevent the American tech giant from transferring user data to the United States over data safety concerns. That matter, too, remains in dispute.

In 2014, Facebook notoriously purchased the messaging company WhatsApp for almost 22 billion dollars. At the time, Europe did not give much thought to the potential consequences of the merger. The operation was the object of an opinion of the European Commission: in the Commission’s view, the two companies’ privacy policies were markedly different, and the prospect of Facebook controlling all of the data collected by WhatsApp did not sit well with the European authorities. Another key argument brought forward by the Commission was the lack of effective competition between the two companies. However, no further action was taken at the time.

A few years later, academic research highlighted the mistake the European Commission had made in not considering the enormous value personal data hold for these tech companies: because personal data are regarded as a dimension of so-called “non-price competition”, they play a key role in the strategies and decision-making of big data-driven business models. In particular, when a company depends on collecting and using personal data, it tends to lower its privacy protection standards and increase the amount of data collected. This argument was brought forward by the UK’s competition authority, which stated that, given the enormous importance personal data have gained in the digital market, companies such as Facebook do not face strong competition in their business.

These arguments, together with growing unrest among DPAs around the globe, led in 2020 to the well-known investigation of Facebook by the United States Federal Trade Commission (FTC). The FTC accused Meta Inc. (in particular Facebook) of stifling competition in order to retain its monopoly in the digital market. An American court dismissed the claims, but the high risks connected with such an enormous data collection were nonetheless highlighted. Under Section 2 of the Sherman Act, the government has:

  • to prove that a company in fact holds a monopoly, and
  • to prove that its conduct harms consumers.

These elements do not apply straightforwardly to the case, but the FTC argued that the harm to consumers is to be seen in Meta Inc.’s lowering of privacy standards. The case is still pending as of July 2022.

This merger showed how much privacy and antitrust issues overlap in the digitalized market.

In the months that followed, policymakers and enforcers in both the United States and the European Union have struggled to establish new rules to better regulate mergers between companies whose business models rely on the collection of personal data, and above all have called for more cooperation between privacy and antitrust agencies.
