Category: Data Protection

noyb filed complaints against the cookie paywalls of seven major news websites in Austria and Germany

25. August 2021

Privacy activist Max Schrems’ data protection organization noyb (an acronym for “none of your business”) announced on August 13th, 2021, that it had filed complaints against the cookie paywalls of seven major German and Austrian news websites. In the statement, noyb questions whether consent can be given “voluntarily” if you have to pay to keep your data to yourself.

An increasing number of websites ask their users to either agree to their data being passed on to hundreds of tracking companies (which generates a few cents of revenue for the website) or take out a subscription (for up to € 80 per year). Can consent be considered “freely given” if the alternative is to pay 10, 20 or 100 times the market price of your data to keep it to yourself?

With these paywalls, the user must decide whether to agree to the use of his or her own data for advertising purposes or to enter into a paid subscription with the respective publisher. However, personal data may only be processed if there is a legal basis for doing so. Such a legal basis may arise, for example, from Article 6 (1) (a) of the GDPR, if the data subject has given his or her consent to this processing. Such consent must be “freely given”. According to Recital 42, sentence 5, “consent is not regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.” noyb is of the opinion that the paywall solution lacks the necessary voluntariness for consent and thus also lacks a legal basis under Art. 6 (1) (a) GDPR.

Art. 7 (4) GDPR demands that, “when assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.”

In contrast, in a decision of November 30th, 2018, the Austrian data protection authority did not see a violation of the GDPR in a paywall system, reasoning that the data subject receives a recognizable benefit and that consent is therefore given voluntarily after all.

Accordingly, users’ personal data could be considered a “means of payment” with which they pay for a subscription instead of paying money. Consent to data processing would thus be necessary for the performance of the contract, as it represents the consideration owed by the data subject, in other words, the purchase price. How the responsible data protection authorities will ultimately decide remains to be seen.

These complaints represent noyb’s second major campaign this month. On August 10, the organization had already filed 422 formal complaints with 10 European regulators over inadequate cookie banners.

Google Play Store apps soon obliged to provide privacy notices

20. August 2021

On the Android Developers Blog, Google has announced further details of the upcoming safety section in its Play Store. It aims to present the security of the offered apps in a simple way and to give users deeper insight into the privacy and security practices of developers. This should allow users to see what data an app may collect and why, even before installation. To achieve this, apps in the Google Play Store will be required to publish the corresponding information in the safety section.

The new summary will be displayed to users on an app’s store listing page. It is intended to highlight details such as:

  • What type of data is collected and shared, e.g. location, contacts, name, email address, financial information,
  • How the data will be used, e.g. for app functionality or personalization,
  • Whether the data collection is optional or mandatory for the use of an app,
  • Security practices, e.g. data encryption,
  • Compliance with Google Play’s Families policy,
  • Validation from an independent source against a global security standard.

To support the safety section, Google is making policy changes intended to give users more transparency. All developers will be required to provide a privacy notice; previously, only apps that collected personal and sensitive user data had to do so. This change applies to all apps published on Google Play, including Google’s own apps.

Developers will be able to submit the information in the Google Play Console for review starting in October. By April 2022 at the latest, the safety section must be approved for their apps, as the new section is scheduled to be rolled out and become visible to users in Q1 2022.

Aside from sharing additional information for developers on how to prepare, Google has also assured that more guidance will be released over the next few months.

Luxembourg’s National Commission for Data Protection fines Amazon a record-breaking 746 million Euros for misuse of customer data

11. August 2021

On August 6, 2021, Amazon disclosed in an SEC filing a July 16, 2021 ruling of the Luxembourg data protection authority, the Commission nationale pour la protection des données (CNPD), which imposed a record-breaking €746 million fine on Amazon Europe Core S.à.r.l. for alleged violations of the EU General Data Protection Regulation (GDPR).

Based on press reports and Amazon’s public statements, the fine appears to relate to Amazon’s use of customer data for targeted advertising purposes.

The penalty is the result of a 2018 complaint by the French privacy rights group La Quadrature du Net, which aims to represent the interests of thousands of Europeans and to ensure their data is used in accordance with data protection law, so that Big Tech companies cannot manipulate their behavior for political or commercial purposes. The complaint, which also targets Apple, Facebook, Google and LinkedIn, was filed on behalf of more than 10,000 customers and alleges that Amazon manipulates customers for commercial means by choosing what advertising and information they receive.

Amazon stated that they “strongly disagree with the CNPD’s ruling” and intend to appeal. “The decision relating to how we show customers relevant advertising relies on subjective and untested interpretations of European privacy law, and the proposed fine is entirely out of proportion with even that interpretation.”

The amount of the fine is substantially higher than the proposed fine in a draft decision that was previously reported in the press. The French data protection authority (CNIL) said Luxembourg’s decision is “of an unprecedented scale and marks a turning point in the application of the GDPR and the protection of the rights of European nationals.”

The CNIL confirmed that the CNPD had fined Amazon and that the other European member states had agreed to the Luxembourg decision. Amazon will have six months to correct the issue.

CNIL fines Monsanto 400,000 € for GDPR violations

29. July 2021

France’s data protection authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), imposed a fine of 400,000 € on the U.S.-based biotechnology corporation Monsanto Company for infringements of Article 14 GDPR, which requires informing data subjects about the collection of their personal data, and of Article 28 GDPR, which concerns the contractual guarantees governing the relationship with a data processor.

In May 2019, several media outlets revealed that Monsanto was in possession of a file containing personal data of more than 200 political figures and members of civil society (e.g. journalists, environmental activists, scientists and farmers). The investigations carried out by the CNIL disclosed that the information had been collected for lobbying purposes. The individuals named on this “watch list” were Monsanto’s opponents and critics from several European countries, who were meant to be “educated” or “monitored”. The strategy was intended to influence the debate and public opinion on the renewal of the authorization of glyphosate in Europe, a controversial active substance contained in Monsanto’s best-known weed control product. The scientific controversy, which continues to this day, centers on whether glyphosate causes diseases, most notably cancer.

The file included, for each of the individuals, personal data such as organization, position, business address, business phone number, cell phone number, business email address, and in some cases Twitter accounts. In addition, each person was given a score from 1 to 5 to evaluate their influence, credibility, and support for Monsanto on various issues such as pesticides or genetically modified organisms.

It should be noted that the creation of contact files by stakeholders for lobbying purposes is not illegal per se. While it is not necessary to obtain the consent of the data subjects, the data have to be lawfully collected and the individuals have to be informed of the processing.

In imposing the penalty, the CNIL considered that Monsanto had failed to comply with the provisions of the GDPR by not informing the data subjects about the storage of their data, as required by Article 14 GDPR. In addition, none of the exceptions provided in Article 14 para. 5 GDPR were applicable in this case. The data protection authority stressed that the aforementioned obligation is a key measure under the GDPR insofar as it allows the data subjects to exercise their other rights, in particular the right to object.

Furthermore, Monsanto violated its obligations under Article 28 GDPR. As a controller, the company was required to establish a legal framework for the processing carried out on its behalf by its processor, in particular to provide data security guarantees. However, in the CNIL’s opinion, none of the contracts concluded between the two companies complied with the requirements of Article 28 para. 4 GDPR.

EDPS and the EDPB call for a tightening of the EU draft legislation on the regulation of Artificial Intelligence (AI)

26. July 2021

In a joint statement, the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB) call for a general ban on the use of artificial intelligence for the automated recognition of human characteristics in publicly accessible spaces. This refers to surveillance technologies that recognise faces, human gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals. In addition to the AI-supported recognition of human characteristics in public spaces, the EDPS and EDPB also call for a ban on AI systems that use biometrics to categorize individuals into clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights. With the exception of individual applications in the medical field, the EDPS and the EDPB are also calling for a ban on AI for sentiment recognition.

In April, the EU Commission presented a first draft law on the regulation of AI applications. The draft explicitly excluded the area of international law enforcement cooperation, and the EDPS and EDPB expressed “concern” about this exclusion. The draft is based on a categorisation of AI applications into different types of risk, which are to be regulated to different degrees depending on the level of risk they pose to fundamental rights. In principle, the EDPS and EDPB support this approach and the fact that the EU is addressing the issue at all. However, they call for this concept of fundamental rights risk to be adapted to the EU data protection framework.

Andrea Jelinek, EDPB Chair, and Wojciech Wiewiórowski, the EDPS, are quoted as saying:

Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.

The EDPS and EDPB explicitly support the proposal that national data protection authorities become the competent supervisory authorities for the application of the new regulation, and explicitly welcome that the EDPS is intended to be the competent authority and market surveillance authority for the supervision of Union institutions, agencies and bodies. However, the EU data protection authorities question the predominant role the Commission gives itself in the “European Artificial Intelligence Board”: “This contradicts the need for a European AI Board that is independent of political influence”. They call for the board to be given more autonomy to ensure its independence.

Worldwide, there is considerable resistance to the use of biometric surveillance systems in public spaces. A large global alliance of 175 civil society organisations, academics and activists is calling for a ban on biometric surveillance in public spaces. The concern is that the potential for abuse of these technologies is too great and the consequences too severe. For example, the BBC reports that China is testing a camera system on Uighurs in Xinjiang that uses AI and facial recognition to detect emotional states. This system is supposed to serve as a kind of modern lie detector and be used in criminal proceedings, for example.

Amex fined for sending four million unlawful emails

15. July 2021

American Express Service Europe Limited (Amex) has received a £ 90,000 fine from the UK Information Commissioner’s Office (ICO) for sending over four million unwanted marketing emails to customers.

The reason for the investigation by the UK’s supervisory authority was complaints from Amex customers who claimed to have received marketing emails even though they had not given their consent. The emails, sent as part of a campaign, contained information on the benefits of online shopping, the optimal use of the card and encouragement to download the Amex app. According to Amex, the emails were about “servicing” rather than “marketing”. The company insisted that customers would be disadvantaged if they were not aware of the campaigns and that the emails were a requirement of the credit agreements.

The ICO did not share this view. In its opinion, the emails were aimed at inducing customers to make purchases with their cards in return for a £ 50 benefit, and were thus sent “deliberately” for “financial gain”. This constitutes a marketing activity which, without valid consent, violates Regulation 22 of the Privacy and Electronic Communications Regulations 2003. Such consent, and therefore a legal basis, was lacking in this case.

The ICO Head of Investigations pointed out how important it is for companies to know the differences between a service email and a marketing email to ensure that email communications with customers are compliant with the law. While service messages contain routine information such as changes in terms and conditions or notices of service interruptions, direct marketing is any communication of promotional or marketing material directed to specific individuals.

An Amex spokesperson assured that the company takes customers’ marketing preferences very seriously and has already taken steps to address the concerns raised.

China tightens data protection rules for companies

The state leadership in Beijing is tightening its data protection rules. Chinese ride-hailing provider Didi has now become the subject of far-reaching data protection regulatory measures, and other companies could soon be affected as well.

For months now, Chinese regulators and ministries have been issuing a slew of new regulations that not only affect tech companies, but are also directed at how companies handle data in general.

A prime example of China’s “new” data protection policy can be seen in Didi’s listing on the New York Stock Exchange. The Uber rival had been public for only a few days when the Chinese authorities urged it to remove its app from app stores before the end of the week. The reason is reported to have been serious data protection violations by the company, which are now being investigated: Didi is said to have collected and used personal data in a manner hostile to privacy.

Didi was ordered to comply with legal requirements and adhere to national standards. It should also ensure that the security of its users’ personal data is effectively protected.

The announcement sent shares of the stock market newcomer down by more than 5% on Friday. The news also caused tech stocks to fall on Asian exchanges.

Didi is the nearly undisputed leader among ride-hailing services in China, with 493 million active users and a presence in 14 countries.

Beijing’s new data protection

The actions taken by the Chinese authorities against tech companies suggest that the country’s leadership is rethinking its approach to data protection.

First of all, there is much to suggest that the state leadership wants to bring companies more firmly under control. This is also intended to prevent third countries from obtaining data from Chinese companies and to prevent Chinese companies from establishing themselves abroad.

According to reports, a document from the State Council in Beijing indicates that stricter controls are planned for Chinese companies that are traded on stock exchanges abroad. Capital raised by emerging Chinese companies on foreign stock markets, such as in New York or Hong Kong, will also be subject to more stringent requirements. Especially in the area of “data security, cross-border data flow and management of confidential information”, new standards are to be expected.

However, the aim also seems to be to better protect the data of Chinese citizens from unauthorized access by criminals or from excessive data collection by tech groups and companies. This is supported by the fact that the Chinese leadership has introduced several rules in recent years and months that are intended to improve data protection. Although the state is not ceding its own rights here, citizens are to be given more rights, at least with respect to companies.

The introduction of the European General Data Protection Regulation also forced Chinese technology companies to meet global data protection standards in order to expand abroad.

China’s data protection policy thus seems contradictory: it is a step towards more protection for data subjects and, at the same time, another step towards more state control.

Colorado Privacy Act officially enacted into law

14. July 2021

On July 8, 2021, the state of Colorado officially enacted the Colorado Privacy Act (CPA), which makes it the third state to have a comprehensive data privacy law, following California and Virginia. The Act will go into effect on July 1, 2023, with some specific provisions going into effect at later dates.

The CPA shares many similarities with the California Consumer Privacy Act (CCPA) and the Virginia Consumer Data Protection Act (CDPA), rather than introducing brand-new concepts. However, there are also differences. For example, the CPA applies to controllers that conduct business in Colorado or target Colorado residents with their business, and that either control or process the data of more than 100,000 consumers in a calendar year or derive revenue from processing the data of more than 25,000 consumers. It is therefore broader than the CDPA and, unlike the CCPA, does not include a revenue threshold.

Similar to the CDPA, the CPA defines a consumer as “a Colorado resident acting only in an individual or household context” and explicitly omits individuals acting in “a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. As a result, controllers do not need to consider the employee personal data they collect and process in the application of the CPA.

The CPA further defines “the sale of personal information” as “the exchange of personal data for monetary or other valuable consideration by a controller to a third party”. Importantly, the definition of “sale” explicitly excludes certain types of disclosures, as is the case in the CDPA, such as:

  • Disclosures to a processor that processes the personal data on behalf of a controller;
  • Disclosures of personal data to a third party for purposes of providing a product or service requested by the consumer;
  • Disclosures or transfers of personal data to an affiliate of the controller;
  • Disclosure or transfer to a third party of personal data as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller’s assets;
  • Disclosures of personal data that a consumer directs the controller to disclose or intentionally discloses by using the controller to interact with a third party, or that a consumer intentionally makes available to the general public via a channel of mass media.

The CPA provides five main consumer rights: the right of access, the right of correction, the right of deletion, the right to data portability and the right to opt out. For the latter, the procedure differs from the other laws: the CPA mandates that a controller provide consumers with both the right to opt out and a universal opt-out option, so a consumer can click one button to exercise all opt-out rights.

In addition, the CPA provides the consumer with a right to appeal a business’s refusal to take action on a request within a reasonable time period.

The CPA differentiates between controller and processor in a similar way that the European General Data Protection Regulation (GDPR) does and follows, to an extent, similar basic principles such as duty of transparency, duty of purpose specification, duty of data minimization, duty of care and duty to avoid secondary use. In addition, it follows the principle of duty to avoid unlawful discrimination, which prohibits controllers from processing personal data in violation of state or federal laws that prohibit discrimination.

No obligation to disclose vaccination certificates at events in Poland

7. July 2021

According to recent announcements, the Polish Personal Data Protection Office (UODO) has indicated that vaccinated individuals participating in certain events cannot be required to disclose evidence of vaccination against COVID-19.

In Poland, one of the regulations governing the procedures related to the prevention of the spread of coronavirus is the Decree of the Council of Ministers of May 6th, 2021 on the establishment of certain restrictions, orders and prohibitions in connection with the occurrence of an epidemic state. Among other things, it sets limits on the number of people who can attend various events which are defined by Sec. 26 para. 14 point 2, para. 15 points 2, 3. The aforementioned provisions concern events and meetings for up to 25 people that take place outdoors or in the premises/building indicated as the host’s place of residence or stay as well as events and meetings for up to 50 people that take place outdoors or in the premises/separate food court of a salesroom. Pursuant to Sec. 26 para. 16, the stated number of people does not include those vaccinated against COVID-19.

In this context the question has arisen how the information about the vaccination can be obtained. As this detail is considered health data which constitutes a special category of personal data referred to in Art. 9 para. 1 GDPR, its processing is subject to stricter protection and permissible if at least one of the conditions specified in para. 2 is met. This is, according to Art. 9 para. 2 lit. i GDPR, especially the case if the processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices, on the basis of Union or Member State law which provides for suitable and specific measures to safeguard the rights and freedoms of the data subject, in particular professional secrecy.

The provisions of the Decree do not address the possibility of requiring participants in the mentioned events to provide information on their vaccination against COVID-19. Hence, it is not specified who may verify the evidence of vaccination, under what conditions and in what manner. Moreover, the “specific measures to safeguard” referred to in Art. 9 para. 2 lit. i GDPR, cited above, are not provided either. Therefore, the regulations of the Decree cannot be seen as a legal basis authorizing entities obliged to comply with this limit of persons to obtain such data. Consequently, the data subjects are not obliged to provide it.

Because of this, collection of vaccination information can only be seen as legitimate if the data subject consents to the data submission, thereby fulfilling the requirement of Art. 9 para. 2 lit. a GDPR. Notably, the conditions for obtaining consent set out in Art. 4 para. 11 and Art. 7 GDPR must be met. Thus, the consent must be voluntary, informed, specific, expressed in the form of an unambiguous manifestation of will and capable of being revoked at any time.

British Airways could reach a settlement over the 2018 data breach

Back in 2018 British Airways was hit by a data breach affecting up to 500 000 data subjects – customers as well as British Airways staff.

Following the breach, the UK’s Information Commissioner’s Office (ICO) initially fined British Airways a record £183,000,000 (approx. € 205,000,000) in 2019 due to the severe consequences of the breach. According to reports, the hackers accessed, among other things, the email addresses of the data subjects concerned as well as credit card information.

The ICO reduced the initial record fine in 2020 after British Airways appealed against it. The final sanction, announced in October 2020, was £20,000,000 (approx. € 22,000,000). Reasons for the reduction included, inter alia, the COVID-19 situation and its consequences for the aviation industry.

Most recently, it was reported that British Airways has also reached a settlement in a UK data breach class action with up to 16,000 claimants. The details of the settlement have been kept confidential, so the settlement sum is not known, but the law firm PGMBM, which represents the claimants, as well as British Airways announced the settlement on July 6th.

PGMBM further explained that the ICO’s fine “did not provide redress to those affected”, but that “the settlement now addresses” the consequences for the data subjects, as reported by the BBC.
