CNIL fines Monsanto 400,000 € for GDPR violations

29. July 2021

France’s data protection authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), imposed a fine of 400,000 € on the U.S.-based biotechnology corporation Monsanto Company for infringements of Article 14 GDPR, which governs the information to be provided to data subjects about the collection of their personal data, and of Article 28 GDPR, which concerns the contractual guarantees governing the relationship with a data processor.

In May 2019, several media outlets revealed that Monsanto was in possession of a file containing personal data of more than 200 political figures and members of civil society (e.g. journalists, environmental activists, scientists and farmers). The investigations carried out by the CNIL disclosed that the information had been collected for lobbying purposes. The individuals named on this “watch list” were Monsanto’s opponents and critics from several European countries, who were to be “educated” or “monitored”. The strategy was intended to influence the debate and public opinion on the renewal of the authorization of glyphosate in Europe, a controversial active substance contained in Monsanto’s best-known weed killer. The ongoing scientific controversy concerns whether glyphosate causes diseases, most notably cancer.

The file included, for each of the individuals, personal data such as organization, position, business address, business phone number, cell phone number, business email address, and in some cases Twitter accounts. In addition, each person was given a score from 1 to 5 to evaluate their influence, credibility, and support for Monsanto on various issues such as pesticides or genetically modified organisms.

It should be noted that the creation of contact files by stakeholders for lobbying purposes is not illegal per se. While it is not necessary to obtain the consent of the data subjects, the data have to be lawfully collected and the individuals have to be informed of the processing.

In imposing the penalty, the CNIL considered that Monsanto had failed to comply with the provisions of the GDPR by not informing the data subjects about the storage of their data, as required by Article 14 GDPR. In addition, none of the exceptions provided in Article 14 para. 5 GDPR were applicable in this case. The data protection authority stressed that the aforementioned obligation is a key measure under the GDPR insofar as it allows the data subjects to exercise their other rights, in particular the right to object.

Furthermore, Monsanto violated its obligations under Article 28 GDPR. As a controller, the company was required to establish a legal framework for the processing carried out on its behalf by its processor, in particular to provide data security guarantees. However, in the CNIL’s opinion, none of the contracts concluded between the two companies complied with the requirements of Article 28 para. 4 GDPR.

EDPS and the EDPB call for a tightening of the EU draft legislation on the regulation of Artificial Intelligence (AI)

26. July 2021

In a joint statement, the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB) call for a general ban on the use of artificial intelligence for the automated recognition of human characteristics in publicly accessible spaces. This refers to surveillance technologies that recognise faces, human gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals. In addition to the AI-supported recognition of human characteristics in public spaces, the EDPS and EDPB also call for a ban on AI systems that use biometrics to categorize individuals into clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights. With the exception of individual applications in the medical field, the EDPS and the EDPB are also calling for a ban on AI for sentiment recognition.

In April, the EU Commission presented a first draft law on the regulation of AI applications. The EDPS and EDPB expressed “concern” that the draft explicitly excludes international law enforcement cooperation from its scope. The draft is based on a categorisation of AI applications into different types of risk, which are to be regulated to different degrees depending on the level of risk they pose to fundamental rights. In principle, the EDPS and EDPB support this approach and the fact that the EU is addressing the issue in general. However, they call for this concept of fundamental rights risk to be adapted to the EU data protection framework.

Andrea Jelinek, Chair of the EDPB, and Wojciech Wiewiórowski, the European Data Protection Supervisor, are quoted as saying:

Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.

The EDPS and EDPB explicitly support the draft’s provision that national data protection authorities become the competent supervisory authorities for the application of the new regulation, and explicitly welcome that the EDPS is intended to be the competent authority and the market surveillance authority for the supervision of the Union institutions, agencies and bodies. The EU data protection authorities question, however, the predominant role the Commission gives itself in the “European Artificial Intelligence Board”: “This contradicts the need for a European AI Board that is independent of political influence.” They call for the board to be given more autonomy in order to ensure its independence.

Worldwide, there is considerable resistance to the use of biometric surveillance systems in public spaces. A large global alliance of 175 civil society organisations, academics and activists is calling for a ban on biometric surveillance in public spaces. The concern is that the potential for abuse of these technologies is too great and the consequences too severe. For example, the BBC reports that China is testing a camera system on Uighurs in Xinjiang that uses AI and facial recognition to detect emotional states. This system is supposed to serve as a kind of modern lie detector and be used, for example, in criminal proceedings.

Amex fined for sending four million unlawful emails

15. July 2021

American Express Service Europe Limited (Amex) has received a £ 90,000 fine from the UK Information Commissioner’s Office (ICO) for sending over four million unwanted marketing emails to customers.

The investigation by the UK’s supervisory authority was prompted by complaints from Amex customers who claimed to have received marketing emails even though they had not consented to them. The emails, sent as part of a campaign, contained information on the benefits of online shopping, optimal use of the card and encouragement to download the Amex app. According to Amex, the emails were about “servicing” rather than “marketing”. The company insisted that customers would be disadvantaged if they were not aware of the campaigns and that the emails were a requirement of the credit agreements.

The ICO did not share this view. In its opinion, the emails were aimed at inducing customers to make purchases with their cards in return for a £ 50 benefit, and thus “deliberately” sent for “financial gain”. This constitutes a marketing activity which, without valid consent, violates Regulation 22 of the Privacy and Electronic Communications Regulations 2003. In this case, consent, and therefore a legal basis, was lacking.

The ICO Head of Investigations pointed out how important it is for companies to know the differences between a service email and a marketing email to ensure that email communications with customers are compliant with the law. While service messages contain routine information such as changes in terms and conditions or notices of service interruptions, direct marketing is any communication of promotional or marketing material directed to specific individuals.

An Amex spokesperson assured that the company takes customers’ marketing preferences very seriously and has already taken steps to address the concerns raised.

China tightens data protection rules for companies

The state leadership in Beijing is tightening its data protection rules. The Chinese ride-hailing provider Didi has now become the subject of far-reaching data protection regulatory measures. Other companies could soon be affected as well.

For months now, regulators and ministries in China have been issuing a slew of new regulations that not only affect tech companies but also address how companies handle data in general.

A prime example of China’s “new” data protection policy can be seen in Didi’s listing on the New York Stock Exchange. The Uber rival had been public for only a few days when the Chinese authorities urged it to remove its app from the app stores before the end of the week. The reason is reported to have been serious data protection violations, which are now being investigated: the company is said to have collected and used personal data in a manner hostile to privacy.

Didi was ordered to comply with legal requirements and adhere to national standards, and to ensure that the security of its users’ personal data is effectively protected.

The announcement sent shares of the stock market newcomer tumbling by more than 5% as of Friday. The news also caused tech stocks to fall on Asian exchanges.

Didi is the virtually undisputed leader among ride-hailing services in China, with 493 million active users and a presence in 14 countries.

Beijing’s new approach to data protection

The actions taken by the Chinese authorities against tech companies suggest that the Chinese leadership is rethinking its approach to data protection.

For one thing, much suggests that the state leadership wants to bring companies more firmly under control. This is also intended to prevent third countries from obtaining data from Chinese companies and to keep Chinese companies from establishing themselves abroad.

According to reports, a document from the State Council in Beijing indicates that stricter controls are planned for Chinese companies that are traded on stock exchanges abroad. Capital raised by emerging Chinese companies on foreign stock markets, such as in New York or Hong Kong, will also be subject to more stringent requirements. Especially in the area of “data security, cross-border data flow and management of confidential information”, new standards are to be expected.

However, the aim also seems to be to better protect the data of Chinese citizens from unauthorized access by criminals and from excessive data collection by tech groups and companies. This is supported by the fact that the Chinese leadership has introduced several rules in recent years and months that are intended to improve data protection. Although the state does not intend to cede any of its own rights here, citizens are to be given more rights, at least with respect to companies.

The introduction of the European General Data Protection Regulation also forced Chinese technology companies to meet global data protection standards in order to expand abroad.

China’s data protection policy thus appears contradictory: it is a step towards more protection for data subjects and, at the same time, another step towards more state control.

Colorado Privacy Act officially enacted into law

14. July 2021

On July 8, 2021, the state of Colorado officially enacted the Colorado Privacy Act (CPA), which makes it the third state to have a comprehensive data privacy law, following California and Virginia. The Act will go into effect on July 1, 2023, with some specific provisions going into effect at later dates.

The CPA shares many similarities with the California Consumer Privacy Act (CCPA) and the Virginia Consumer Data Protection Act (CDPA) and does not introduce any brand-new concepts. However, there are also differences. For example, the CPA applies to controllers that conduct business in Colorado or target residents of Colorado with their business and that control or process the data of more than 100,000 consumers in a calendar year or derive revenue from processing the data of more than 25,000 consumers. It is therefore broader than the CDPA and, unlike the CCPA, does not include revenue thresholds.

Similar to the CDPA, the CPA defines a consumer as “a Colorado resident acting only in an individual or household context” and explicitly omits individuals acting in “a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. As a result, controllers do not need to consider the employee personal data they collect and process when applying the CPA.

The CPA further defines “the sale of personal information” as “the exchange of personal data for monetary or other valuable consideration by a controller to a third party”. Importantly, the definition of “sale” explicitly excludes certain types of disclosures, as is the case in the CDPA, such as:

  • Disclosures to a processor that processes the personal data on behalf of a controller;
  • Disclosures of personal data to a third party for purposes of providing a product or service requested by the consumer;
  • Disclosures or transfers of personal data to an affiliate of the controller;
  • Disclosures or transfers to a third party of personal data as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller’s assets;
  • Disclosures of personal data that a consumer directs the controller to disclose or intentionally discloses by using the controller to interact with a third party, or that a consumer intentionally made available to the general public via a channel of mass media.

The CPA provides five main consumer rights: the right of access, the right of correction, the right of deletion, the right to data portability and the right to opt out. For the latter, the procedure differs from the other laws: the CPA mandates that a controller provide consumers with the right to opt out via a universal opt-out option, so that a consumer can click one button to exercise all opt-out rights.

In addition, the CPA provides the consumer with a right to appeal a business’s refusal to take action within a reasonable time period.

The CPA differentiates between controller and processor in a similar way that the European General Data Protection Regulation (GDPR) does and follows, to an extent, similar basic principles such as duty of transparency, duty of purpose specification, duty of data minimization, duty of care and duty to avoid secondary use. In addition, it follows the principle of duty to avoid unlawful discrimination, which prohibits controllers from processing personal data in violation of state or federal laws that prohibit discrimination.

No obligation to disclose vaccination certificates at events in Poland

7. July 2021

According to recent announcements, the Polish Personal Data Protection Office (UODO) has indicated that vaccinated individuals participating in certain events cannot be required to disclose evidence of vaccination against COVID-19.

In Poland, one of the regulations governing the procedures related to the prevention of the spread of coronavirus is the Decree of the Council of Ministers of May 6th, 2021 on the establishment of certain restrictions, orders and prohibitions in connection with the occurrence of an epidemic state. Among other things, it sets limits on the number of people who can attend various events, as defined in Sec. 26 para. 14 point 2 and para. 15 points 2 and 3. The aforementioned provisions concern events and meetings of up to 25 people that take place outdoors or in the premises/building indicated as the host’s place of residence or stay, as well as events and meetings of up to 50 people that take place outdoors or in the premises/separate food court of a salesroom. Pursuant to Sec. 26 para. 16, the stated number of people does not include those vaccinated against COVID-19.

In this context the question has arisen how the information about the vaccination can be obtained. As this detail is considered health data which constitutes a special category of personal data referred to in Art. 9 para. 1 GDPR, its processing is subject to stricter protection and permissible if at least one of the conditions specified in para. 2 is met. This is, according to Art. 9 para. 2 lit. i GDPR, especially the case if the processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices, on the basis of Union or Member State law which provides for suitable and specific measures to safeguard the rights and freedoms of the data subject, in particular professional secrecy.

The provisions of the Decree do not address the possibility of requiring participants in the mentioned events to provide information on their vaccination against COVID-19. Hence, it is not specified who may verify the evidence of vaccination, under what conditions and in what manner. Moreover, the “specific measures to safeguard” referred to in Art. 9 para. 2 lit. i GDPR, cited above, are not provided either. Therefore, the regulations of the Decree cannot be seen as a legal basis authorizing the entities obliged to comply with these limits on the number of persons to obtain such data. Consequently, the data subjects are not obliged to provide it.

For this reason, the collection of vaccination information can only be considered legitimate if the data subject consents to providing it, thereby fulfilling the requirement of Art. 9 para. 2 lit. a GDPR. Notably, the conditions for obtaining consent set out in Art. 4 para. 11 and Art. 7 GDPR must be met. Thus, the consent must be voluntary, informed, specific, expressed in the form of an unambiguous manifestation of will and capable of being revoked at any time.

British Airways could reach a settlement over the 2018 data breach

Back in 2018, British Airways was hit by a data breach affecting up to 500,000 data subjects – customers as well as British Airways staff.

Following the breach, the UK’s Information Commissioner’s Office (ICO) initially fined British Airways a record £183,000,000 (€205,000,000) in 2019, due to the severe consequences of the breach. As reported, the hackers accessed, inter alia, e-mail addresses of the data subjects concerned as well as credit card information.

The ICO reduced the initial record fine in 2020 after British Airways appealed against it. The final sanction, announced in October 2020, amounted to £20,000,000 (€22,000,000). Reasons for the reduction included, inter alia, the COVID-19 situation and its consequences for the aviation industry.

Most recently, it was reported that British Airways has also reached a settlement in a UK breach class action involving up to 16,000 claimants. The details of the settlement have been kept confidential, so the settlement sum is not known, but PGMBM, the law firm representing the claimants, as well as British Airways announced the settlement on July 6th.

PGMBM further explained that the ICO fine “did not provide redress to those affected”, but that “the settlement now addresses” the consequences for the data subjects, as reported by the BBC.

European Commission Adopts UK Adequacy Decisions

5. July 2021

On June 28, 2021, the European Commission adopted two adequacy decisions for the United Kingdom, one under the General Data Protection Regulation (GDPR) and another under the Law Enforcement Directive.

This means that organizations in the EU can continue to transfer personal data to organizations in the UK without restriction or fear of repercussions. Thus, there is no need to rely on data transfer mechanisms such as the EU Standard Contractual Clauses to ensure an adequate level of protection when transferring personal data. This comes as a relief, as the bridging mechanism agreed for the interim period after Brexit was set to expire at the end of June 2021.

The European Commission found that the U.K.’s data protection system continues to be based on the same rules that were applicable when it was an EU member state, as it has “fully incorporated” the principles, rights and obligations of the GDPR and the Law Enforcement Directive into its post-Brexit legal system.

The Commission also noted that the U.K. system provides strong safeguards with regard to access to personal data by public authorities, particularly for issues of national security.

With regard to criticism of potential changes in the UK’s legal system concerning personal data, Věra Jourová, Vice-President for Values and Transparency, stated: “We have listened very carefully to the concerns expressed by the Parliament, the Member States and the European Data Protection Board, in particular on the possibility of future divergence from our standards in the UK’s privacy framework. We are talking here about a fundamental right of EU citizens that we have a duty to protect. This is why we have significant safeguards and if anything changes on the UK side, we will intervene.”

The Commission highlighted that the collection of data by UK intelligence authorities is legally subject to prior authorization by an independent judicial body and that any access to data needs to be necessary and proportionate to the purpose pursued. Individuals also have the ability to seek redress in the UK Investigatory Powers Tribunal.

More passenger data collected

1. July 2021

The German Federal Criminal Police Office regularly records so-called Passenger Name Records (PNR) for flights. These include, among other information, date of birth, names, e-mail addresses, any frequent flyer numbers and the means of payment used. The aim of the screening is to help track and prevent terrorist offences and serious crime.

Last year, the amount of passenger data collected increased significantly. A total of 105 million data records were collected by the Federal Criminal Police Office (BKA) on passengers taking off or landing in Germany. Approximately 31 million passengers are affected, including those who flew more than once. It should be highlighted that the number of passengers fell by 75 % compared to 2019 due to the coronavirus pandemic.

In 2019, by contrast, around 78 million passenger records of almost 24 million passengers were processed. Subsequently, 111,588 persons were checked against the police’s wanted persons database. The number of “technically positive” search hits was 1960, which corresponds to 0.082 per thousand.

In 2020, after matching against the police wanted persons database, 78,179 hits on persons remained in the system. The number of positive search hits increased to 5347, which nevertheless still corresponds to only 0.2 per thousand. Again, this number largely consists of erroneous hits.
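A quick plausibility check, assuming the per-thousand figures are calculated relative to the total number of passengers screened (roughly 24 million in 2019 and 31 million in 2020) rather than to the number of records or database checks:

\[
\frac{1960}{24\,000\,000} \approx 0.082 \ \text{per thousand}, \qquad \frac{5347}{31\,000\,000} \approx 0.17 \ \text{per thousand} \approx 0.2 \ \text{per thousand}
\]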

Several lawsuits against this dragnet-style screening are already pending before the European Court of Justice. The main criticism is that the screening is not proportionate, in particular because it affects uninvolved persons. In these cases, critics argue, the state should take a targeted rather than a generalised approach.

U.S. Senator Kirsten Gillibrand announces the Data Protection Act 2021

30. June 2021

U.S. Senator Kirsten Gillibrand announced in a press release on June 17, 2021, the reintroduction of the Data Protection Act of 2021. The intention is to create an independent federal agency, the Data Protection Agency, to better equip data protection in the U.S. for the digital age.

Since the first bill was drafted in 2020, it has undergone several updates. For example, the paper now includes adjusted rules to protect data subjects against privacy violations, monitor risky data practices, and examine the social, ethical, and economic impacts of data collection. In the press release, Gillibrand explains that the DPA would have three core tasks, driven by the goal of preventing risky data practices and regulating the collection, processing and sharing of personal data.

The first task, she says, is to give individuals control and protection over their own data. To this end, the agency would be given the authority to establish and enforce data protection rules; in implementing this, emphasis would also have to be placed on complaint handling. The authority would be given wide-ranging powers: for example, it would be able to conduct investigations and impose civil penalties, injunctions and other appropriate remedies to combat data privacy violations.

The second task would be to promote fair competition in the digital market. This can be achieved, for example, through the development and refinement of model standards, guidelines and policies to protect privacy and data protection. Companies should find it easier to deal with data protection. At the same time, the U.S. should be able to keep pace with leading nations in data protection.

In this context, the Data Protection Agency is to monitor data aggregators by maintaining a publicly available list of data aggregators that meet certain thresholds. At the same time, the FTC (Federal Trade Commission) would be required to report on the privacy and data protection implications of mergers involving major data aggregators or involving the transfer of personal data of 50,000 or more individuals. Lastly, the bill would also prohibit data aggregators from engaging in certain acts, for example abusive or discriminatory practices in connection with the processing or transfer of personal data. The goal, Gillibrand says, is also to prevent the re-identification of a person, household, or device from anonymized data.

A third important task is to prepare the U.S. government for the digital age. The agency is supposed to contribute to more education on digital issues by advising Congress on new privacy and technology issues. She says the agency would also participate as the U.S. representative in international privacy forums. The goal also is to ensure consistent regulatory treatment of personal data by federal and state agencies. To that extent, the authority would act as an interface between federal and state agencies.

Senator Gillibrand commented as follows: “In today’s digital age, Big Tech companies are free to sell individuals’ data to the highest bidder without fear of real consequences, posing a severe threat to modern-day privacy and civil rights. A data privacy crisis is looming over the everyday lives of Americans and we need to hold these bad actors accountable. (…) The U.S. needs a new approach to privacy and data protection and it’s Congress’ duty to step forward and seek answers that will give Americans meaningful protection from private companies that value profits over people.”
