Category: General
11. August 2021
On August 6, 2021, Amazon disclosed in an SEC filing that the Luxembourg data protection authority, the Commission nationale pour la protection des données (CNPD), had imposed a record-breaking €746 million fine on Amazon Europe Core S.à r.l. on July 16, 2021, for alleged violations of the EU General Data Protection Regulation (GDPR).
Based on press reports and Amazon’s public statements, the fine appears to relate to Amazon’s use of customer data for targeted advertising purposes.
The penalty is the result of a 2018 complaint by the French privacy rights group La Quadrature du Net, which aims to represent the interests of thousands of Europeans and to ensure that their data is used in accordance with data protection law, so that Big Tech companies cannot manipulate their behaviour for political or commercial purposes. The complaint, filed on behalf of more than 10,000 customers, also targets Apple, Facebook, Google and LinkedIn, and alleges that Amazon manipulates customers for commercial ends by choosing which advertising and information they receive.
Amazon stated that it “strongly disagree[s] with the CNPD’s ruling” and intends to appeal: “The decision relating to how we show customers relevant advertising relies on subjective and untested interpretations of European privacy law, and the proposed fine is entirely out of proportion with even that interpretation.”
The amount of the fine is substantially higher than the fine proposed in a draft decision previously reported in the press. The French data protection authority (CNIL) said Luxembourg’s decision is “of an unprecedented scale and marks a turning point in the application of the GDPR and the protection of the rights of European nationals.”
The CNIL confirmed that the CNPD fined Amazon and that the other European member states agreed to the Luxembourg decision. Amazon will have six months to correct the issue.
26. July 2021
In a joint statement, the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB) call for a general ban on the use of artificial intelligence for the automated recognition of human characteristics in publicly accessible spaces. This refers to surveillance technologies that recognise faces, human gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals. In addition to the AI-supported recognition of human characteristics in public spaces, the EDPS and EDPB also call for a ban on AI systems using biometrics to categorise individuals into clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights. With the exception of individual applications in the medical field, the EDPS and the EDPB are also calling for a ban on AI for sentiment recognition.
In April, the EU Commission presented a first draft law on the regulation of AI applications. The draft explicitly excluded the area of international law enforcement cooperation, and the EDPS and EDPB expressed “concern” about this exclusion from the scope of the draft. The draft is based on a categorisation of AI applications into different types of risk, which are to be regulated to different degrees depending on the level of risk they pose to fundamental rights. In principle, the EDPS and EDPB support this approach and the fact that the EU is addressing the issue at all. However, they call for this concept of fundamental rights risk to be adapted to the EU data protection framework.
Andrea Jelinek, EDPB Chair, and Wojciech Wiewiórowski, European Data Protection Supervisor, are quoted:
Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.
The EDPS and EDPB explicitly support the draft’s provision that national data protection authorities become the competent supervisory authorities for the application of the new regulation, and explicitly welcome that the EDPS is intended to be the competent authority and the market surveillance authority for the supervision of the Union institutions, agencies and bodies. The EU data protection authorities question, however, the predominant role the Commission assigns itself in the “European Artificial Intelligence Board”: “This contradicts the need for a European AI Board that is independent of political influence”. They call for the board to be given more autonomy to ensure its independence.
Worldwide, there is strong resistance to the use of biometric surveillance systems in public spaces. A large global alliance of 175 civil society organisations, academics and activists is calling for a ban on biometric surveillance in public spaces, concerned that the potential for abuse of these technologies is too great and the consequences too severe. For example, the BBC reports that China is testing a camera system on Uighurs in Xinjiang that uses AI and facial recognition to detect emotional states. This system is supposed to serve as a kind of modern lie detector and be used in criminal proceedings, for example.
15. July 2021
The state leadership in Beijing is tightening its data protection rules. Chinese driving service provider Didi has now become the subject of far-reaching data protection regulatory measures. Other companies could soon be affected as well.
For months now, Chinese regulators and ministries have been issuing a slew of new regulations that not only affect tech companies, but are also directed at how companies handle data in general.
A prime example of China’s “new” data protection policy can be seen in Didi’s public listing on the New York Stock Exchange. The Uber rival had been public for only a few days when the Chinese authorities urged it to remove its app from the app store before the end of the week. The reason is reported to have been serious data protection violations, which are now being investigated: the company is said to have collected and used personal data in a manner hostile to privacy.
Didi was ordered to comply with legal requirements and adhere to national standards. It should also ensure that the security of its users’ personal data is effectively protected.
The announcement sent shares of the stock market newcomer crashing by more than 5% as of Friday. The news also caused tech stocks to fall on Asian exchanges.
Didi is the nearly undisputed leader among ride-hailing services in China, with 493 million active users and a presence in 14 countries.
Beijing’s new data protection
The actions of the Chinese authorities against tech companies suggest that the Chinese leadership is rethinking its approach to data protection.
Much suggests that the state leadership primarily wants to bring companies further under its control. This is also intended to prevent third countries from obtaining data from Chinese companies and to prevent Chinese companies from establishing themselves abroad.
According to reports, a document from the State Council in Beijing indicates that stricter controls are planned for Chinese companies that are traded on stock exchanges abroad. Capital raised by emerging Chinese companies on foreign stock markets, such as in New York or Hong Kong, will also be subject to more stringent requirements. Especially in the area of “data security, cross-border data flow and management of confidential information”, new standards are to be expected.
However, the aim also seems to be to better protect the data of Chinese citizens from unauthorized access by criminals or from excessive data collection by tech groups and companies.
This is supported by the fact that the Chinese leadership has introduced several rules in recent years and months that are intended to improve data protection. While the state is not ceding any of its own rights here, citizens are to be given more rights, at least vis-à-vis companies.
The introduction of the European General Data Protection Regulation also forced Chinese technology companies to meet global data protection standards in order to expand abroad.
China’s data protection policy thus seems to be a contradiction in terms. It is a step towards more protection of the data subjects and at the same time another step towards more control.
14. July 2021
On July 8, 2021, the state of Colorado officially enacted the Colorado Privacy Act (CPA), which makes it the third state to have a comprehensive data privacy law, following California and Virginia. The Act will go into effect on July 1, 2023, with some specific provisions going into effect at later dates.
The CPA shares many similarities with the California Consumer Privacy Act (CCPA) and the Virginia Consumer Data Protection Act (CDPA), rather than introducing brand-new concepts. However, there are also differences. The CPA applies to controllers that conduct business in Colorado or target Colorado residents with their business, and that either control or process the personal data of more than 100,000 consumers in a calendar year, or derive revenue from personal data and control or process the personal data of more than 25,000 consumers. It is therefore broader than the CDPA and, unlike the CCPA, does not include a general revenue threshold.
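The applicability test described above can be sketched as a simple boolean check. This is purely an illustration of the thresholds as reported here, not a legal tool; the function and parameter names are our own.

```python
def cpa_applies(conducts_business_in_colorado: bool,
                consumers_per_year: int,
                derives_revenue_from_data: bool) -> bool:
    """Rough sketch of the CPA applicability thresholds described above.

    A controller falls under the CPA if it conducts business in Colorado
    (or targets Colorado residents) AND either controls/processes the
    personal data of more than 100,000 consumers per calendar year, or
    derives revenue from personal data while controlling/processing the
    data of more than 25,000 consumers.
    """
    if not conducts_business_in_colorado:
        return False
    if consumers_per_year > 100_000:
        return True
    return derives_revenue_from_data and consumers_per_year > 25_000

# A business with 30,000 consumers is covered only if it monetises the data:
print(cpa_applies(True, 30_000, True))   # True
print(cpa_applies(True, 30_000, False))  # False
```

Note how the 25,000-consumer threshold only matters in combination with the revenue condition, which is the key difference from the CCPA's stand-alone revenue test.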
Similar to the CDPA, the CPA defines a consumer as “a Colorado resident acting only in an individual or household context” and explicitly omits individuals acting in “a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. As a result, controllers do not need to consider the employee personal data they collect and process when applying the CPA.
The CPA further defines “the sale of personal information” as “the exchange of personal data for monetary or other valuable consideration by a controller to a third party”. Importantly, the definition of “sale” explicitly excludes certain types of disclosures, as is the case in the CDPA, such as:
- Disclosures to a processor that processes the personal data on behalf of a controller;
- Disclosures of personal data to a third party for purposes of providing a product or service requested by the consumer;
- Disclosures or transfers of personal data to an affiliate of the controller;
- Disclosure or transfer to a third party of personal data as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller’s assets;
- Disclosures of personal data that the consumer directs the controller to disclose or intentionally discloses by using the controller to interact with a third party, or that the consumer intentionally makes available to the general public via a channel of mass media.
The CPA provides five main consumer rights: the right of access, the right of correction, the right of deletion, the right to data portability and the right to opt out. For the latter, the procedure differs from that of the other laws: the CPA mandates that a controller provide consumers with the right to opt out as well as a universal opt-out option, so that a consumer can click one button to exercise all opt-out rights.
In addition, the CPA also provides the consumer with a right to appeal a business’ denial to take action within a reasonable time period.
The CPA differentiates between controller and processor in a similar way that the European General Data Protection Regulation (GDPR) does and follows, to an extent, similar basic principles such as duty of transparency, duty of purpose specification, duty of data minimization, duty of care and duty to avoid secondary use. In addition, it follows the principle of duty to avoid unlawful discrimination, which prohibits controllers from processing personal data in violation of state or federal laws that prohibit discrimination.
7. July 2021
Back in 2018, British Airways was hit by a data breach affecting up to 500,000 data subjects – customers as well as British Airways staff.
Following the breach, the UK Information Commissioner’s Office (ICO) initially fined British Airways a record £183 million (approx. €205 million) in 2019, due to the severe consequences of the breach. As reported, the hackers accessed, inter alia, e-mail addresses and credit card information of the data subjects concerned.
The ICO reduced the initial record fine in 2020 after British Airways appealed against it, announcing the final sanction of £20 million (approx. €22 million) in October 2020. Reasons for the reduction included, inter alia, the COVID-19 situation and its consequences for the aviation industry.
Most recently, it was announced that British Airways also reached a settlement in a UK data breach class action with up to 16,000 claimants. The details of the settlement have been kept confidential, so the settlement sum is not known, but the law firm representing the claimants, PGMBM, as well as British Airways announced the settlement on July 6th.
PGMBM further explained that the ICO’s fine “did not provide redress to those affected”, but that “the settlement now addresses” the consequences for the data subjects, as reported by the BBC.
5. July 2021
On June 28, 2021, the European Commission adopted two adequacy decisions for the United Kingdom, one under the General Data Protection Regulation (GDPR) and another under the Law Enforcement Directive.
This means that organizations in the EU can continue to transfer personal data to organizations in the UK without restriction and without fear of repercussions. There is thus no need to rely on data transfer mechanisms such as the EU Standard Contractual Clauses to ensure an adequate level of protection when transferring personal data. This comes as a relief, as the bridging mechanism for the interim period agreed after Brexit was set to expire at the end of June 2021.
The European Commission found that the UK’s data protection system continues to be based on the same rules that were applicable when the UK was an EU member state, as it has “fully incorporated” the principles, rights and obligations of the GDPR and the Law Enforcement Directive into its post-Brexit legal system.
The Commission also noted the U.K. system provides strong safeguards in regards to how it handles personal data access by public authorities, particularly for issues of national security.
Regarding criticism of potential changes to the UK’s legal system concerning personal data, Věra Jourová, Vice-President for Values and Transparency, stated: “We have listened very carefully to the concerns expressed by the Parliament, the Member States and the European Data Protection Board, in particular on the possibility of future divergence from our standards in the UK’s privacy framework. We are talking here about a fundamental right of EU citizens that we have a duty to protect. This is why we have significant safeguards and if anything changes on the UK side, we will intervene.”
The Commission highlighted that the collection of data by UK intelligence authorities is legally subject to prior authorization by an independent judicial body and that any access to data needs to be necessary and proportionate to the purpose pursued. Individuals also have the ability to seek redress in the UK Investigatory Powers Tribunal.
30. June 2021
U.S. Senator Kirsten Gillibrand announced in a press release on June 17, 2021, the reintroduction of the Data Protection Act of 2021. The intention is to create an independent federal agency, the Data Protection Agency, to better equip data protection in the U.S. for the digital age.
Since the first bill was drafted in 2020, it has undergone several updates. For example, the bill now includes adjusted rules to protect data subjects against privacy violations, monitor risky data practices, and examine the social, ethical, and economic impacts of data collection. In the press release, Gillibrand explains that the DPA will have three core tasks, driven by the goal of preventing risky data practices and regulating the collection, processing and sharing of personal data.
The first goal, she says, is to give individuals control and protection over their own data. To this end, data subjects should be given the right to establish and enforce data protection rules. To implement this, emphasis would also have to be placed on complaint handling. The authority would also be given wide-ranging powers. For example, it would be able to conduct investigations and administer civil penalties, injunctions and other appropriate remedies to combat data privacy violations.
The second task would be to promote fair competition in the digital market. This can be achieved, for example, through the development and refinement of model standards, guidelines and policies to protect privacy and data protection. Companies should find it easier to deal with data protection. At the same time, the U.S. should be able to keep pace with leading nations in data protection.
In this context data aggregators are to be monitored by the Data Protection Agency by maintaining a publicly available list of such data aggregators that meet certain thresholds. The FTC (Federal Trade Commission) would at the same time be required to report on the privacy and data protection implications of mergers involving major data aggregators or involving the transfer of personal data of 50,000 or more individuals. The bill would also lastly prohibit data aggregators from certain acts. For example, it would prevent the commission of abusive or discriminatory acts in connection with the processing or transfer of personal data. The goal, Gillibrand says, is also to prevent the identification of a person, household, or device from anonymized data.
A third important task is to prepare the U.S. government for the digital age. The agency is supposed to contribute to more education on digital issues by advising Congress on new privacy and technology issues. She says the agency would also participate as the U.S. representative in international privacy forums. The goal also is to ensure consistent regulatory treatment of personal data by federal and state agencies. To that extent, the authority would act as an interface between federal and state agencies.
Senator Gillibrand commented as follows: “In today’s digital age, Big Tech companies are free to sell individuals’ data to the highest bidder without fear of real consequences, posing a severe threat to modern-day privacy and civil rights. A data privacy crisis is looming over the everyday lives of Americans and we need to hold these bad actors accountable. (…) The U.S. needs a new approach to privacy and data protection and it’s Congress’ duty to step forward and seek answers that will give Americans meaningful protection from private companies that value profits over people.”
28. June 2021
Ransomware attacks are on a steep rise as the global pandemic continues. According to the cybersecurity firm SonicWall, there were more than 304 million attempted ransomware attacks tracked by them in 2020, a 62 percent increase over 2019. During the first five months of 2021, the firm detected another 116 percent increase in ransomware attempts compared to the same period in 2020. Another cybersecurity firm, Cybereason, found in a recent study interviewing nearly 1,300 security professionals from all around the world that more than half of organisations have been the victim of a ransomware attack, and that 80 percent of businesses that decided to pay a ransom fee suffered a second ransomware attack, often by the same cybercriminals.
Ransomware is a type of malicious software which encrypts files, databases, or applications on a computer or network and holds them hostage, or even threatens to publish data, until the owner pays the attacker the requested fee. Captured data may include personal data, business data and intellectual property. While phishing attacks are the most common gateway for ransomware, there are also highly targeted attacks on financially strong companies and institutions (“big game hunting”).
Alluding to the industry term Software-as-a-Service (SaaS), a new unlawful industry sub-branch has emerged in recent years, which according to security experts lowered the entrance barriers to this industry immensely: Ransomware-as-a-Service (RaaS). With RaaS, a typical monthly subscription could cost around 50 US-Dollars and the purchaser receives the ransomware code and decryption key. Sophisticated RaaS offerings even include customer service and dashboards that allow hackers to track the status of infections and the status of ransomware payments. Thus, cybercriminals do not necessarily have to have the technical skills themselves to create corresponding malware.
Experts point to various factors contributing to the recent increase in ransomware attacks. One factor is a consequence of the pandemic: the worldwide trend towards working from home. Many companies and institutions were abruptly forced to introduce remote working and let employees use their own private equipment. Furthermore, many companies were not prepared to face the rising threats to their cybersecurity management. Another reported factor is the recent increase in value of the cryptocurrency Bitcoin, the currency preferred by criminals for ransom payments.
Successful Ransomware attacks can lead to personal data breaches pursuant to Art. 4 No. 12 GDPR and can also lead to the subsequent obligation to report the data breach to the supervisory authorities (Art. 33 GDPR) and to the data subjects (Art. 34 GDPR) for the affected company. Businesses are called to implement appropriate technical and organisational measures based on the risk-based approach, Art. 32 GDPR.
Earlier this month, the Danish Data Protection Authority provided companies with practical guidance on how to mitigate the risk of ransomware attacks. Measures to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems when faced with ransomware may include providing regular trainings for employees, having a high level of technical protection of systems and networks in place, patching programs in a timely manner, and storing backups in an environment other than the normal network.
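As a minimal sketch of the last measure, storing backups outside the normal environment, a backup routine can copy data to a separate location and record a checksum, so that later tampering (for example, encryption by ransomware) can be detected. This is our own illustration, not part of the Danish DPA guidance; all names are hypothetical.

```python
import hashlib
import shutil
from pathlib import Path

def backup_with_checksum(source: Path, offsite_dir: Path) -> str:
    """Copy a file to a separate backup location and record its SHA-256.

    In practice `offsite_dir` would live outside the normal network
    (e.g. detachable or offline storage), per the guidance above.
    """
    offsite_dir.mkdir(parents=True, exist_ok=True)
    dest = offsite_dir / source.name
    shutil.copy2(source, dest)  # preserves file metadata
    digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    (offsite_dir / f"{source.name}.sha256").write_text(digest)
    return digest

def verify_backup(offsite_dir: Path, name: str) -> bool:
    """Return True if the backup copy still matches its recorded checksum."""
    expected = (offsite_dir / f"{name}.sha256").read_text()
    actual = hashlib.sha256((offsite_dir / name).read_bytes()).hexdigest()
    return actual == expected
```

Regularly running the verification step against backups is what gives this approach its value: an unnoticed, silently encrypted backup is as useless as no backup at all.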
25. June 2021
On 16 June 2021, the European Commission published the draft adequacy decision for South Korea and transmitted it to the European Data Protection Board (EDPB) for consultation, thus launching the formal procedure towards the adoption of the adequacy decision. In 2017, the Commission announced that it would prioritise discussions on possible adequacy decisions with important trading partners in East and South-East Asia, starting with Japan and South Korea. The adequacy decision for Japan was adopted in 2019.
In the past, the Commission diligently reviewed South Korea’s law and practices with regards to data protection. In the course of ongoing negotiations with South Korea, the investigative and enforcement powers of the Korean data protection supervisory authority “PIPC” were strengthened, among other things. After the EDPB has given its opinion, the adequacy decision will need to be approved by a committee composed of representatives of the EU Member States.
A Commission decision pursuant to Art. 45 of the General Data Protection Regulation (GDPR) that a third country ensures an adequate level of protection is one of the ways to transfer personal data from the EU to that country in a GDPR-compliant manner. The adequacy decision will serve as an important addition to the free trade agreement and strengthen cooperation between the EU and South Korea. Věra Jourová, the Commission’s Vice-President for Values and Transparency, stated after launching the formal procedure:
“This agreement with the Republic of Korea will improve the protection of personal data for our citizens and support business in dynamic trade relations. It is also a sign of an increasing convergence of data protection legislation around the world. In the digitalised economy, free and safe data flows are not a luxury, but a necessity.”
Especially in light of the Schrems II decision of the Court of Justice of the European Union, the adequacy decision for South Korea will be an invaluable asset for European and South Korean companies conducting business with each other.
On June 15th, 2021, the Court of Justice of the European Union (CJEU) ruled that “under certain conditions, a national supervisory authority may exercise its power to bring any alleged infringement of the GDPR before a court of a member state, even though that authority is not the lead supervisory authority”. It grants each supervisory authority the power to bring matters within its supervisory area before the courts. If a non-lead supervisory authority wishes to bring cross-border cases to court, it can do so under the so-called emergency procedure under Article 66 of the GDPR.
The General Data Protection Regulation (GDPR) provides that the data protection authority of the country in which a company has its principal place of business in the EU has primary jurisdiction for cross-border proceedings against such companies (the so-called one-stop-shop principle). Facebook and a number of other international companies have their EU headquarters in Ireland. The Irish data protection authority has been criticised several times for dragging out numerous important cases against tech companies. The CJEU’s ruling is likely to lead to more enforcement proceedings by local data protection authorities.
In 2015 – before the GDPR came into force – the Belgian data protection authority filed a lawsuit in the Belgian courts against Facebook’s collection of personal data via hidden tracking tools, which tracked even users without Facebook accounts. After the GDPR came into force, Facebook argued that lawsuits against data protection violations could only be filed in Ireland. A court of appeal in Brussels then referred to the CJEU the question of whether proceedings against Facebook were admissible in Belgium, which the CJEU has now confirmed. The Belgian court is now free to make a final decision (please see our blog post).
The CJEU has now ruled that, in principle, the lead data protection authority is responsible for prosecuting alleged GDPR violations if they involve cross-border data processing. The data processing must therefore take place in more than one Member State or have an impact on individuals in several member states. However, it is also specified that the “one-stop-shop” principle of the GDPR obliges the lead authority to cooperate closely with the respective local supervisory authority concerned. In addition, local data protection authorities may also have jurisdiction pursuant to Art. 56 (2) and Art. 66 GDPR. According to the CJEU, if the respective requirements of these provisions are met, a local supervisory authority may also initiate legal proceedings. The CJEU has clarified that actions by non-lead data protection authorities can still be upheld if they are based on the Data Protection Directive, the predecessor of the GDPR.
The EU consumer association BEUC called the ruling a positive development. BEUC Director General Monique Goyens said:
Most Big Tech companies are based in Ireland, and it should not be up to that country’s authority alone to protect 500 million consumers in the EU.
While Facebook’s associate general counsel Jack Gilbert said:
We are pleased that the CJEU has upheld the value and principles of the one-stop-shop mechanism, and highlighted its importance in ensuring the efficient and consistent application of GDPR across the EU.