
UK announces Data Reform Bill

31. May 2022

In 2021 the Department for Digital, Culture, Media and Sport (DCMS) published a consultation document entitled “Data: a new direction”, requesting opinions on proposals that could bring changes to the UK’s data protection regime. On May 10, 2022, as part of the Queen’s Speech, Prince Charles confirmed that the government of the United Kingdom (UK) is in the process of reforming its data privacy rules, raising questions about whether the country will remain aligned with the General Data Protection Regulation (GDPR).

Other than the statement itself, not many specific details were provided; the accompanying briefing notes offered more information. They set out the main purposes of the Bill, namely to:

  • Establish a new pro-growth and trusted data protection framework
  • Reduce burdens on businesses
  • Create a world-class data rights regime
  • Support innovation
  • Drive industry participation in schemes which give citizens and small businesses more control of their data, particularly in relation to health and social care
  • Modernize the Information Commissioner’s Office (ICO), including strengthening its enforcement powers and increasing its accountability

Nevertheless, the stated goals remain rather superficial. Another concern is that the new bill could deviate too far from the GDPR, in which case the new regime might not retain the UK’s adequacy status with the EU, which currently allows personal data to be exchanged between UK and EU organizations. Prime Minister Johnson said that the Data Reform Bill would “improve the burdensome GDPR, allowing information to be shared more effectively and securely between public bodies.” So far, no time frame for the adoption of the new law has been published.

Twitter fined $150m for handing users’ contact details to advertisers

30. May 2022

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was ostensibly used for purposes such as two-factor authentication. However, in addition to the above-mentioned purposes, Twitter also used that data to allow advertisers to target specific groups of users by matching phone numbers and email addresses with advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false representations violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 

Record GDPR fine by the Hungarian Data Protection Authority for the unlawful use of AI

22. April 2022

The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report, in which it presented a case where the Authority imposed its highest fine to date, approximately €670,000 (HUF 250 million).

This case involved the processing of personal data by a bank acting as a data controller. The bank automatically analyzed recorded audio of customer calls using artificial intelligence-based speech signal processing software, which evaluated each call against a list of keywords and the emotional state of the caller. The software then ranked the calls, serving as a recommendation as to which callers should be called back as a priority.
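To illustrate the kind of processing the Authority objected to, the short Python sketch below shows a purely hypothetical call-ranking step of this sort. The keyword list, the emotion score and all names are invented for illustration; they are not taken from the bank’s actual system or from the decision.

# Hypothetical illustration only: ranks recorded calls for follow-up based on
# keyword hits and an assumed emotion score, loosely mirroring the kind of
# processing described in the NAIH decision. Not the bank's actual system.
from dataclasses import dataclass

NEGATIVE_KEYWORDS = {"complaint", "cancel", "terminate", "unhappy"}  # assumed keyword list

@dataclass
class Call:
    caller_id: str
    transcript: str
    emotion_score: float  # assumed output of a speech-emotion model: 0.0 (calm) to 1.0 (upset)

def priority(call: Call) -> float:
    """Higher value = recommend calling back sooner."""
    keyword_hits = sum(word in call.transcript.lower() for word in NEGATIVE_KEYWORDS)
    return keyword_hits + call.emotion_score

calls = [
    Call("A-001", "I want to cancel my account, this is a complaint", 0.9),
    Call("A-002", "Just checking my balance, thanks", 0.1),
]

# Ranking used as a callback recommendation, most urgent first.
for call in sorted(calls, key=priority, reverse=True):
    print(call.caller_id, round(priority(call), 2))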

The bank justified the processing on the basis of its legitimate interests in retaining its customers and improving the efficiency of its internal operations.

According to the bank, this procedure was aimed at quality control, in particular at the prevention of customer complaints. However, the Authority held that the bank’s privacy notice referred to these processing activities only in general terms, and that no material information was made available regarding the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the data processing.

In addition, the Authority highlighted that while the Bank had conducted a data protection impact assessment and found that the processing posed a high risk to data subjects due to its ability to profile and perform assessments, the data protection impact assessment did not provide substantive solutions to address these risks. The Authority also emphasized that the legal basis of legitimate interest cannot serve as a “last resort” when all other legal bases are inapplicable, and therefore data controllers cannot rely on this legal basis at any time and for any reason. Consequently, the Authority not only imposed a record fine, but also required the bank to stop analyzing emotions in the context of speech analysis.

 

Google launches “Reject All” button on cookie banners

After being hit with a €150 million fine by France’s data protection agency CNIL earlier in the year for making the process of rejecting cookies unnecessarily confusing and convoluted for users, Google has added a new “Reject All” button to the cookie consent banners that have become ubiquitous on websites in Europe. Users visiting Search and YouTube in Europe while signed out or in incognito mode will soon see an updated cookie dialogue with “Reject All” and “Accept All” buttons.

Previously, users only had two options: “I accept” and “personalize.” While this allowed users to accept all cookies with a single click, they had to navigate through various menus and options if they wanted to reject all cookies. “This update, which began rolling out earlier this month on YouTube, will provide you with equal “Reject All” and “Accept All” buttons on the first screen in your preferred language,” wrote Google product manager Sammit Adhya in a blog post.

According to Google, it has kicked off the rollout of the new cookie banner in France and will soon extend the change to all Google users in Europe, the U.K., and Switzerland.

Google’s plan to include a “Reject All” button on cookie banners, after its existing practice was found to violate EU law, was also welcomed by Hamburg’s Commissioner for Data Protection and Freedom of Information, Thomas Fuchs, during the presentation of his 2021 activity report.

But the introduction of the “Reject All” button is likely to be only an interim solution, because at the end of January the US giant already presented far-reaching plans to phase out third-party cookies altogether by 2023.

Instead of cookies, the internet giant wants to rely on in-house tracking technology for the Google Privacy Sandbox project.

UK’s new data protection clauses now in force

31. March 2022

After the British government announced reforms to the UK’s data protection system last year, the Secretary of State submitted a framework to Parliament on February 2nd, 2022, to regulate international data transfers and replace the EU Standard Contractual Clauses (SCC). As no objections were raised and Parliament approved the documents, they entered into force on March 21st, 2022.

The set of rules consists of the International Data Transfer Agreement (IDTA), the International Data Transfer Addendum to the European Commission’s SCC for international data transfers (Addendum) and a Transitional Provisions document. The transfer rules are issued under Section 119A of the Data Protection Act 2018 and take into account the binding judgement of the European Court of Justice in the case commonly referred to as “Schrems II”.

The documents serve as a new tool for compliance with Art. 46 UK GDPR for data transfers to third countries and broadly mirror the rules of the EU GDPR. The UK government also retained the ability to issue its own adequacy decisions regarding data transfers to other third countries and international organizations.

The transfer rules are of immediate benefit to organizations transferring personal data outside the UK. In addition, the transitional provisions allow organizations to rely on the EU SCC until March 21st, 2024, for contracts entered into up to and including September 21st, 2022. However, this is subject to the condition that the data processing activities remain unchanged and that the clauses ensure adequate safeguards.

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

29. March 2022

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been a long time in the making and reflects more than a year of detailed negotiations between the US and the EU, led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. Therefore, it will be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will be responsible for the second tier of the new two-level redress system, which will allow EU citizens to raise complaints about access to their data by US intelligence authorities, and will investigate and resolve those complaints.

The US commitments will be implemented by an Executive Order, which will form the basis of the European Commission’s adequacy decision putting the new framework in place. While this represents a quicker route to the goal, it also means that the Executive Order can easily be repealed by the next US administration. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

ICO releases Guidance on Video Surveillance

7. March 2022

At the end of February 2022, the UK Information Commissioner’s Office (ICO) published guidance for organizations that capture CCTV footage, providing advice for operating video surveillance systems that view or record individuals.

The recommendations focus on best practices for data activities related to “emerging capabilities that can assist human decision making, such as the use of Facial Recognition Technology and machine learning algorithms.” As per the Guidance, surveillance systems specifically include traditional CCTV, Automatic Number Plate Recognition, Body Worn Video, drones, Facial Recognition Technology, dashcams and smart doorbell cameras.

In its Guidance, the ICO offers checklists that controllers can use to monitor their use of video surveillance and keep track of their compliance with the applicable law. It further touches on the principles of data protection and how they specifically apply to video surveillance. In addition, it helps companies with the documentation of a Data Protection Impact Assessment.

The Guidance also gives in-depth advice on video surveillance in the workplace, as well as on whether video feeds should also record audio.

Overall, the Guidance aims to raise controllers’ awareness of the various issues they face when using video surveillance and gives them in-depth help on how to comply with data protection regulations in the UK.

Apps are tracking personal data despite information to the contrary

15. February 2022

Tracking in apps enables app providers to offer users personalized advertising. On the one hand, this generates higher revenues for app providers. On the other hand, it leads to data processing practices which are not compliant with the GDPR.

For a year now, data privacy labels have been mandatory; they are designed to show which personal data app providers access (article in German) and pass on to third parties. Although these labels on iPhones state that no data access takes place, 80% of the analyzed applications carrying such labels nevertheless access data by tracking personal information. This is the conclusion of an analysis by an IT specialist at the University of Oxford.

For example, the “RT News” app, which supposedly does not collect data, actually provides various sets of data to tracking services such as Facebook, Google, ComScore and Taboola. Such data transfers, which may reveal sensitive information about the content viewed, would have to be disclosed in the apps’ privacy labels.

In particular, GPS location information accessed by apps is sold on by data companies. This constitutes an abuse, because personal data is handled in a manner that is not compliant with data protection law and is provided to third parties illegally.

In an analysis published in the journal Internet Policy Review, tests of two million Android apps showed that nearly 90 percent of the apps in Google’s Play Store share data with third parties directly after the app is launched. Google, however, points out that labels containing false claims about not tracking personal data come from the app providers themselves, and thereby evades responsibility for the implementation of these labels. Apple, by contrast, asserts that it checks the labels for correctness.

Putting this into perspective, the issue raises the question of whether these privacy labels make the use of apps safer in terms of data protection. One can argue that, if app developers can simply award themselves these labels under Google’s approach, Apple’s approach seems more legitimate. It remains to be seen whether any action will be taken in this regard.

CNIL judges use of Google Analytics illegal

14. February 2022

On February 10th, 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), pronounced the use of Google Analytics on European websites not to be in line with the requirements of the General Data Protection Regulation (GDPR) and ordered the website owner concerned to comply with the requirements of the GDPR within a month.

The CNIL reached this decision in response to several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. In all, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA in the 27 EU Member States and the three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently not sufficiently regulated. In the absence of an adequacy decision concerning transfers to the USA, the transfer of data can only take place if appropriate guarantees are provided for this data flow. However, while Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, the CNIL deemed that those measures are not sufficient to exclude the accessibility of the personal data for US intelligence services. This would result in “a risk for French website users who use this service and whose data is exported”.

The CNIL stated therefore that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR. The CNIL therefore ordered the website manager to bring this processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.”

The CNIL has also given advice regarding website audience measurement and analysis services. For these purposes, the CNIL recommended that such tools be used only to produce anonymous statistical data. This would allow for an exemption from the consent requirements, as the aggregated data would not be considered “personal” data and would therefore not fall within the scope of the GDPR, provided the data controller ensures that there are no illegal transfers.

(Update) Processing of COVID-19 immunization data of employees in non-EEA countries

21. January 2022

With COVID-19 vaccination campaigns well under way, employers are faced with the question of whether they are legally permitted to ask employees about their COVID-19 related information and, if so, how that information may be used.

COVID-19 related information, such as vaccination status, whether an employee has recovered from an infection or whether an employee is currently infected with COVID-19, is considered health data. This type of data is regarded as particularly sensitive in most data protection regimes and may only be processed under strict conditions. Art. 9 (1) General Data Protection Regulation (GDPR) (EU), Art. 9 (1) UK GDPR (UK), Art. 5 (II) General Personal Data Protection Law (LGPD) (Brazil) and Section 1798.140 (b) California Consumer Privacy Act of 2018 (CCPA) (California) all consider health-related information to be sensitive personal data. However, the question of whether COVID-19-related data may be processed by an employer is evaluated differently, even in the context of the same data protection regime such as the GDPR.

Below, we discuss whether employers in different countries outside the European Economic Area (EEA) are permitted to process COVID-19-related data about their employees.

Brazil: According to the Labor Code (CLT), employers in Brazil have the right to require their employees to be vaccinated. The employer is responsible for the health and safety of its employees in the workplace and therefore has the right to take reasonable measures to ensure health and safety in the workplace. Since employers can require their employees to be vaccinated, they can also require proof of vaccination. As LGPD considers this information to be sensitive personal data, special care must be taken in processing it.

Hong Kong: An employer may require its employees to disclose their immunization status. Under the Occupational Safety and Health Ordinance (OSHO), employers are required to take all reasonably practicable measures to ensure the safety and health of all their employees in the workplace. Vaccination may be considered as part of a COVID-19 risk assessment as a possible additional measure to mitigate the risks associated with infection with the virus in the workplace. The requirement for vaccination must be lawful and reasonable. Employers may decide, following such a risk assessment, that a vaccinated workforce is necessary and appropriate to mitigate the risk. In this case, the employer must comply with the Personal Data (Privacy) Ordinance (PDPO). Among other things, the PDPO requires that the collection of data must be necessary for the purpose for which it is collected and that the data must not be kept longer than is necessary for that purpose. According to the PDPO, before collecting data, the employer must inform the employee whether the collection is mandatory or voluntary for the employee and, if mandatory, what the consequences are for the employee if he or she does not provide the data.

Russia: Employers must verify which employees have been vaccinated and record this information if such vaccinations are required by law. If a vaccination is not required by law, the employer may require this information, but employees have the right not to provide it. If the information on vaccinations is provided on a voluntary basis, the employer may keep it in the employee’s file, provided that the employee consents in writing to the processing of the personal data. An employer may impose mandatory vaccination if an employee performs an activity involving a high risk of infection (e.g. employees in educational institutions, organizations working with infected patients, laboratories working with live cultures of pathogens of infectious diseases or with human blood and body fluids, etc.) and a corresponding vaccination is listed in the national calendar of protective vaccinations for epidemic indications. All these cases are listed in the Decree of the Government of the Russian Federation dated July 15, 1999 No 825.

UK: An employer may inquire about an employee’s vaccination status or conduct tests on employees if it is proportionate and necessary for the employer to comply with its legal obligation to ensure health and safety at work. The employer must be able to demonstrate that the processing of this information is necessary for compliance with its health and safety obligations under employment law, Art. 9 (2) (b) UK GDPR. The employer must also conduct a data protection impact assessment to evaluate the necessity of the data collection and balance that necessity against the employee’s right to privacy. A policy for the collection of such data and its retention is also required. The information must be retained only as long as it is needed. There must also be no risk of unlawful discrimination, e.g. a reason for refusing vaccination could be a characteristic protected from discrimination by the Equality Act 2010.

In England, mandatory vaccination is in place for staff in care homes, and from April 2022, this will also apply to staff with patient contact in the National Health Service (NHS). Other parts of the UK have not yet introduced such rules.

USA: The Equal Employment Opportunity Commission (EEOC) published a document stating that an employer may implement a vaccination policy as a condition of physically returning to the workplace. Before implementing a vaccination requirement, an employer should consider whether there are any relevant state laws or regulations that might change the requirements for such a provision. If an employer asks an unvaccinated employee why he or she has not been vaccinated or does not want to be vaccinated, such questions may elicit information about a disability and would therefore fall under the standard for disability-related questions. Because immunization records are personally identifiable information about an employee, the information must be recorded, handled, and stored as confidential medical information. If an employer administers the vaccine to its employees itself, or contracts with a third party to do so, it must demonstrate that the screening questions are “job-related and consistent with business necessity.”

On November 5th, 2021, the U.S. Occupational Safety and Health Administration (OSHA) released an emergency temporary standard (ETS) requiring affected employers to take affirmative action on COVID-19 safety, including adopting a policy requiring full COVID-19 vaccination of employees or giving employees the choice of either being vaccinated against COVID-19 or undergoing regular COVID-19 testing and wearing a face covering. On November 12th, 2021, a court of appeals suspended enforcement of the ETS pending a decision on a permanent injunction. While this suspension is pending, OSHA cannot take any steps to implement or enforce the ETS.

In the US there are a number of different state and federal workplace safety, employment, and privacy laws that provide diverging requirements on processing COVID-19 related information.
