Category: China

TikTok faces huge fine from Britain’s ICO

12. October 2022

The Chinese social media giant has recently been the subject of an investigation by the British data protection watchdog, the Information Commissioner’s Office (ICO). The ICO’s provisional finding is that the social media network may have breached the United Kingdom’s data protection laws, in particular the rules protecting children’s personal data. The Authority therefore issued a notice of intent, a potential precursor to a fine of up to £27 million.

In particular, the Authority found that the platform may have processed personal data of children under the age of 13 without obtaining parental consent for the processing of these data. The data allegedly also include special category data, which enjoy special protection under Art. 9 GDPR.

Furthermore, in the ICO’s view the platform failed to respect the principle of transparency by not providing complete or transparent information about how the data are collected and processed.

The ICO’s investigation is still ongoing: the Commissioner’s Office has yet to reach a final conclusion on whether there has been a breach of data protection law and whether to impose the fine.

The protection of teenagers and children is the ICO’s top priority, according to current Information Commissioner John Edwards. Under his leadership, the ICO has several ongoing investigations targeting various tech companies that may be breaching the UK’s data protection laws.

This is not the first time TikTok has come under scrutiny from data protection watchdogs. In July, a US-Australian cybersecurity firm found that TikTok gathers excessive amounts of information from its users and voiced concern over its findings. Given these precedents, local data protection authorities may well step up their efforts to monitor TikTok’s compliance with local laws and, in Europe, with the GDPR.

(Update) Processing of COVID-19 immunization data of employees in non-EEA countries

21. January 2022

With COVID-19 vaccination campaigns well under way, employers are faced with the question of whether they are legally permitted to ask employees about their COVID-19 related information and, if so, how that information may be used.

COVID-19-related information, such as vaccination status, whether an employee has recovered from an infection or whether an employee is currently infected with COVID-19, is considered health data. Most data protection regimes treat this type of data as particularly sensitive and allow it to be processed only under strict conditions: Art. 9 (1) General Data Protection Regulation (GDPR) (EU), Art. 9 (1) UK GDPR (UK), Art. 5 (II) General Personal Data Protection Law (LGPD) (Brazil) and Sec. 1798.140 (b) of the California Consumer Privacy Act of 2018 (CCPA) (California) all classify health-related information as sensitive personal data. However, the question of whether an employer may process COVID-19-related data is evaluated differently, even within the same data protection regime such as the GDPR.

Below, we discuss whether employers in several countries outside the European Economic Area (EEA) are permitted to process COVID-19-related data about their employees.

Brazil: According to the Labor Code (CLT), employers in Brazil have the right to require their employees to be vaccinated. The employer is responsible for the health and safety of its employees in the workplace and therefore has the right to take reasonable measures to ensure health and safety there. Since employers can require their employees to be vaccinated, they can also require proof of vaccination. As the LGPD considers this information to be sensitive personal data, special care must be taken when processing it.

Hong Kong: An employer may require its employees to disclose their immunization status. Under the Occupational Safety and Health Ordinance (OSHO), employers are required to take all reasonably practicable measures to ensure the safety and health of all their employees in the workplace. Vaccination may be considered in COVID-19 risk assessments as a possible additional measure to mitigate the risks associated with infection in the workplace. The requirement for vaccination must be lawful and reasonable. Employers may decide, following such a risk assessment, that a vaccinated workforce is necessary and appropriate to mitigate the risk. In this case, the employer must comply with the Personal Data (Privacy) Ordinance (PDPO). Among other things, the PDPO requires that the collection of data be necessary for the purpose for which it is collected and that the data not be kept longer than is necessary for that purpose. Before collecting the data, the employer must inform the employee whether the collection is mandatory or voluntary and, if mandatory, what the consequences are if he or she does not provide the data.

Russia: Employers must verify which employees have been vaccinated and record this information if such vaccinations are required by law. If a vaccination is not required by law, the employer may request this information, but employees have the right not to provide it. If the information on vaccinations is provided on a voluntary basis, the employer may keep it in the employee’s file, provided that the employee consents in writing to the processing of the personal data. An employer may impose mandatory vaccination if an employee performs an activity involving a high risk of infection (e.g. employees in educational institutions, organizations working with infected patients, laboratories working with live cultures of pathogens of infectious diseases or with human blood and body fluids, etc.) and a corresponding vaccination is listed in the national calendar of protective vaccinations for epidemic indications. All these cases are listed in Decree No. 825 of the Government of the Russian Federation dated July 15, 1999.

UK: An employer may inquire about an employee’s vaccination status or conduct tests on employees if it is proportionate and necessary for the employer to comply with its legal obligation to ensure health and safety at work. The employer must be able to demonstrate that the processing of this information is necessary for compliance with its health and safety obligations under employment law, Art. 9 (2) (b) UK GDPR. The employer must also conduct a data protection impact assessment to evaluate the necessity of the data collection and balance that necessity against the employee’s right to privacy. A policy for the collection and retention of such data is also required. The information must be retained only as long as it is needed. There must also be no risk of unlawful discrimination, e.g. the reason for refusing vaccination could be a characteristic protected under the Equality Act 2010.

In England, mandatory vaccination is in place for staff in care homes, and from April 2022, this will also apply to staff with patient contact in the National Health Service (NHS). Other parts of the UK have not yet introduced such rules.

USA: The Equal Employment Opportunity Commission (EEOC) published guidance stating that an employer may implement a vaccination policy as a condition of physically returning to the workplace. Before implementing a vaccination requirement, an employer should consider whether there are any relevant state laws or regulations that might affect the requirements for such a policy. If an employer asks an unvaccinated employee why he or she has not been vaccinated or does not want to be vaccinated, such questions may elicit information about a disability and would therefore fall under the standard for disability-related questions. Because immunization records are personally identifiable information about an employee, this information must be recorded, handled and stored as confidential medical information. If an employer administers the vaccine to its employees itself or contracts with a third party to do so, it must demonstrate that the screening questions are “job-related and consistent with business necessity.”

On November 5th, 2021, the U.S. Occupational Safety and Health Administration (OSHA) released an emergency temporary standard (ETS) requiring affected employers to take action on COVID-19 safety, including adopting a policy requiring full COVID-19 vaccination of employees or giving employees the choice between being vaccinated against COVID-19 and undergoing regular COVID-19 testing while wearing a face covering. On November 12th, 2021, a federal court of appeals suspended enforcement of the ETS pending a decision on a permanent injunction. While this suspension is in place, OSHA cannot take any steps to implement or enforce the ETS.

In the US, a number of different state and federal workplace safety, employment and privacy laws impose diverging requirements on the processing of COVID-19-related information.

China publishes Draft Measures on Security Assessment of Cross-border Data Transfer for public consultation

8. November 2021

On October 29th, 2021, the Cyberspace Administration of China (CAC) announced a public consultation on its “Draft Measures on Security Assessment of Cross-border Data Transfer”. This is the CAC’s third legislative attempt to build a cross-border data transfer mechanism in China, and it came only days before the effective date of the Personal Information Protection Law (PIPL) on November 1st, 2021.

The CAC said its proposed data transfer assessment aims to comply with China’s PIPL and Data Security Law, while specifically focusing on efforts to “regulate data export activities, protect the rights and interests of personal information, safeguard national security and social public interests, and promote the safe and free flow of data across borders”. If they were to be made final, the Draft Measures would apply to cross-border transfers of personal information and “important data” collected and generated in China under certain circumstances.

Data controllers, or data handlers according to the PIPL, would be subject to mandatory security assessments by the CAC in the following circumstances:

  • transfer of personal information and important data collected and generated by critical information infrastructure operators as defined under China’s Cybersecurity Law;
  • transfer of important data;
  • transfer of personal information by data handlers who process over 1 million individuals’ personal information;
  • cumulatively transferring personal information of more than 100,000 individuals or “sensitive” personal information of more than 10,000 individuals; or
  • other conditions to be specified by the CAC.

According to the Draft Measures, data handlers that require a mandatory security assessment would need to submit certain materials in connection with it, which include an application form, the data handler’s self-security assessment, and the relevant data transfer agreement.
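
For readers who want to scan the trigger logic quickly, the following minimal Python sketch encodes the thresholds listed above. The class and field names are invented for illustration and do not appear in the Draft Measures, and the open-ended catch-all condition (“other conditions to be specified by the CAC”) is not modeled.

```python
# Minimal sketch (hypothetical names) of the listed triggers for a mandatory
# CAC security assessment under the Draft Measures. The catch-all "other
# conditions to be specified by the CAC" is intentionally not modeled.
from dataclasses import dataclass

@dataclass
class TransferProfile:
    is_ciio: bool                           # critical information infrastructure operator
    transfers_important_data: bool          # "important data" is part of the transfer
    individuals_processed: int              # individuals whose personal information the handler processes
    individuals_transferred: int            # cumulative individuals whose data is transferred abroad
    sensitive_individuals_transferred: int  # cumulative individuals whose sensitive data is transferred abroad

def mandatory_assessment_required(p: TransferProfile) -> bool:
    """Return True if any of the listed thresholds is met."""
    return (
        p.is_ciio
        or p.transfers_important_data
        or p.individuals_processed > 1_000_000
        or p.individuals_transferred > 100_000
        or p.sensitive_individuals_transferred > 10_000
    )

# Example: a handler processing data of 1.2 million individuals triggers the assessment.
print(mandatory_assessment_required(TransferProfile(False, False, 1_200_000, 50_000, 0)))  # True
```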

Upon receiving the data handler’s application, the CAC would confirm whether it will accept the application within seven business days. The CAC would then have 45 business days to complete the assessment after issuing the notice of acceptance. This period could be extended in complex cases or where the CAC requires supplementary documents; however, according to the Draft Measures, the overall timeline should not exceed 60 business days.
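
As a rough illustration of these review windows, the short Python sketch below counts business days as weekdays only; it deliberately ignores Chinese public holidays and is not an official deadline calculator, just a reading aid under those simplifying assumptions.

```python
# Rough sketch of the Draft Measures' review timeline, counting business days
# as Monday-Friday only (Chinese public holidays are ignored for simplicity).
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    current, added = start, 0
    while added < days:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            added += 1
    return current

application_filed = date(2021, 12, 1)
acceptance_deadline = add_business_days(application_filed, 7)     # CAC confirms acceptance
assessment_deadline = add_business_days(acceptance_deadline, 45)  # standard review window
extended_deadline = add_business_days(acceptance_deadline, 60)    # upper bound in complex cases

print(acceptance_deadline, assessment_deadline, extended_deadline)
```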

In evaluating a data handler’s mandatory security assessment, the CAC would aim to focus on:

  • the legality, propriety and necessity of the cross-border transfer;
  • the data protection laws and regulations of the data recipient’s jurisdiction, the security of the data being transferred, and whether the protections provided by the data recipient satisfy Chinese laws and regulations and mandatory national standards;
  • the volume, scope, type and sensitivity of the data being transferred and the risk of a leak, damage, corruption, loss and misuse;
  • whether the data transfer agreement adequately allocates responsibilities for data protection;
  • compliance with Chinese laws, administrative regulations and departmental regulations; and
  • other matters that are deemed necessary by the CAC.

The result of the CAC’s mandatory security assessment would be valid for two years, after which a new assessment would be necessary. Under certain circumstances a re-evaluation would have to take place earlier, e.g. where the purpose, means, scope or type of the cross-border transfer or of the data recipient’s processing of personal information and/or important data changes, where the retention period for the personal information and/or important data is extended, or where other circumstances arise that might affect the security of the transferred data.

The public consultation period extends until November 28th, 2021, after which the CAC will review the public comments and recommendations.

China intensifies data protection of companies

15. July 2021

The state leadership in Beijing is tightening its data protection rules. The Chinese ride-hailing provider Didi has now become the subject of far-reaching data protection regulatory measures. Other companies could soon be affected as well.

For months now, Chinese regulators and ministries have been issuing a slew of new regulations that not only affect tech companies, but are also directed at how companies handle data in general.

A prime example of China’s “new” data protection policy can be seen in Didi’s stock market debut on the New York Stock Exchange. The Uber rival had been public for only a few days when the Chinese authorities ordered its app to be removed from app stores before the end of the week. The reason is reported to have been serious data protection violations, which are now being investigated: the company is said to have collected and used personal data in a manner hostile to privacy.

Didi was ordered to comply with legal requirements and adhere to national standards, and to ensure that the security of its users’ personal data is effectively protected.

The announcement sent shares of the stock market newcomer down by more than 5% on Friday. The news also caused tech stocks to fall on Asian exchanges.

Didi is the nearly undisputed leader among ride-hailing services in China, with 493 million active users and a presence in 14 countries.

Beijing’s new data protection policy

The actions of the Chinese authorities against tech companies suggest that the Chinese leadership is rethinking its approach to data protection.

First of all, there is much to suggest that the state leadership wants to bring companies more firmly under control. This is also intended to prevent third countries from obtaining data from Chinese companies and to keep Chinese companies from establishing themselves abroad.

According to reports, a document from the State Council in Beijing indicates that stricter controls are planned for Chinese companies that are traded on stock exchanges abroad. Capital raised by emerging Chinese companies on foreign stock markets, such as in New York or Hong Kong, will also be subject to more stringent requirements. Especially in the area of “data security, cross-border data flow and management of confidential information”, new standards are to be expected.

However, the aim also seems to be to better protect the data of Chinese citizens from unauthorized access by criminals and from excessive data collection by tech groups and other companies.
This is supported by the fact that the Chinese leadership has introduced several rules in recent years and months that are intended to improve data protection. Although the state does not intend to cede any of its own rights here, citizens are to be given more rights, at least vis-à-vis companies.

The introduction of the European General Data Protection Regulation also forced Chinese technology companies to meet global data protection standards in order to expand abroad.

China’s data protection policy thus appears contradictory: it is a step towards more protection for data subjects and, at the same time, another step towards more state control.

China passes new data security law

15. June 2021

China’s “National People’s Congress”, the Chinese legislative body, approved the new “Data Security Law 2021” on June 10th, 2021 (unofficial English translation here). The new law gives President Xi Jinping the power to shut down or fine tech companies. The law will go into effect on September 1st, 2021.

The law applies to data processing activities, and the security supervision of such activities, within China’s territory. Data processing activities outside China’s territory that threaten China’s national security and public interests are also covered by the law. For international companies, the law means they must localize data in China. For example, data generated in factories in China must be kept in China and is subject to cyber data oversight.

Companies that leak sensitive data abroad or are found “mishandling core state data” can be forced to cease operations, have their licenses revoked or be fined up to 1.6 million US$, and companies that provide electronic information to foreign law enforcement authorities can be fined up to approx. 150,000 US$ or be forced to suspend their business.

While the Chinese government is increasing its financial involvement in tech companies, it is also producing new legislation to tighten its grip on them. The new data law is expected to provide a broad outline for future rules for Internet services and to make it easier to track valuable data in the interest of national security. This may include directives that certain types of data must be stored and handled locally, as well as requirements for companies to track and report the information they hold.

A personal information protection law is still under review in China.

China issues new draft Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime and appears to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection framework will consist of the Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation addresses issues raised by new technologies and applications in around 70 articles. The fines it provides for non-compliance are substantial and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, the obligations of data processors, the authority in charge of personal information protection, and legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalents under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated entity or appoint a representative in China and to report the relevant information about that entity or representative to the Chinese regulators.

Compared to the GDPR, the draft PIPL extends the term “sensitive data” to also cover nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may damage personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information. Such transfers are possible if the transfer is certified by recognized institutions or if the data processor concludes a cross-border transfer agreement with the recipient located outside of China, ensuring that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator, or the volume of data it processes exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.
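
To illustrate how these transfer routes relate to one another under the draft, here is a short, hypothetical Python sketch; the function and parameter names are invented for illustration, and the volume threshold is simply treated as “whatever the CAC stipulates”.

```python
# Hypothetical sketch of the cross-border transfer routes under the draft PIPL.
# Names are illustrative only and do not come from the draft text.
def permissible_transfer_mechanisms(is_ciio: bool, exceeds_cac_volume: bool) -> list[str]:
    """Return the mechanisms a data processor could rely on for a cross-border transfer."""
    if is_ciio or exceeds_cac_volume:
        # CIIOs and high-volume processors must pass a CAC security assessment.
        return ["security assessment conducted by the CAC"]
    # Otherwise, certification or a transfer agreement with the overseas recipient may be used.
    return [
        "certification by a recognized institution",
        "cross-border transfer agreement with the overseas recipient",
    ]

# Example: a non-CIIO processor below the CAC volume threshold.
print(permissible_transfer_mechanisms(is_ciio=False, exceeds_cac_volume=False))
```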

It should further be kept in mind that the draft PIPL expands the range of penalties beyond those provided in the Cybersecurity Law, which will significantly increase the liability exposure of controllers operating in China.

The public comment period on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet clear when the legislation will come into full effect.

German Robert-Koch-Institute discusses mobile phone tracking to slow down the spread of the Coronavirus

9. March 2020

According to a news report by the German newspaper “Der Tagesspiegel”, a small group of scientists at the Robert-Koch-Institute (RKI) and other institutions are currently discussing the evaluation and matching of movement data from mobile phones to detect people infected with the Coronavirus (COVID-19).

The scientists, who are trying to slow down the spread of the disease, point to the problem that questioning infected people about whom they have been in contact with is time-consuming and imprecise. Evaluating and matching mobile phone data may be more accurate and could speed up the process of identifying infected people, which could be essential for saving lives.

In a comment, the German Federal Commissioner for Data Protection, Ulrich Kelber, noted that this procedure could raise significant data protection issues, especially with regard to the legal basis for the processing and its proportionality under the GDPR.

German Officials warn Travellers to China of Espionage

17. January 2020

The German Federal Office for the Protection of the Constitution (BfV) sees a significant risk for the security of personal data when accessing local WiFi networks and the mobile network in China. An inquiry by the German newspaper “Handelsblatt” revealed that the BfV warns travellers to China of an increasing risk of espionage.

For the stay in China, the BfV discourages travellers from using laptops and smartphones that contain personal data, especially contact information. Instead, the BfV recommends acquiring a dedicated travel laptop and a prepaid mobile phone that can be reset or even disposed of after leaving China.

According to Handelsblatt, the warning stems from cases in which the Chinese border police conducted mobile phone checks at the border of the Xinjiang region and installed a surveillance app on tourists’ smartphones.

In 2016, the BfV had already warned of potential espionage by Chinese secret services targeting students and researchers.

NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study that evaluated face recognition algorithms submitted by industry and academic developers for their ability to perform various tasks. The study evaluated 189 software algorithms submitted by 99 developers and focused on how well each algorithm performs one of two different tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database. This is used, for example, when unlocking a smartphone or checking a passport. The second task is “one-to-many” matching, i.e. determining whether the person in the photo matches any photo in a database. This is used to identify a person of interest.
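
To make the difference between the two tasks concrete, here is a simplified Python sketch of verification (one-to-one) versus identification (one-to-many) using cosine similarity over face embeddings. It illustrates the general technique only; it is not NIST’s or any vendor’s code, and the embedding size and decision threshold are arbitrary placeholders.

```python
# Simplified illustration of "one-to-one" vs. "one-to-many" face matching.
# Embeddings would normally come from a trained face recognition model;
# here random vectors stand in for them.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one: does the probe match this specific enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray], threshold: float = 0.6):
    """One-to-many: return the best-matching identity in the gallery, if any."""
    best_id, best_score = None, -1.0
    for identity, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = {name: rng.normal(size=128) for name in ["alice", "bob"]}
    probe = gallery["alice"] + 0.05 * rng.normal(size=128)
    print(verify(probe, gallery["alice"]))  # one-to-one check against a single template
    print(identify(probe, gallery))         # one-to-many search across the gallery
```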

A special focus of this study was that it also looked at the performance of the individual algorithms taking demographic factors into account. For one-to-one matching, only a few previous studies examined demographic effects; for one-to-many matching, there were none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, Department of Homeland Security and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata information that indicated the age, gender, and either race or country of birth of the person.

The study found that the results ultimately depend on the algorithm at the heart of the system, the application that uses it and the data it is fed with, but that the majority of face recognition algorithms exhibit demographic differences. In one-to-one matching, many algorithms falsely matched photos of two different people more often when the persons were Asian or African-American than when they were white. In algorithms developed in the United States, the same error also occurred more frequently for Native Americans. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve accurate face recognition results by using a sufficiently wide range of training data.

China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. China thus enacted the first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions provide a higher standard of consent than the Cybersecurity Law of China. To obtain the consent of a guardian, a network operator has to provide the possibility of refusal and expressly inform the guardian of the following:

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain the consent of the person responsible for the protection of children’s personal data or of an authorised administrator.

If children’s personal data are processed by a third-party processor, the network operator is obliged to carry out a security assessment of the commissioned data processor and to conclude an entrustment agreement with it. The data processor is obliged to support the network operator in fulfilling a guardian’s request to delete a child’s data after termination of the service. Sub-delegating or subcontracting the processing by the data processor is prohibited.

If personal data of children is transferred to a third party, the network operator shall carry out a security assessment of the commissioned person or commission a third party to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children’s personal data under certain circumstances. In any case, they have the right to demand the correction of personal data of children if they are collected, stored, used or disclosed by a network operator. In addition, the legal guardians have the right to withdraw their consent in its entirety.

In the event of actual or potential data breaches, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the violation has or may have serious consequences, the network operator must immediately report the violation to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is challenging to send the notification to any data subject, the network operator shall take appropriate and effective measures to make the notification public. However, the rules do not contain a precise definition of the serious consequences.

In the event that the data breach is caused or observed by a data processor, the data processor is obliged to inform the network operator in good time.
