Category: General
16. December 2021
On December 14th, 2021, John Gruszczyk, a technical product manager at Microsoft (MS), announced that end-to-end encryption (E2EE) is now generally available for MS Teams calls between two users. MS had launched a public preview of E2EE for calls in October, after announcing the option earlier in 2021.
IT administrators now have the option to enable and manage the feature for their organization once the update is implemented. However, even then, E2EE will not be enabled by default for users. Once IT administrators have configured MS Teams to allow E2EE, users will still need to enable it themselves in their Teams settings. E2EE encrypts audio, video and screen sharing.
Certain features will not be available when E2EE is turned on. These include call recording, live captions and transcription, transferring a call to another device, adding participants, call parking, call transfer, and merging calls. If any of these features are required for a call, E2EE must be turned off for that call.
Currently, MS Teams encrypts data, including chat content, in transit and at rest by default, and allows authorized services to decrypt content. MS also uses SharePoint encryption to secure files at rest and OneNote encryption for notes stored in MS Teams. E2EE is particularly suitable for one-on-one calls in situations requiring increased confidentiality.
MS also published an in-depth explanation of how this option can be turned on.
With this step, MS is following the example of Zoom, which launched E2EE in October 2020 and makes it available even for larger group sessions (up to 200 participants).
On December 2nd, EU Advocate General Richard de la Tour published an opinion in which he stated that EU member states may allow consumer protection associations to bring representative actions against infringements of rights that data subjects derive directly from the General Data Protection Regulation (“GDPR”). In doing so, he agrees with the legal opinion of the Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband e.V. (Federation of German Consumer Organisations, “vzbv”), which has filed an action for an injunction against Facebook in German courts over non-transparent use of data.
The vzbv’s lawsuit specifically concerns third-party games that Facebook offers in its “App Center”. In order to play games like Scrabble within Facebook, users must consent to the use of their data. However, Facebook had not provided information about the use of the data in a precise, transparent and comprehensible manner, as required by Article 13 GDPR. The Federal Court of Justice in Germany (“Bundesgerichtshof”) already came to this conclusion in May 2020, but considered it unclear whether associations such as the vzbv have the legal authority to bring data protection violations to court. In support of its doubts, the Bundesgerichtshof argued, inter alia, that it can be inferred from the fact that the GDPR grants supervisory authorities extended supervisory and investigatory powers, as well as the power to adopt remedial measures, that it is primarily the task of those authorities to monitor the application of the Regulation’s provisions. The Bundesgerichtshof therefore asked the Court of Justice of the European Union (“CJEU”) to interpret the GDPR. The Advocate General now affirms the admissibility of such an action by an association, at least if the EU member state in question permits it. The action for an injunction brought by the vzbv against Facebook’s headquarters in Ireland is therefore deemed admissible by the EU Advocate General.
The Advocate General states that
the defence of the collective interests of consumers by associations is particularly suited to the objective of the General Data Protection Regulation of establishing a high level of personal data protection.
The Advocate General’s Opinion is not legally binding on the CJEU. The role of the Advocate General is to propose, in complete independence, a legal solution to the cases before the CJEU. The judges of the Court will now begin their deliberations in this case.
France’s data protection authority, the Commission nationale de l’informatique et des libertés (CNIL), has published guidance on the use of alternatives to third-party cookies.
The guidance aims to highlight that there are other ways to track users online than through third-party cookies, and that it is important to apply data protection principles to new technologies with tracking ability.
In the guidance, the CNIL gives an overview of what cookies are and of the difference between first-party and third-party cookies, as well as the role both play in personalized advertisement targeting.
It also highlights consent management and collection as playing a key role in ensuring a data-protection-compliant online tracking culture for new tracking methods and technologies. Further, the guidance emphasizes that consent is not the only important requirement: online tracking and targeting methods should also ensure that users keep control of their data and that all data subject rights can be exercised and are facilitated.
In light of this, the CNIL has also published a guide for developers outlining how to implement third-party cookies and other tracers in a data-protection-compliant way, in order to raise awareness among those involved in the implementation process of how to stay compliant.
However, the CNIL also issued about 60 cookie compliance notices and 30 new orders to organizations for not offering users a data protection compliant ability to refuse cookies.
The CNIL has stepped up efforts to tackle cookie management and consent in order to ensure that the rights and freedoms of data subjects in relation to their personal data online are kept safe. It has made clear that cookies are its main focus for the upcoming year, and that it will continue to hold companies liable for insufficient data protection implementations.
30. November 2021
On November 25th, Apple announced in a press release that it has filed a lawsuit against NSO Group Technologies Ltd. (NSO Group) to hold them accountable for their spy software “Pegasus”.
NSO Group is a technology company that supplies surveillance software to governments and government agencies. Applications like Pegasus exploit vulnerabilities in software to infect targets’ devices with Trojans. Pegasus is spyware that can be secretly installed on cell phones (and other devices) running most iOS and Android versions. It is not a single exploit, but a series of exploits targeting many vulnerabilities in the system. Some of the exploits used by Pegasus are zero-click, which means they can be executed without any interaction from the victim. Pegasus is reported to be able to read text messages, track calls, collect passwords, track location, access the microphone and camera of the targeted device, extract contacts, photos, web browsing history and settings, and collect information from apps.
NSO Group is accused of selling its software to authoritarian governments, which use it to monitor journalists and members of the opposition; the company regularly denies these accusations. According to an investigation by a global consortium of journalists from 17 media organizations, Pegasus has been used to monitor female journalists, human rights activists, lawyers and high-ranking politicians. There are even reports suggesting it is used by Mexican drug cartels to target and intimidate Mexican journalists. Among the more famous confirmed Pegasus victims are Amazon founder Jeff Bezos and murdered Saudi Arabian journalist Jamal Khashoggi.
Apple wants to prevent “further abuse and harm” to Apple users. The lawsuit also demands unspecified compensation for spying on users.
In the press release Apple states:
NSO Group and its clients devote the immense resources and capabilities of nation-states to conduct highly targeted cyberattacks, allowing them to access the microphone, camera, and other sensitive data on Apple and Android devices. To deliver FORCEDENTRY to Apple devices, attackers created Apple IDs to send malicious data to a victim’s device — allowing NSO Group or its clients to deliver and install Pegasus spyware without a victim’s knowledge. Though misused to deliver FORCEDENTRY, Apple servers were not hacked or compromised in the attacks.
Ivan Krstić, head of Apple Security Engineering and Architecture, is quoted:
In a free society, it is unacceptable to weaponize powerful state-sponsored spyware against those who seek to make the world a better place
Apple has announced that the lawsuit contains new information about the so-called ForcedEntry exploit for a now-closed vulnerability that NSO Group used to “break into a victim’s Apple device and install the latest version of NSO Group’s Pegasus spyware program,” according to Apple’s press release. The vulnerability was originally discovered by Citizen Lab, a research group at the University of Toronto. Apple says it will support organizations like Citizen Lab and Amnesty Tech in their work, and will donate $10 million and any compensation from the lawsuit to organizations involved in researching and protecting against cyber surveillance. The company will also support Citizen Lab with free technology and technical assistance.
Apple is the second major company to sue NSO Group, after WhatsApp Inc. and its parent company Meta Platforms, Inc. (then Facebook, Inc.) filed a complaint against NSO Group in 2019. The allegation in that lawsuit is that NSO Group unlawfully exploited WhatsApp’s systems to monitor users.
In early November 2021, the US Department of Commerce placed NSO Group on its “Entity List”. The justification for this step states that Pegasus was used to monitor government officials, journalists, business people, activists, academics and embassy staff. On the “Entity List,” the U.S. government lists companies, individuals or governments whose activities are contrary to the national security or foreign policy interests of the United States. Trade with these companies is subject to strict restrictions and in some cases is only possible with an exemption from the Department.
25. November 2021
The EU Commission is working on a legislative package to combat child abuse, which will also regulate the exchange of child pornography on the internet. The scope of these regulations is expected to include automated searches for private encrypted communications via messaging apps.
When questioned, Olivier Onidi, Deputy Director General of the Directorate-General Migration and Home Affairs at the European Commission, said the proposal aims to “cover all forms of communication, including private communication”.
The EU Commissioner of Home Affairs, Ylva Johansson, declared the fight against child sexual abuse to be her top priority. The current Slovenian EU Council Presidency has also declared the fight against child abuse to be one of its main priorities and intends to focus on the “digital dimension”.
In May 2021, the EU Commission, the Council and the European Parliament reached a provisional agreement on an exemption to the ePrivacy Directive that would allow web-based email and messaging services to detect, remove, and report child sexual abuse material. Previously, the European Electronic Communications Code (EECC) had extended the legal protection of the ePrivacy Directive to private communications related to electronic messaging services. Unlike the General Data Protection Regulation, the ePrivacy Directive does not contain a legal basis for the voluntary processing of content or traffic data for the purpose of detecting child sexual abuse. For this reason, such an exception was necessary.
Critics see this form of preventive mass surveillance as a threat to privacy, IT security, freedom of expression and democracy. One critic of the agreement states:
This unprecedented deal means all of our private e-mails and messages will be subjected to privatized real-time mass surveillance using error-prone incrimination machines inflicting devastating collateral damage on users, children and victims alike.
However, the new legislative initiative goes even further. Instead of allowing providers of such services to search for such content on a voluntary basis, all providers would be required to search the services they offer for such content.
How exactly such a law would be implemented from a technical perspective will probably not be clear from the text of the law and is likely to be left up to the providers.
One possibility would be for software to compute the hash of an attachment before it is sent and compare it with a database of hashes previously identified as illegal. A hash is a kind of digital fingerprint of a file. Such software is offered by Microsoft, for example, and such a database is operated by the National Center for Missing & Exploited Children in the United States.
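As an illustration, such a hash check might look like the following Python sketch. The blocklist contents and function names are assumptions for this example, and plain SHA-256 is used only for simplicity; deployed systems typically rely on perceptual hashing so that re-encoded or slightly altered images still match.

```python
import hashlib

# Hypothetical blocklist of hex digests previously identified as illegal.
# A real system would query a database maintained by an organization such
# as the National Center for Missing & Exploited Children, not a local set.
KNOWN_BAD_HASHES = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_fingerprint(data: bytes) -> str:
    """Compute a digest that serves as the file's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(attachment: bytes) -> bool:
    """Check an attachment against the blocklist before it is sent."""
    return file_fingerprint(attachment) in KNOWN_BAD_HASHES

print(is_flagged(b"test"))   # True: its digest is in the blocklist
print(is_flagged(b"hello"))  # False: its digest is not in the blocklist
```

Note that with a cryptographic hash like SHA-256, changing a single byte of the file produces a completely different digest, which is exactly why production matching systems prefer perceptual hashes.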
Another possibility would be the monitoring technology known as client-side scanning (CSS). This involves scanning messages on the user’s device before they are encrypted. However, this technology has been heavily criticized by numerous IT security researchers and encryption software manufacturers in a joint study. They describe CSS as a threat to privacy, IT security, freedom of expression and democracy, among other reasons because the technology creates security loopholes and thus opens up gateways for state actors and hackers.
The consequence of this law would be a significant intrusion into the privacy of all EU citizens, as every message would be checked automatically and without suspicion. The introduction of such a law would also have massive consequences for the providers of encrypted messaging services, as they would have to change their software fundamentally and introduce corresponding control mechanisms, but without jeopardizing the security of users, e.g., from criminal hackers.
There is another danger that must be considered: The introduction of such legally mandated automated control of systems for one area of application can always lead to a lowering of the inhibition threshold to use such systems for other purposes as well. This is because the same powers that are introduced in the name of combating child abuse could, of course, also be introduced for investigations in other areas.
It remains to be seen when the relevant legislation will be introduced and when and how it will be implemented. Originally, the bill was scheduled to be presented on December 1st, 2021, but this item has since been removed from the Commission’s calendar.
On November 19th, 2021, the European Data Protection Board (EDPB) published a new set of draft Guidelines 05/2021 on the interplay between the territorial scope of the EU General Data Protection Regulation (GDPR) and the GDPR’s provisions on international data transfers.
The EDPB stated in their press release that “by clarifying the interplay between the territorial scope of the GDPR (Art. 3) and the provisions on international transfers in Chapter V, the Guidelines aim to assist controllers and processors in the EU in identifying whether a processing operation constitutes an international transfer, and to provide a common understanding of the concept of international transfers.”
The Guidelines set forth three cumulative criteria to consider in determining whether a processing activity qualifies as an international data transfer under the GDPR, namely:
- the exporting controller or processor is subject to the GDPR for the given processing activity,
- the exporting controller or processor transmits or makes available the personal data to the data importer (e.g., another controller, joint controller, or a processor), and
- the data importer is in a third country (or is an international organization), irrespective of whether the data importer or its processing activities are subject to the GDPR.
If all three requirements are met, the processing activity is considered an international data transfer under the GDPR, which means the requirements of Chapter V of the GDPR apply.
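Because the three criteria are cumulative, the test amounts to a simple conjunction. The following Python sketch is purely illustrative; its class and field names are assumptions for this example, not legal terminology:

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    # Criterion 1: the exporter is subject to the GDPR for this activity
    exporter_subject_to_gdpr: bool
    # Criterion 2: personal data is transmitted or made available to the importer
    data_made_available_to_importer: bool
    # Criterion 3: the importer is in a third country (or is an international organization)
    importer_in_third_country: bool

def is_international_transfer(activity: ProcessingActivity) -> bool:
    """All three criteria must be met cumulatively; if so, Chapter V applies."""
    return (activity.exporter_subject_to_gdpr
            and activity.data_made_available_to_importer
            and activity.importer_in_third_country)

# A GDPR-covered exporter making data available to an importer in a third country:
print(is_international_transfer(ProcessingActivity(True, True, True)))   # True
# No international transfer if the importer is not in a third country:
print(is_international_transfer(ProcessingActivity(True, True, False)))  # False
```

The third criterion deliberately ignores whether the importer itself is subject to the GDPR, mirroring the Guidelines’ position that this does not change the qualification of the transfer.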
The Guidelines further clarify that the safeguards implemented to accommodate the international data transfer must be tailored to the specific transfer at issue. In an example, the EDPB indicates that the transfer of personal data to a controller in a third country that is itself subject to the GDPR will generally require fewer safeguards. In such a case, the transfer tool should focus on the elements and principles that are specific to the importing jurisdiction. These include, in particular, conflicting national laws, government access requests in the receiving third country, and the difficulty for data subjects to obtain redress against an entity in the receiving third country.
The EDPB offers its support in developing a transfer tool that would cover the above-mentioned situation.
The Guidelines are open for public consultation until January 31st, 2022.
16. November 2021
In its October Infringements Package, the European Commission has stated it is pursuing legal actions against Belgium over concerns its Data Protection Authority (DPA) is not operating independently, as it should under the General Data Protection Regulation (GDPR).
The Commission stated that it “considers that Belgium violates Article 52 of the GDPR, which states that the data protection supervisory authority shall perform its tasks and exercise its powers independently. The independence of data protection authorities requires that their members are free from any external influence or incompatible occupation.”
According to the European Commission, however, some members of the Belgian DPA cannot be regarded as free from external influence, as they either report to a management committee that depends on the Belgian government, have taken part in governmental projects on COVID-19 contact tracing, or are members of the Information Security Committee.
On June 9th, 2021, the Commission sent a letter of formal notice to Belgium, giving the member state two months to take corrective measures. Belgium’s response to the Commission’s letter did not address the issues raised and the members concerned have so far remained in their posts. The European Commission is now giving Belgium two months to take relevant action. If this fails, the Commission may decide to refer the case to the Court of Justice of the European Union.
15. November 2021
On November 10th, 2021, the UK Supreme Court issued a long-awaited judgment in the Lloyd v Google case, declining to allow the class-action lawsuit against Google over alleged illegal tracking of millions of iPhone users in 2011 and 2012 to proceed further. The 3 billion GBP lawsuit, which was filed on behalf of 4.4 million residents of England and Wales, has implications for other class-action lawsuits filed in the UK.
The case was originally filed by Richard Lloyd on behalf of the group “Google You Owe Us.” The group accused Google of bypassing Apple iPhone security by collecting personal information of users on the phone’s Safari web browser between August 2011 and February 2012. A U.K. court dismissed the case in October 2018, but it was later overturned by the UK Court of Appeal.
In a final decision in the case dating from last week, the Supreme Court ruled in favor of Google, deciding that the representative claim against Google under the Data Protection Act 1998 (DPA) should not be allowed to proceed. In reaching its decision, the Supreme Court considered the following points:
- the statutory scheme of the DPA does not permit recovery of compensation for the mere “loss of control” of personal data and
- the representative claim by Lloyd on behalf of the 4.4 million affected individuals should not be allowed to proceed, as Lloyd was unable to demonstrate that each of those individuals who he represented in the claim had suffered a violation of their rights under the DPA and material damage because of that violation.
“The claimant seeks damages,” Judge George Leggatt stated in the decision, “for each individual member of the represented class without attempting to show that any wrongful use was made by Google of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach.” Judge Leggatt also said, “Without proof of these matters, a claim for damages cannot succeed.”
The decision will be welcomed by controllers, as it limits the prospects of representative claims of the nature of that advanced by Lloyd and further provides reassurance that mere technical breaches of the UK GDPR that do not result in material damage to data subjects do not represent sufficient grounds for compensation.
8. November 2021
On October 29th, 2021, the Cyberspace Administration of China (CAC) announced a public consultation on its “Draft Measures on Security Assessment of Cross-border Data Transfer”. This is the CAC’s third legislative attempt to build a cross-border data transfer mechanism in China, and it came only days before the effective date of the Personal Information Protection Law (PIPL) on November 1st, 2021.
The CAC said its proposed data transfer assessment aims to comply with China’s PIPL and Data Security Law, while specifically focusing on efforts to “regulate data export activities, protect the rights and interests of personal information, safeguard national security and social public interests, and promote the safe and free flow of data across borders”. If they were to be made final, the Draft Measures would apply to cross-border transfers of personal information and “important data” collected and generated in China under certain circumstances.
Data controllers, or data handlers according to the PIPL, would be subject to mandatory security assessments by the CAC in the following circumstances:
- transfer of personal information and important data collected and generated by critical information infrastructure operators as defined under China’s Cybersecurity Law;
- transfer of important data;
- transfer of personal information by data handlers who process over 1 million individuals’ personal information;
- cumulatively transferring personal information of more than 100,000 individuals or “sensitive” personal information of more than 10,000 individuals; or
- other conditions to be specified by the CAC.
According to the Draft Measures, data handlers that require a mandatory security assessment would need to submit certain materials in connection with it, which include an application form, the data handler’s self-security assessment, and the relevant data transfer agreement.
Upon receiving the data handler’s application, the CAC would confirm within seven business days whether it will accept the application. The CAC would then have 45 business days to complete the assessment after issuing the notice of acceptance. This period could be extended in complex cases or where the CAC requires supplementary documents; however, according to the Draft Measures, the timeline should not exceed 60 business days.
In evaluating a data handler’s mandatory security assessment, the CAC would aim to focus on:
- the legality, propriety and necessity of the cross-border transfer;
- the data protection laws and regulations of the data recipient’s jurisdiction, the security of the data being transferred, and whether the protections provided by the data recipient satisfy Chinese laws and regulations and mandatory national standards;
- the volume, scope, type and sensitivity of the data being transferred and the risk of a leak, damage, corruption, loss and misuse;
- whether the data transfer agreement adequately allocates responsibilities for data protection;
- compliance with Chinese laws, administrative regulations and departmental regulations; and
- other matters that are deemed necessary by the CAC.
The CAC’s mandatory security assessment result would be valid for two years, after which a new assessment would be necessary. Under certain circumstances, a re-evaluation would have to take place earlier, e.g. in the case of changes to the purpose, means, scope or type of the cross-border transfer or of the data recipient’s processing of personal information and/or important data, an extension of the retention period for the personal information and/or important data, or other circumstances that might affect the security of the transferred data.
The public consultation period extends until November 28th, 2021, after which the CAC will review the public comments and recommendations.
27. October 2021
As COVID-19 vaccination campaigns are well under way, employers are faced with the question of whether they are legally permitted to ask employees about their COVID-19 related information (vaccinated, recovered) and, if so, how that information may be used.
COVID-19 related information, such as vaccination status, whether an employee has recovered from an infection or whether an employee is currently infected with COVID-19, is considered health data. This type of data is treated as particularly sensitive in most data protection regimes and may only be processed under strict conditions. Art. 9 (1) General Data Protection Regulation (GDPR) (EU), Art. 9 (1) UK GDPR (UK), Art. 5 (II) General Personal Data Protection Law (LGPD) (Brazil) and Section 1798.140(b) of the California Consumer Privacy Act of 2018 (CCPA) (California) all treat health-related information as sensitive personal data.
The following discusses whether employers in various non-EEA countries are permitted to process COVID-19-related information about their employees.
Brazil: According to the Labor Code (CLT), employers in Brazil have the right to require their employees to be vaccinated. This is because the employer is responsible for the health and safety of its employees in the workplace and therefore has the right to take reasonable measures to ensure health and safety in the workplace. Since employers can require their employees to be vaccinated, they can also require proof of vaccination. Because LGPD considers this information to be sensitive personal data, special care must be taken in processing it.
Hong Kong: An employer may require its employees to disclose their immunization status. Under the Occupational Safety and Health Ordinance (OSHO), employers are required to take all reasonably practicable steps to ensure the safety and health of all their employees in the workplace. The vaccine may be considered as part of COVID-19 risk assessments as a possible additional measure to mitigate the risks associated with contracting the virus in the workplace. The requirement for vaccination must be lawful and reasonable. Employers may decide, following such a risk assessment, that a vaccinated workforce is necessary and appropriate to mitigate risk. If the employer does so, it must comply with the Personal Data Privacy Ordinance (PDPO). Among other things, the PDPO requires that the collection of data must be necessary for the purpose for which it is collected and that the data must not be kept longer than is necessary for that purpose. Under the PDPO, before collecting data, the employer must inform the employee whether the collection is mandatory or voluntary for the employee and, if mandatory, what the consequences are for the employee if he or she does not provide the data.
UK: An employer may inquire about an employee’s vaccination status or conduct tests on employees if it is proportionate and necessary for the employer to comply with its legal obligation to ensure health and safety at work. The employer must be able to demonstrate that the processing of this information is necessary for compliance with its health and safety obligations under employment law, Art. 9 (2) (b) UK GDPR. The employer must also conduct a data protection impact assessment to evaluate the necessity of the data collection and balance that necessity against the employee’s right to privacy. A policy for the collection of such data and its retention is also required. The information must be retained only as long as it is needed. There must also be no risk of unlawful discrimination; e.g., an employee’s reason for refusing vaccination could be protected under the Equality Act 2010.
USA: The Equal Employment Opportunity Commission (EEOC) published a document in which it suggests that an employer may implement a vaccination policy as a condition of physically returning to the workplace. Before implementing a vaccination requirement, an employer should consider whether there are any relevant state laws or regulations that might change anything about the requirements for such a provision. If an employer asks an unvaccinated employee questions about why he or she has not been vaccinated or does not want to be vaccinated, such questions may elicit information about a disability and therefore would fall under the standard for disability-related questions. Because immunization records are personally identifiable information about an employee, the information must be recorded, handled, and stored as confidential medical information. If an employer self-administers the vaccine to its employees or contracts a third party to do so, the employer must demonstrate that the screening questions are “job-related and consistent with business necessity.”