Category: end-to-end encryption

Swiss Data Protection Commissioner: “Swiss-U.S. Privacy Shield not providing adequate level of Data Protection”

28. September 2020

Following the recent ruling by the Court of Justice of the European Union (“CJEU”), the Swiss Data Protection Commissioner (“EDÖB”) published a statement concerning the level of Data Protection of Data Transfers under the Swiss-U.S. Privacy Shield. The “Schrems II” decision by the CJEU is not legally binding in Switzerland because Switzerland is neither an EU nor an EEA country. But as the EDÖB and the joint European data protection authorities work closely together, the decision already has implications for Swiss data exporters.

In accordance with Swiss Data Protection law (Art. 7 VDSG), the Swiss Data Protection Commissioner maintains a publicly accessible list of countries that assesses the level of Data Protection these countries guarantee. This list serves Swiss data exporters as guidance for their data exporting activities and acts as a rebuttable presumption. EU and EEA countries have continuously been listed in the first column of the list because they are regarded as providing an adequate level of Data Protection. The U.S. has been listed in the second column as a country providing “adequate protection under certain conditions”, the condition being a certification of U.S. data importers under the Swiss-U.S. Privacy Shield.

Subsequent to the CJEU ruling, the EDÖB decided to list the U.S. in the third column as a country providing “inadequate protection”, thereby also following up on his past annual reviews of the Swiss-U.S. Privacy Shield. In those reviews, the EDÖB had already criticised that data subjects in Switzerland lack access to U.S. courts in cases of Data Protection violations and that the Ombudsman mechanism is ineffective in this regard.

Lastly, the EDÖB pointed out that the Swiss-U.S. Privacy Shield remains in effect, since there has not been a decision by Swiss courts comparable to the CJEU decision, and that his assessment has the status of a recommendation. However, the EDÖB advises Swiss data exporters to always carry out a risk assessment when transferring Personal Data to countries with “inadequate protection” and, where necessary, to apply technical measures (e.g. BYOK encryption) in order to protect the data from access by foreign intelligence services.
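The point of such technical measures is that the data is encrypted on the exporter's side before transfer, with the key remaining under the exporter's sole control. A minimal sketch of this bring-your-own-key idea, using a toy one-time pad purely for illustration (a real deployment would use an authenticated cipher such as AES-GCM from a vetted library; all function names here are hypothetical):

```python
import secrets

def byok_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt locally with a freshly generated key (toy one-time pad).
    Illustration only -- production systems would use AES-GCM or similar,
    with the key managed and retained by the data exporter."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def byok_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"employee@example.ch"
ciphertext, key = byok_encrypt(record)  # key never leaves the exporter
assert byok_decrypt(ciphertext, key) == record
```

Because the importer only ever receives ciphertext, a foreign authority compelling the importer to disclose the data gains nothing without the key held in Switzerland.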

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regards to data retention mandates, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server, where matches with other contacts are made. Under the decentralised version, the data is kept on the user’s mobile device, giving users more control over their data. The EDPB does not recommend either approach. Instead, national authorities may consider both concepts and carefully weigh up their respective effects on privacy and the possible impacts on individuals’ rights.
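The decentralised matching described above can be sketched as follows. This is a simplified illustration loosely modelled on DP-3T-style designs, not any specific national app; the token derivation and all names are hypothetical:

```python
import hashlib
import secrets

def daily_tokens(seed: bytes, n: int = 96) -> list[bytes]:
    """Derive rolling ephemeral IDs from a secret daily seed via a hash
    chain; only these short-lived tokens are broadcast over Bluetooth."""
    tokens, h = [], seed
    for _ in range(n):
        h = hashlib.sha256(h).digest()
        tokens.append(h[:16])  # truncated ID, not linkable to identity
    return tokens

# Phone A broadcasts its tokens; Phone B logs the tokens it hears nearby.
seed_a = secrets.token_bytes(32)
heard_by_b = set(daily_tokens(seed_a)[10:20])  # encounters stored locally

# If A tests positive, only A's seed is published. B re-derives A's tokens
# on-device and checks for overlap: no location data, no central matching.
published = daily_tokens(seed_a)
exposed = any(t in heard_by_b for t in published)
```

Note how this design satisfies several of the principles at once: proximity rather than location data, minimal data collection, and storage on the user's terminal equipment.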

Before implementing contact tracing apps, the EDPB also suggests that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, it strongly recommends publishing DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. As it outshone many of its competitors, Zoom labels itself as “the leader in modern enterprise video communications”. However, the company has been facing a lot of public criticism because of its weaknesses in data security and lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it had falsely advertised that it provided full end-to-end encryption for all shared content such as video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online Tech Magazine reported that Zoom leaked thousands of their users’ email addresses and photos to strangers.
  • Video-conferences that users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data of its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with their Facebook account. Zoom released an update that terminated this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. This plan includes freezing the introduction of new features, enlarging the cybersecurity team and engaging outside security advisors.

UK government to meet tech giants after Westminster attack

28. March 2017

In consequence of the Westminster Bridge attack in London, Home Secretary Amber Rudd announced that she wants to meet several tech giants in order to make sure law enforcement is able to access encrypted data for terrorism investigation.

The topic came up as the attacker reportedly used the messaging application WhatsApp shortly before his attack began. As WhatsApp uses end-to-end encryption, neither law enforcement nor WhatsApp itself can read messages. The same applies to Apple’s iMessage. While Rudd did not want to make public which tech companies she will meet in detail, Google confirmed that it will be meeting the UK government.
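The reason neither the provider nor law enforcement can read such messages is that end-to-end encryption lets the two endpoints agree on a key the service in the middle never learns. A toy Diffie-Hellman sketch of that key agreement (real messengers such as WhatsApp and Signal use X25519 and the Double Ratchet, not this simplified construction, and the prime here is far too small for real use):

```python
import hashlib
import secrets

# Toy Diffie-Hellman: both ends derive the same shared secret from public
# values, so the relaying server only ever sees ciphertext it cannot read.
P = 2**127 - 1  # a Mersenne prime; illustration only, not a safe DH group
G = 3

a = secrets.randbelow(P - 2) + 2   # Alice's private key (never transmitted)
b = secrets.randbelow(P - 2) + 2   # Bob's private key (never transmitted)
A, B = pow(G, a, P), pow(G, b, P)  # public values exchanged via the server

shared_alice = pow(B, a, P)        # Alice combines Bob's public value
shared_bob = pow(A, b, P)          # Bob combines Alice's public value
assert shared_alice == shared_bob  # identical key, known only to the ends
message_key = hashlib.sha256(str(shared_alice).encode()).digest()
```

Since neither private key ever leaves its device, the provider cannot reconstruct `message_key` even under a lawful order, which is precisely the property at issue in the debate.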

“We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other,” Rudd said. Labour leader Jeremy Corbyn, however, stated that law enforcement already had enough powers and that there needed to be a balance between the right to know and the right to privacy.

In the meantime, Microsoft confirmed that it had provided email information relating to the Westminster Bridge attack to the British authorities after it had received lawful orders.

Use of encryption App increases after US election

6. December 2016

BuzzFeed News reported that, after the election of Donald Trump, the app Signal saw a 400 percent rise in daily downloads.

The app is a secure communications tool and is therefore well known in technology, journalism and political circles. It lets people text and speak with one another using end-to-end encryption, so that only the sender and the intended recipient can read or hear a given message.

Signal’s founder, Moxie Marlinspike, released a statement saying that “There has never been a single event that has resulted in this kind of sustained, day-over-day increase.” Marlinspike explained that “Trump is about to be put in control of the most pervasive, largest, and least accountable surveillance infrastructure in the world (…) People are maybe a bit uncomfortable with him.”


“If you think instant messaging services are private, you are in for a big surprise …

24. October 2016

… The reality is that our communications are under constant threat from cybercriminals and spying by state authorities. Young people, the most prolific sharers of personal details and photos over apps like Snapchat, are especially at risk,” concluded Sherif Elsayed-Ali, the head of Amnesty International’s Technology and Human Rights Team, after ranking 11 of the most popular messaging apps in a Message Privacy Ranking.

In this ranking, both Snapchat and Skype received some of the lowest scores. Snapchat only got 26 out of 100 on the organization’s scale, whereas Skype received 40 out of 100. This is because neither app uses end-to-end encryption, although doing so is strongly recommended, according to Amnesty.

The report explains that “The apps were marked on their use of encryption and privacy safeguards, as well as how well they advised their users of the app’s security, and whether they released details of government requests for user data.” Furthermore, Sherif Elsayed-Ali stated that “It is up to tech firms to respond to well-known threats to their users’ privacy and freedom of expression, yet many companies are falling at the first hurdle by failing to provide an adequate level of encryption”.

It is worth noting that although they are among the world’s leading messaging applications, Skype and Snapchat are among the least secure on the market, according to Amnesty.

Newest Google instant messaging app criticized due to lack of end-to-end encryption by default

24. May 2016

Allo, the new instant messaging app from Google, was presented this week and is expected to be available to users this summer. Like many other technology companies, such as WhatsApp, Facebook or Apple, Google has decided to implement end-to-end encryption in this app. End-to-end encryption ensures privacy in certain messaging and video call apps so that not even the authorities have access to the information exchanged.

However, unlike WhatsApp, Facebook Messenger or iMessage, end-to-end encryption in Allo has to be activated by the user by selecting “incognito” mode, which has drawn strong criticism. As Google explained, end-to-end encryption is not activated by default so that the app can connect with the functionalities of Google Assistant, which provides tailored recommendations to its users based on the data stored in Google apps. This means that queries to Google’s own servers may be necessary. With “incognito” mode active, Google Assistant’s features may not be usable.

Morey Haber, Vice President of Technology at the cybersecurity company BeyondTrust, acknowledges that end-to-end encryption can in principle be combined with the artificial intelligence feature, but admits that in that case queries to Google Assistant cannot be fully processed.

Google engineer Thai Duong has written on his personal blog about the security and privacy features of the app.