Category: end-to-end encryption

Microsoft Teams now offers end-to-end encryption for one-to-one calls

16. December 2021

On December 14th, 2021, John Gruszczyk, a technical product manager at Microsoft (MS), announced that end-to-end encryption (E2EE) is now generally available for MS Teams calls between two users. MS launched a public preview of E2EE for calls back in October, after announcing the option earlier in 2021.

IT administrators now have the option to enable and manage the feature for their organization once the update is implemented. However, even then, E2EE will not be enabled by default for users. Once IT administrators have configured MS Teams to allow E2EE, users will still need to enable it themselves in their Teams settings. E2EE encrypts audio, video and screen sharing.

Certain features will not be available when E2EE is turned on. These include recording of a call, live captions and transcription, transferring a call to another device, adding participants, parking calls, call transfer, and merging calls. If any of these features are required for a call, E2EE must be turned off for that call.

Currently, MS Teams encrypts data, including chat content, in transit and at rest by default, and allows authorized services to decrypt content. MS also uses SharePoint encryption to secure files at rest and OneNote encryption for notes stored in MS Teams. E2EE is particularly suitable for one-on-one calls in situations requiring increased confidentiality.

MS also published an in-depth explanation of how this option can be turned on.

With this step, MS is following the example of Zoom, which launched E2EE in October and is making it available for larger group sessions (up to 200 participants).

EU commission working on allowing automated searches of the content of private and encrypted communications

25. November 2021

The EU Commission is working on a legislative package to combat child abuse, which will also regulate the exchange of child pornography on the internet. The scope of these regulations is expected to include automated searches of private, encrypted communications via messaging apps.

When questioned, Olivier Onidi, Deputy Director General of the Directorate-General Migration and Home Affairs at the European Commission, said the proposal aims to “cover all forms of communication, including private communication”.

The EU Commissioner of Home Affairs, Ylva Johansson, declared the fight against child sexual abuse to be her top priority. The current Slovenian EU Council Presidency has also declared the fight against child abuse to be one of its main priorities and intends to focus on the “digital dimension”.

In May 2021, the EU Commission, the Council and the European Parliament reached a provisional agreement on an exemption to the ePrivacy Directive that would allow web-based email and messaging services to detect, remove, and report child sexual abuse material. Previously, the European Electronic Communications Code (EECC) had extended the legal protection of the ePrivacy Directive to private communications related to electronic messaging services. Unlike the General Data Protection Regulation, the ePrivacy Directive does not contain a legal basis for the voluntary processing of content or traffic data for the purpose of detecting child sexual abuse. For this reason, such an exception was necessary.

Critics see this form of preventive mass surveillance as a threat to privacy, IT security, freedom of expression and democracy. A critic of the agreement states:

This unprecedented deal means all of our private e-mails and messages will be subjected to privatized real-time mass surveillance using error-prone incrimination machines inflicting devastating collateral damage on users, children and victims alike.

However, the new legislative initiative goes even further. Instead of allowing providers of such services to search for such content on a voluntary basis, all providers would be required to search the services they offer for such content.

How exactly such a law would be implemented from a technical perspective will probably not be clear from the text of the law and is likely to be left up to the providers.
One possibility would be for software to compute the hash of an attachment before it is sent and compare it with a database of hashes that have already been identified as illegal. A hash is a kind of digital fingerprint of a file. Such software is offered by Microsoft, for example, and such a database is operated by the National Center for Missing & Exploited Children in the United States.
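The hash-comparison approach can be sketched in a few lines (a minimal illustration using a cryptographic hash; real systems such as Microsoft's PhotoDNA use perceptual hashes that also match slightly altered images, and the database contents here are, of course, hypothetical):

```python
import hashlib

# Hypothetical database of hashes already identified as illegal
# (in practice maintained by organizations such as NCMEC).
# The sample entry below is simply the SHA-256 of an empty file.
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(data: bytes) -> str:
    """Compute the SHA-256 'digital fingerprint' of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(attachment: bytes) -> bool:
    """Check an attachment against the hash database before it is sent."""
    return file_hash(attachment) in known_hashes

print(is_flagged(b""))               # matches the sample entry above
print(is_flagged(b"holiday photo"))  # an unknown file does not match
```

Note that a cryptographic hash only matches bit-identical files; changing a single byte of an image produces a completely different hash, which is why deployed systems rely on perceptual hashing instead.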
Another possibility would be the monitoring technology known as client-side scanning (CSS). This involves scanning messages on the user’s device before they are encrypted. However, this technology has been heavily criticized by numerous IT security researchers and encryption software manufacturers in a joint study. They describe CSS as a threat to privacy, IT security, freedom of expression and democracy, among other things because the technology creates security loopholes and thus opens up gateways for state actors and hackers.

The consequence of this law would be a significant intrusion into the privacy of all EU citizens, as every message would be checked automatically and without suspicion. The introduction of such a law would also have massive consequences for the providers of encrypted messaging services, as they would have to change their software fundamentally and introduce corresponding control mechanisms, but without jeopardizing the security of users, e.g., from criminal hackers.

There is another danger that must be considered: The introduction of such legally mandated automated control of systems for one area of application can always lead to a lowering of the inhibition threshold to use such systems for other purposes as well. This is because the same powers that are introduced in the name of combating child abuse could, of course, also be introduced for investigations in other areas.

It remains to be seen when the relevant legislation will be introduced and when and how it will be implemented. Originally, the bill was scheduled to be presented on December 1st, 2021, but this item has since been removed from the Commission’s calendar.

US court unsuccessfully demanded extensive information about user of the messenger app Signal

16. November 2021

On October 27th, 2021, Signal published a search warrant for user data issued by a court in Santa Clara, California. The court ordered Signal to provide a variety of information, including a user’s name, address, correspondence, contacts, groups, and call records from the years 2019 and 2020. Signal was only able to provide two sets of data: the timestamp of when the account was created and the date of the last connection to the Signal server, as Signal does not store any other information about its users.

The warrant also included a confidentiality order that was extended four times. Signal stated:

Though the judge approved four consecutive non-disclosure orders, the court never acknowledged receipt of our motion to partially unseal, nor scheduled a hearing, and would not return counsel’s phone calls seeking to schedule a hearing.

A similar case was made public by Signal in 2016, when a court in Virginia requested the release of user data and ordered that the request not be made public. Signal fought the non-publication order in court and eventually won.

Signal is a messenger app that is highly regarded among privacy experts like Edward Snowden. That’s because Signal has used end-to-end encryption by default from the start, doesn’t ask its users for personal information or store personal data on its servers, and is open source. The messenger is therefore considered particularly secure and trustworthy. Moreover, no security vulnerabilities have become known so far, unlike with numerous competing products.

Since 2018, Signal has been operated by the non-profit organization Signal Technology Foundation and the Signal Messenger LLC. At that time, WhatsApp co-founder Brian Acton, among others, joined the company and invested $50 million. Signal founder Moxie Marlinspike is also still on board.

The EU commission is planning a legislative package to fight the spread of child abuse on the Internet. The law will also include automated searches of the content of private and encrypted communications, for example via messenger apps. This would undermine the core functions of Signal in Europe. Critics call this form of preventive mass surveillance a threat to privacy, IT security, freedom of expression and democracy.

Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published “Recommendations for a way forward on the topic of encryption” on December 1st, in which it considers it important to carefully balance the protection of fundamental rights with ensuring law enforcement’s investigative powers.

The EU Council sees a dilemma between the need for strong encryption in order to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime on the other. They further note:

“We acknowledge this dilemma and are determined to find ways that will not compromise
either one, upholding the principle of security through encryption and security despite
encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is supposed to be the joint implementation of standardized high-quality training programs for law enforcement officers, tailored to keep pace with an increasingly skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety“, is proclaimed as a further intention.

Next, the technology industry, civil society and the academic world are acknowledged as important partners with whom EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and what measures exactly the EU Council intends to implement. This list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal. Instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more of a statement of intent than implementable recommendations on encryption.

The Controversy around the Council of the European Union’s Declaration on End-to-End Encryption

27. November 2020

In the course of November 2020, the Council of the European Union issued several draft versions of a joint declaration with the working title “Security through encryption and security despite encryption”. The drafts were initially intended only for internal purposes, but leaked and were first published by the Austrian broadcasting network “Österreichischer Rundfunk” (“ORF”) in an article by journalist Erich Möchel. Since then, the matter has sparked widespread public interest and media attention.

The controversy around the declaration arose when the ORF commentator Möchel presented further information from unknown sources that “competent authorities” shall be given “exceptional access” to the end-to-end encryption of communications. This would mean that communications service providers like WhatsApp, Signal etc. would be obliged to allow a backdoor and create a general key to encrypted communications, which they would deposit with public authorities. Comparing the version of the declaration from 6 November 2020 with the previous version from 21 October 2020, he highlighted that the previous version states that additional practical powers shall be given to “law enforcement and judicial authorities”, whereas in the more recent version, the powers shall be given to “competent authorities in the area of security and criminal justice”. He adds that the new, broader wording would include European intelligence agencies as well and allow them to undermine end-to-end encryption. Furthermore, he also indicated that plans to restrict end-to-end encryption in Western countries are not new, but were originally proposed by the “Five Eyes” intelligence alliance of the United States, Canada, the United Kingdom, Australia and New Zealand.

As a result of the ORF article, the supposed plans to restrict or ban end-to-end encryption have been widely criticised by politicians, journalists, and NGOs, who state that any backdoors to end-to-end encryption would render any secure encryption impossible.

However, while it can be verified that the “Five Eyes” propose the creation of general keys to access end-to-end encrypted communications, similar plans for the EU cannot be clearly deduced from the EU Council’s declaration at hand. The declaration itself recognises end-to-end encryption as highly beneficial to protect governments, critical infrastructures, civil society, citizens and industry by ensuring privacy, confidentiality and data integrity of communications and personal data. Moreover, it mentions that EU data protection authorities have identified it as an important tool in light of the Schrems II decision of the CJEU. At the same time, the Council’s declaration illustrates that end-to-end encryption poses large challenges for criminal investigations when gathering evidence in cases of cyber crime, making it at times “practically impossible”. Lastly, the Council calls for an open, unbiased and active discussion with the tech industry, research and academia in order to achieve a better balance between “security through encryption and security despite encryption”.

Möchel’s sources for EU plans to ban end-to-end encryption through general keys remain unknown and unverifiable. Despite general concerns for overarching surveillance powers of governments, the public can only approach the controversy around the EU Council’s declaration with due objectivity and remain observant on whether or how the EU will regulate end-to-end encryption and find the right balance between the privacy rights of European citizens and the public security and criminal justice interests of governments.

Swiss Data Protection Commissioner: “Swiss-U.S. Privacy Shield not providing adequate level of Data Protection”

28. September 2020

Following the recent ruling by the Court of Justice of the European Union (“CJEU”), the Swiss Data Protection Commissioner (“EDÖB”) published a statement concerning the level of Data Protection of Data Transfers under the Swiss-U.S. Privacy Shield. The “Schrems II” decision by the CJEU is not legally binding in Switzerland because Switzerland is neither an EU nor an EEA country. But as the EDÖB and the Joint European Data Protection Authorities work closely together, the decision already has implications for Swiss data exporters.

In accordance with Swiss Data Protection law (Art. 7 VDSG), the Swiss Data Protection Commissioner maintains a publicly accessible list of countries assessing the level of Data Protection guaranteed by these countries. This list shall serve Swiss data exporters as a guidance for their data exporting activities and acts as a rebuttable presumption. EU and EEA countries have continuously been listed in the first column of the list because they are regarded to provide an adequate level of Data Protection. The U.S. has been listed in the second column as a country providing “adequate protection under certain conditions”, which meant a certification of U.S. data importers under the Swiss-U.S. Privacy Shield.

Subsequent to the CJEU ruling, the EDÖB decided to list the U.S. in the third column as a country providing “inadequate protection”, thereby also acting on his past annual reviews of the Swiss-U.S. Privacy Shield. In his reviews, the EDÖB already criticised that data subjects in Switzerland lack access to the courts in the U.S. on account of Data Protection violations and that the Ombudsman-mechanism is ineffective in this regard.

Lastly, the EDÖB pointed out that the Swiss-U.S. Privacy Shield remains in effect since there has not been a decision by Swiss courts comparable to the CJEU decision and that his assessment has the status of a recommendation. However, the EDÖB advises Swiss data exporters to always make a risk assessment when transferring Personal Data to countries with “inadequate protection” and possibly to apply technical measures (e.g. BYOK encryption) in order to protect the data from access by foreign intelligence services.

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles which are stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regards to data retention mandates, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server where matches are made with other contacts. Under the decentralised version, the data is kept on the mobile device of the user, giving users more control over their data. The EDPB does not give a recommendation for either approach. Instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals’ rights.
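The decentralised variant can be sketched in a few lines (a toy model; the class and method names are illustrative, and real protocols such as DP-3T or the Google/Apple Exposure Notification framework derive rotating identifiers from daily keys rather than generating each one independently):

```python
import secrets

class Phone:
    """Toy sketch of the decentralised approach: ephemeral IDs are
    broadcast over Bluetooth and exposure matching happens on the device."""

    def __init__(self):
        self.own_ids = []       # ephemeral IDs this phone has broadcast
        self.heard_ids = set()  # IDs received from nearby phones

    def broadcast_id(self) -> bytes:
        eid = secrets.token_bytes(16)  # rotating random identifier
        self.own_ids.append(eid)
        return eid

    def receive(self, eid: bytes) -> None:
        self.heard_ids.add(eid)

    def check_exposure(self, published_ids) -> bool:
        # The server only publishes the IDs of confirmed-positive users;
        # the comparison runs locally, so contact data never leaves the phone.
        return any(eid in self.heard_ids for eid in published_ids)

alice, bob, carol = Phone(), Phone(), Phone()
bob.receive(alice.broadcast_id())   # Alice and Bob were in proximity
# Alice tests positive and uploads only her own broadcast IDs:
published = alice.own_ids
print(bob.check_exposure(published))    # True
print(carol.check_exposure(published))  # False
```

The privacy argument for this design is visible in the code: the server learns nothing about who met whom, because the set of heard identifiers never leaves the device.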

Before implementing contact tracing apps, the EDPB also suggests that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, they strongly recommend the publication of DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. As it outshone many of its competitors, Zoom labels itself as “the leader in modern enterprise video communications”. However, the company has been facing a lot of public criticism because of its weaknesses in data security and lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it was wrongly advertising to provide full end-to-end encryption for all shared contents like video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online Tech Magazine reported that Zoom leaked thousands of their users’ email addresses and photos to strangers.
  • Video-conferences which users did not protect with a password, enabled “Zoombombing”, a phenomenon in which strangers hijacked videocalls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data of its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with their Facebook account. Zoom has since released an update that ends this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. This plan includes freezing the introduction of new features, enlarging its cybersecurity team, and engaging outside help from security advisors.

UK government to meet tech giants after Westminster attack

28. March 2017

In consequence of the Westminster Bridge attack in London, Home Secretary Amber Rudd announced that she wants to meet several tech giants in order to make sure law enforcement is able to access encrypted data for terrorism investigation.

The topic came up as the attacker reportedly used the messaging application WhatsApp shortly before his attack began. As WhatsApp uses end-to-end encryption, neither law enforcement nor WhatsApp itself can read messages. The same applies to Apple’s iMessage. While Rudd did not want to make public which tech companies she would meet in detail, Google confirmed that it will be meeting the UK government.

“We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other,” Rudd said. Labour leader Jeremy Corbyn, however, stated that law enforcement already had enough powers and that there needed to be a balance between the right to know and the right to privacy.

In the meantime, Microsoft confirmed that it had provided email information relating to the Westminster Bridge attack to the British authorities after it had received lawful orders.

Use of encryption App increases after US election

6. December 2016

BuzzFeed News reported that, after the election of Donald Trump, the app Signal saw a 400 percent rise in daily downloads.

Signal is a secure communications tool and therefore well known in technology, journalism and politics. Using the app, people are able to text and speak with one another with end-to-end encryption, so that only the sender and the intended recipient can read or hear the respective message.

Signal’s founder, Moxie Marlinspike, released a statement saying that “There has never been a single event that has resulted in this kind of sustained, day-over-day increase.” Marlinspike explained that “Trump is about to be put in control of the most pervasive, largest, and least accountable surveillance infrastructure in the world (…) People are maybe a bit uncomfortable with him.”

 
