Tag: End-to-end encryption

Microsoft Teams now offers end-to-end encryption for one-to-one calls

16. December 2021

On December 14th, 2021, John Gruszczyk, a technical product manager at Microsoft (MS), announced that end-to-end encryption (E2EE) is now generally available for MS Teams calls between two users. MS launched a public preview of E2EE for calls back in October, after announcing the option earlier in 2021.

IT administrators now have the option to enable and manage the feature for their organization once the update is implemented. However, even then, E2EE will not be enabled by default for users. Once IT administrators have configured MS Teams to allow E2EE, users will still need to enable it themselves in their Teams settings. E2EE encrypts audio, video, and screen sharing.

Certain features will not be available when E2EE is turned on. These include recording of a call, live caption and transcription, transferring a call to another device, adding participants, parking calls, call transfer, and merging calls. If any of these features are required for a call, E2EE must be turned off for that call.

Currently, MS Teams encrypts data, including chat content, in transit and at rest by default, and allows authorized services to decrypt content. MS also uses SharePoint encryption to secure files at rest and OneNote encryption for notes stored in MS Teams. E2EE is particularly suitable for one-on-one calls in situations requiring increased confidentiality.

MS also published an in-depth explanation of how this option can be turned on.

With this step, MS is following the example of Zoom, which launched E2EE in October 2020 and makes it available even for larger group sessions (up to 200 participants).

EU commission working on allowing automated searches of the content of private and encrypted communications

25. November 2021

The EU Commission is working on a legislative package to combat child abuse, which will also regulate the exchange of child pornography on the internet. The scope of these regulations is expected to include automated searches of the content of private and encrypted communications via messaging apps.

When questioned, Olivier Onidi, Deputy Director General of the Directorate-General Migration and Home Affairs at the European Commission, said the proposal aims to “cover all forms of communication, including private communication”.

The EU Commissioner of Home Affairs, Ylva Johansson, declared the fight against child sexual abuse to be her top priority. The current Slovenian EU Council Presidency has also declared the fight against child abuse to be one of its main priorities and intends to focus on the “digital dimension”.

In May 2021, the EU Commission, the Council and the European Parliament reached a provisional agreement on an exemption to the ePrivacy Directive that would allow web-based email and messaging services to detect, remove, and report child sexual abuse material. Previously, the European Electronic Communications Code (EECC) had extended the legal protection of the ePrivacy Directive to private communications related to electronic messaging services. Unlike the General Data Protection Regulation, the ePrivacy Directive does not contain a legal basis for the voluntary processing of content or traffic data for the purpose of detecting child sexual abuse. For this reason, such an exception was necessary.

Critics see this form of preventive mass surveillance as a threat to privacy, IT security, freedom of expression and democracy. One critic of the agreement stated:

This unprecedented deal means all of our private e-mails and messages will be subjected to privatized real-time mass surveillance using error-prone incrimination machines inflicting devastating collateral damage on users, children and victims alike.

However, the new legislative initiative goes even further. Instead of allowing providers of such services to search for such content on a voluntary basis, all providers would be required to search the services they offer for such content.

How exactly such a law would be implemented from a technical perspective will probably not be clear from the text of the law and is likely to be left up to the providers.
One possibility would be for software to compute the hash of an attachment before it is sent and compare it with a database of hashes that have already been identified as illegal. A hash is a kind of digital fingerprint of a file. Such software is offered by Microsoft, for example, and such a database is operated by the National Center for Missing & Exploited Children in the United States.
Another possibility would be the monitoring technology known as client-side scanning (CSS). This involves scanning messages on the user’s device before they are encrypted. However, this technology has been heavily criticized by numerous IT security researchers and encryption software manufacturers in a joint study. They describe CSS as a threat to privacy, IT security, freedom of expression and democracy, among other things because the technology creates security loopholes and thus opens up gateways for state actors and hackers.
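The hash-lookup approach described above can be illustrated with a minimal sketch. Note the simplifying assumptions: the hash set and file contents here are hypothetical, and a plain cryptographic hash (SHA-256) is used for illustration, whereas real systems such as Microsoft’s PhotoDNA use perceptual hashes that also match re-encoded or slightly altered images.

```python
import hashlib

# Hypothetical database of hashes of files already identified as illegal.
# (This example entry is simply the SHA-256 of the bytes b"foo".)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(data: bytes) -> str:
    """Compute the 'digital fingerprint' of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(attachment: bytes) -> bool:
    """Check an attachment against the hash database before it is sent."""
    return file_hash(attachment) in KNOWN_HASHES

print(is_flagged(b"foo"))         # True: hash is in the database
print(is_flagged(b"other file"))  # False: unknown file
```

A byte-identical copy of a known file is caught immediately, but changing a single byte produces an entirely different SHA-256, which is why production systems rely on perceptual hashing instead.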

The consequence of this law would be a significant intrusion into the privacy of all EU citizens, as every message would be checked automatically and without suspicion. The introduction of such a law would also have massive consequences for the providers of encrypted messaging services, as they would have to change their software fundamentally and introduce corresponding control mechanisms, but without jeopardizing the security of users, e.g., from criminal hackers.

There is another danger to consider: introducing legally mandated automated scanning of systems for one area of application can lower the threshold for using such systems for other purposes as well. The same powers that are introduced in the name of combating child abuse could, of course, also be introduced for investigations in other areas.

It remains to be seen when the relevant legislation will be introduced and when and how it will be implemented. Originally, the bill was scheduled to be presented on December 1st, 2021, but this item has since been removed from the Commission’s calendar.

US court unsuccessfully demanded extensive information about a user of the messenger app Signal

16. November 2021

On October 27th, 2021, Signal published a search warrant for user data issued by a court in Santa Clara, California. The court ordered Signal to provide a variety of information, including a user’s name, address, correspondence, contacts, groups, and call records from the years 2019 and 2020. Signal was only able to provide two sets of data: the timestamp of when the account was created and the date of the last connection to the Signal server, as Signal does not store any other information about its users.

The warrant also included a confidentiality order that was extended four times. Signal stated:

Though the judge approved four consecutive non-disclosure orders, the court never acknowledged receipt of our motion to partially unseal, nor scheduled a hearing, and would not return counsel’s phone calls seeking to schedule a hearing.

A similar case was made public by Signal in 2016, when a court in Virginia requested the release of user data and ordered that the request not be made public. Signal fought the non-publication order in court and eventually won.

Signal is a messenger app that is highly regarded among privacy experts like Edward Snowden. That’s because Signal has used end-to-end encryption by default from the start, doesn’t ask its users for personal information or store personal data on its servers, and is open source. The messenger is therefore considered particularly secure and trustworthy. Moreover, no security vulnerabilities have become known so far, which cannot be said of numerous competing products.

Since 2018, Signal has been operated by the non-profit Signal Technology Foundation and Signal Messenger LLC. At that time, WhatsApp co-founder Brian Acton, among others, joined the company and invested $50 million. Signal founder Moxie Marlinspike is also still on board.

The EU commission is planning a legislative package to fight the spread of child abuse on the Internet. The law will also include automated searches of the content of private and encrypted communications, for example via messenger apps. This would undermine the core functions of Signal in Europe. Critics call this form of preventive mass surveillance a threat to privacy, IT security, freedom of expression and democracy.

Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published “Recommendations for a way forward on the topic of encryption” on December 1st, in which it stresses the importance of carefully balancing the protection of fundamental rights with ensuring the investigative powers of law enforcement.

The EU Council sees a dilemma between the need for strong encryption to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime on the other. They further note:

“We acknowledge this dilemma and are determined to find ways that will not compromise either one, upholding the principle of security through encryption and security despite encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is supposed to be the joint implementation of standardized high-quality training programs for law enforcement officers that are tailored to the skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety“, is proclaimed as a further intention.

Next, the technology industry, civil society and academia are acknowledged as important partners with whom EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and exactly what measures the EU Council intends to implement. This list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal. Instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more of a statement of intent than implementable recommendations on encryption.

The Controversy around the Council of the European Union’s Declaration on End-to-End Encryption

27. November 2020

In the course of November 2020, the Council of the European Union issued several draft versions of a joint declaration with the working title “Security through encryption and security despite encryption”. The drafts were initially intended only for internal purposes, but were leaked and first published by the Austrian broadcasting network “Österreichischer Rundfunk” (“ORF”) in an article by journalist Erich Möchel. Since then, the matter has sparked widespread public interest and media attention.

The controversy around the declaration arose when the ORF commentator Möchel presented further information from unknown sources that “competent authorities” shall be given “exceptional access” to the end-to-end encryption of communications. This would mean that communications service providers like WhatsApp, Signal etc. would be obliged to allow a backdoor and create a general key to encrypted communications which they would deposit with public authorities. Comparing the version of the declaration from 6 November 2020 with the previous version from 21 October 2020, he highlighted that the previous version states that additional practical powers shall be given to “law enforcement and judicial authorities”, whereas in the more recent version, the powers shall be given to “competent authorities in the area of security and criminal justice”. He adds that the new, broader wording would include European intelligence agencies as well and allow them to undermine end-to-end encryption. Furthermore, he also indicated that plans to restrict end-to-end encryption in Western countries are not new, but were originally proposed by the “Five Eyes” intelligence alliance of the United States, Canada, the United Kingdom, Australia and New Zealand.

As a result of the ORF article, the supposed plans to restrict or ban end-to-end encryption have been widely criticised by politicians, journalists, and NGOs, who state that any backdoor to end-to-end encryption would render secure encryption impossible.

However, while it can be verified that the “Five Eyes” propose the creation of general keys to access end-to-end encrypted communications, similar plans for the EU cannot be clearly deduced from the EU Council’s declaration at hand. The declaration itself recognises end-to-end encryption as highly beneficial to protect governments, critical infrastructures, civil society, citizens and industry by ensuring privacy, confidentiality and data integrity of communications and personal data. Moreover, it mentions that EU data protection authorities have identified it as an important tool in light of the Schrems II decision of the CJEU. At the same time, the Council’s declaration illustrates that end-to-end encryption poses large challenges for criminal investigations when gathering evidence in cases of cyber crime, making it at times “practically impossible”. Lastly, the Council calls for an open, unbiased and active discussion with the tech industry, research and academia in order to achieve a better balance between “security through encryption and security despite encryption”.

Möchel’s sources for EU plans to ban end-to-end encryption through general keys remain unknown and unverifiable. Despite general concerns for overarching surveillance powers of governments, the public can only approach the controversy around the EU Council’s declaration with due objectivity and remain observant on whether or how the EU will regulate end-to-end encryption and find the right balance between the privacy rights of European citizens and the public security and criminal justice interests of governments.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has been facing a lot of public criticism because of its weaknesses in data security and lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it was wrongly advertising to provide full end-to-end encryption for all shared contents like video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online Tech Magazine reported that Zoom leaked thousands of their users’ email addresses and photos to strangers.
  • Video-conferences which users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked videocalls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data about its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom has since released an update that terminates this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom hosted a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away the meeting window for more than 30 seconds. In the meantime, Zoom disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. The plan includes freezing the introduction of new features, enlarging the cybersecurity team, and engaging outside security advisors.