Category: Encryption

Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published “Recommendations for a way forward on the topic of encryption” on December 1st, in which it stresses the importance of carefully balancing the protection of fundamental rights with ensuring law enforcement’s investigative powers.

The EU Council sees a dilemma between the need for strong encryption in order to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime groups on the other. It further notes:

“We acknowledge this dilemma and are determined to find ways that will not compromise
either one, upholding the principle of security through encryption and security despite
encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is supposed to be the joint implementation of standardized, high-quality training programs for law enforcement officers, tailored to an increasingly skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety“, is proclaimed as a further intention.

Next the technology industry, civil society and academic world are acknowledged as important partners with whom EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and what measures exactly the EU Council intends to implement. The list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal. Instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more of a statement of intent than implementable recommendations on encryption.

The Controversy around the Council of the European Union’s Declaration on End-to-End Encryption

27. November 2020

In the course of November 2020, the Council of the European Union issued several draft versions of a joint declaration with the working title “Security through encryption and security despite encryption”. The drafts were initially intended only for internal purposes, but were leaked and first published by the Austrian broadcasting network “Österreichischer Rundfunk” (“ORF”) in an article by journalist Erich Möchel. Since then, the matter has sparked widespread public interest and media attention.

The controversy around the declaration arose when the ORF commentator Möchel presented further information from unknown sources that “competent authorities” shall be given “exceptional access” to the end-to-end encryption of communications. This would mean that communications service providers like WhatsApp, Signal etc. would be obliged to allow a backdoor and create a general key to encrypted communications, which they would deposit with public authorities. Comparing the version of the declaration from 6 November 2020 with the previous version from 21 October 2020, he highlighted that the previous version states that additional practical powers shall be given to “law enforcement and judicial authorities”, whereas in the more recent version, the powers shall be given to “competent authorities in the area of security and criminal justice”. He adds that the new, broader wording would include European intelligence agencies as well and allow them to undermine end-to-end encryption. Furthermore, he also indicated that plans to restrict end-to-end encryption in Western countries are not new, but were originally proposed by the “Five Eyes” intelligence alliance of the United States, Canada, the United Kingdom, Australia and New Zealand.

As a result of the ORF article, the supposed plans to restrict or ban end-to-end encryption have been widely criticised by politicians, journalists, and NGOs, who state that any backdoor to end-to-end encryption would render secure encryption impossible.

However, while it can be verified that the “Five Eyes” propose the creation of general keys to access end-to-end encrypted communications, similar plans for the EU cannot be clearly deduced from the EU Council’s declaration at hand. The declaration itself recognises end-to-end encryption as highly beneficial to protect governments, critical infrastructures, civil society, citizens and industry by ensuring privacy, confidentiality and data integrity of communications and personal data. Moreover, it mentions that EU data protection authorities have identified it as an important tool in light of the Schrems II decision of the CJEU. At the same time, the Council’s declaration illustrates that end-to-end encryption poses large challenges for criminal investigations when gathering evidence in cases of cyber crime, making it at times “practically impossible”. Lastly, the Council calls for an open, unbiased and active discussion with the tech industry, research and academia in order to achieve a better balance between “security through encryption and security despite encryption”.

Möchel’s sources for EU plans to ban end-to-end encryption through general keys remain unknown and unverifiable. Despite general concerns about the overarching surveillance powers of governments, the public can only approach the controversy around the EU Council’s declaration with due objectivity and remain observant as to whether and how the EU will regulate end-to-end encryption and find the right balance between the privacy rights of European citizens and the public security and criminal justice interests of governments.

Microsoft reacts on EDPB’s data transfer recommendations

24. November 2020

Microsoft (“MS”) is among the first companies to react to the European Data Protection Board’s data transfer recommendations (please see our article), as the tech giant announced in a blog post on November 19th. MS calls these additional safeguards “Defending Your Data” and will immediately start implementing them in contracts with public sector and enterprise customers.

In light of the Schrems II ruling by the Court of Justice of the European Union (“CJEU”) of July 16th, the EDPB issued recommendations on November 17th on how to transfer data to non-EEA countries in accordance with the GDPR (please see our article). The recommendations lay out a six-step plan for assessing whether a data transfer meets GDPR standards. These steps include mapping all data transfers, assessing a third country’s legislation, assessing the tool used for transferring data, and adding supplementary measures to that tool. Among the latter is a list of technical, organizational, and contractual measures to be implemented to ensure the effectiveness of the tool.

Julie Brill, Corporate Vice President for Global Privacy and Regulatory Affairs and Chief Privacy Officer at Microsoft, issued the statement in which she declares MS to be the first company responding to the EDPB’s guidance. These safeguards include an obligation for MS to challenge all government requests for public sector or enterprise customer data where it has a lawful basis for doing so; to try to redirect data requests; and to promptly notify the customer, if legally allowed, about any data request by an authority concerning that customer. This was one of the main EDPB recommendations and was also included in a draft for new Standard Contractual Clauses published by the European Commission on November 12th. MS also announced that it will monetarily compensate customers whose personal data has to be disclosed in response to government requests. These changes supplement the SCCs MS has been using since Schrems II, which (as MS states) already include data encrypted to a high standard during transit and storage, as well as transparency regarding government access requests to data (the “U.S. National Security Orders Report”, dating back to 2011, and the “Law Enforcement Requests Report“).

Recently, European authorities have been criticizing MS, and especially its Microsoft 365 (“MS 365”) (formerly Office 365) tools, for not being GDPR compliant. In July 2019, the Ministry of Justice in the Netherlands issued a Data Protection Impact Assessment (DPIA) warning authorities not to use Office 365 ProPlus, Windows 10 Enterprise, as well as Office Online and Mobile, since they do not comply with GDPR standards. The European Data Protection Supervisor issued a warning in July 2020 stating that the use of MS 365 by EU authorities and contracts between EU institutions and MS do not comply with the GDPR. Also, the German Data Security Congress (“GDSC”) issued a statement in October in which it declared MS 365 not compliant with the GDPR. The GDSC is a board made up of the regional data security authorities of all 16 German states and the national data security authority. The declaration was reached by a narrow vote of 9 to 8. Some of the 8 dissenting regional authorities later even issued a press release explaining why they voted against the declaration. They criticized the lack of involvement and hearing of MS during the process, the GDSC’s reliance on MS’ Online Service Terms and Data Processing Addendum dating back to January 2020, and the declaration’s lack of differentiation.

Some of the German data protection authorities opposing the GDSC’s statement were quick to welcome the new developments in a joint press release. However, they stress that the main issues in data transfers from the EU to the U.S. remain unsolved. In particular, the CJEU’s main reservations regarding the mass monitoring of data streams by U.S. intelligence agencies (such as the NSA) are hard to prevent and make up for. Still, they announced that the GDSC would resume its talks with MS before the end of 2020.

This quick reaction to the EDPB recommendations should bring some ease into the discussion surrounding MS’ GDPR compliance. It will most likely help MS’ case, especially with the German authorities, and might even lead to a prompt resolution of a conflict regarding tools that are omnipresent at workplaces all over the globe.

Swiss Data Protection Commissioner: “Swiss-U.S. Privacy Shield not providing adequate level of Data Protection”

28. September 2020

Following the recent ruling by the Court of Justice of the European Union (“CJEU”), the Swiss Data Protection Commissioner (“EDÖB”) published a statement concerning the level of data protection afforded to data transfers under the Swiss-U.S. Privacy Shield. The “Schrems II” decision by the CJEU is not legally binding in Switzerland because Switzerland is neither an EU nor an EEA country. But as the EDÖB and the joint European data protection authorities work closely together, the decision already has implications for Swiss data exporters.

In accordance with Swiss Data Protection law (Art. 7 VDSG), the Swiss Data Protection Commissioner maintains a publicly accessible list of countries assessing the level of Data Protection guaranteed by these countries. This list shall serve Swiss data exporters as a guidance for their data exporting activities and acts as a rebuttable presumption. EU and EEA countries have continuously been listed in the first column of the list because they are regarded to provide an adequate level of Data Protection. The U.S. has been listed in the second column as a country providing “adequate protection under certain conditions”, which meant a certification of U.S. data importers under the Swiss-U.S. Privacy Shield.

Subsequent to the CJEU ruling, the EDÖB decided to list the U.S. in the third column as a country providing “inadequate protection”, thereby also acting on his past annual reviews of the Swiss-U.S. Privacy Shield. In his reviews, the EDÖB already criticised that data subjects in Switzerland lack access to the courts in the U.S. on account of Data Protection violations and that the Ombudsman-mechanism is ineffective in this regard.

Lastly, the EDÖB pointed out that the Swiss-U.S. Privacy Shield remains in effect since there has not been a decision by Swiss courts comparable to the CJEU decision and that his assessment has the status of a recommendation. However, the EDÖB advises Swiss data exporters to always make a risk assessment when transferring Personal Data to countries with “inadequate protection” and possibly to apply technical measures (e.g. BYOK encryption) in order to protect the data from access by foreign intelligence services.
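The idea behind such technical measures is that data leaving the country is encrypted before export, with the key remaining exclusively with the exporter. The following is a minimal, illustrative sketch of that idea; the XOR "cipher" is a deliberately simplified stand-in for a real algorithm such as AES, and all names are hypothetical:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad: XOR each byte with the corresponding key byte.
    # Stands in for a real cipher (e.g. AES-GCM); not for actual use.
    return bytes(b ^ k for b, k in zip(data, key))

record = b"name=Jane Doe;email=jane@example.ch"
# With BYOK, the key is generated and held only by the Swiss exporter,
# never shared with the importer or accessible to foreign authorities.
exporter_key = secrets.token_bytes(len(record))

ciphertext = xor_cipher(record, exporter_key)   # this is what gets exported

assert ciphertext != record                     # importer sees only ciphertext
assert xor_cipher(ciphertext, exporter_key) == record  # exporter can recover it
```

Because the importer never holds the key, even a lawful access request served on the importer yields only ciphertext, which is the property the EDÖB's recommendation relies on.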

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles which are stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regards to data retention mandates, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server where matches are made with other contacts. Under the decentralised version, the data is kept on the mobile device of the user, giving users more control over their data. The EDPB does not recommend either approach. Instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals’ rights.
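The decentralised approach described above can be sketched in a few lines. The following is a simplified illustration, loosely modelled on published designs such as DP-3T and the Google/Apple Exposure Notification system; the function names, key sizes and interval scheme are assumptions for the sake of the example, not any specific protocol:

```python
import hashlib
import hmac
import secrets

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    # Derive a short, unlinkable identifier for one broadcast interval
    # from a secret daily key. Only the key holder can reproduce it.
    mac = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]

# Each phone generates and keeps its own daily key locally (decentralised).
alice_key = secrets.token_bytes(32)

# Alice's phone broadcasts a fresh ID each interval; Bob's phone only logs
# the IDs it hears over Bluetooth, never Alice's key or identity.
heard_by_bob = {rolling_id(alice_key, i) for i in range(3)}

# If Alice tests positive, she publishes her daily key. Bob's phone can then
# recompute her IDs on-device and check for a match -- no central matching server.
recomputed = {rolling_id(alice_key, i) for i in range(3)}
assert heard_by_bob == recomputed
```

The privacy trade-off the EDPB weighs is visible here: matching happens on the user's device, so the server never learns the contact graph, at the cost of positive users publishing their daily keys.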

Before implementing contact tracing apps, the EDPB also suggests that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is considered likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, it strongly recommends publishing DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has been facing a lot of public criticism because of its weaknesses in data security and lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it had wrongly advertised providing full end-to-end encryption for all shared content like video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online tech magazine reported that Zoom leaked thousands of its users’ email addresses and photos to strangers.
  • Video-conferences which users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images and spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data about its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom has since issued an update terminating this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. This plan includes freezing the introduction of new features, enlarging its cybersecurity team, and engaging outside help from security advisors.

Uber to pay another fine for 2016 data breach

27. December 2018

Uber’s major data breach of 2016 still has consequences as it has also been addressed by the French Data Protection Authority “CNIL”.

As reported in November 2017 and September 2018, the company had tried to hide that personal data of 50 million Uber customers had been stolen and chose to pay the hackers instead of disclosing the incident to the public.

1.4 million French customers were affected as well, which is why the CNIL has now fined Uber €400,000 (in addition to the settlement with the US authorities amounting to $148 million).

The CNIL came to find out that the breach could have been avoided by implementing certain basic security measures such as stronger authentication.

Great Britain and the Netherlands have also already imposed fines totalling €1 million.

New and surprising password guidelines released by NIST

21. December 2017

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice for user password management.

Considering that Bill Burr, the pioneer of password management, has admitted regretting the recommendations he published back in 2003, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements don’t necessarily improve the level of security, but instead might even make it easier for hackers to crack the code as people tend to make minor changes when they have to change their already complex password – usually pressed for time.

This is why the NIST now recommends letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. The NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

It takes years for computers to figure out passwords with 20 or more characters as long as the password is not commonly used.
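A rough back-of-the-envelope calculation illustrates why length dominates complexity. The guess rate and character-set sizes below are assumptions chosen for the sake of the example, not figures from the guidelines:

```python
def years_to_exhaust(charset: int, length: int, guesses_per_sec: float) -> float:
    # Time to try every possible password of the given length
    # at a fixed guessing rate, expressed in years.
    return charset ** length / guesses_per_sec / (3600 * 24 * 365)

# Assumed attacker: 1e12 guesses/second (a well-resourced cracking rig).
short_complex = years_to_exhaust(95, 8, 1e12)   # 8 chars, all printable ASCII
long_simple = years_to_exhaust(26, 20, 1e12)    # 20 chars, lowercase only

# The long, "simple" passphrase vastly outlasts the short "complex" password.
assert long_simple > short_complex
```

Under these assumptions, the full 8-character space falls in under a day, while the 20-character lowercase space takes hundreds of millions of years, provided the passphrase is not a commonly used one.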

The NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to, passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
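The screening approach described here can be sketched in a few lines. The helper name and the blocklist below are illustrative; real deployments would screen against full breach corpuses and dictionaries:

```python
def is_acceptable(password: str, blocklist: set) -> bool:
    # NIST-style checks: favor length and a blocklist
    # over composition rules (capitals, digits, symbols).
    if len(password) < 8:              # enforce a minimum length only
        return False
    if password.lower() in blocklist:  # reject known-bad passwords
        return False
    if len(set(password)) == 1:        # reject repetitive ones ('aaaaaa')
        return False
    return True

# Illustrative blocklist; in practice, use breach corpuses and dictionaries.
common = {"password", "123456", "qwerty", "1234abcd"}

assert not is_acceptable("aaaaaaaa", common)   # repetitive
assert not is_acceptable("1234abcd", common)   # on the blocklist
assert is_acceptable("correct horse battery staple", common)  # long passphrase
```

Note that no complexity rule appears anywhere: a long, unique passphrase passes, while a short or breached password fails, which is exactly the shift the guidelines describe.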

With this, the NIST completely abandons its own former suggestions, to the great relief of industries all over:

„Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.“

UK government to meet tech giants after Westminster attack

28. March 2017

In consequence of the Westminster Bridge attack in London, Home Secretary Amber Rudd announced that she wants to meet several tech giants in order to make sure law enforcement is able to access encrypted data for terrorism investigation.

The topic came up as the attacker reportedly used the messaging application WhatsApp shortly before his attack began. As WhatsApp uses end-to-end encryption, neither law enforcement nor WhatsApp itself can read messages. The same applies to Apple’s iMessage. While Rudd did not want to make public which tech companies she will meet in detail, Google confirmed that it will be meeting the UK government.

“We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other,” Rudd said. Labour leader Jeremy Corbyn, however, stated that law enforcement already had enough powers and that there needed to be a balance between the right to know and the right to privacy.

In the meantime, Microsoft confirmed that it had provided email information relating to the Westminster Bridge attack to the British authorities after it had received lawful orders.

CIA’s circumvention methods on Wikileaks

10. March 2017

On Tuesday, 7th March, Wikileaks released around 9,000 pages of documents on the U.S. Central Intelligence Agency’s hacking methods, called “Year Zero”, revealing the CIA’s methods for circumventing the hardware and software of the world’s top technology products (including the exploitation of smartphone operating systems). These methods are believed to allow agents to circumvent encryption apps.

According to a Reuters report, law enforcement and U.S. intelligence suspect that U.S. government contractors likely handed the information over to Wikileaks.

However, after the cases of government contractor employees Harold Thomas Martin and Edward Snowden, leaks of sensitive government information hardly come as a surprise anymore.

A Google director, as well as Apple, Microsoft and Samsung, stated that they are continuously and carefully looking into any identified vulnerabilities in order to implement the necessary protections.

Even though the authenticity of the leaks has yet to be confirmed, the CIA has expressed its concern about the matter.

Open Whisper Systems confirmed that the Signal protocol’s encryption was not broken, even though the New York Times originally reported that the CIA could break the encryption of the WhatsApp, Signal and Telegram apps.

Category: Cyber security · Encryption · USA