Category: Encryption

Swiss Data Protection Commissioner: “Swiss-U.S. Privacy Shield not providing adequate level of Data Protection”

28. September 2020

Following the recent ruling by the Court of Justice of the European Union (“CJEU”), the Swiss Data Protection Commissioner (“EDÖB”) published a statement concerning the level of Data Protection afforded to data transfers under the Swiss-U.S. Privacy Shield. The “Schrems II” decision by the CJEU is not legally binding in Switzerland because Switzerland is neither an EU nor an EEA country. But as the EDÖB and the European Data Protection Authorities work closely together, the decision already has initial implications for Swiss data exporters.

In accordance with Swiss Data Protection law (Art. 7 VDSG), the Swiss Data Protection Commissioner maintains a publicly accessible list of countries, assessing the level of Data Protection guaranteed by each of them. This list serves Swiss data exporters as guidance for their data exporting activities and acts as a rebuttable presumption. EU and EEA countries have continuously been listed in the first column of the list because they are regarded as providing an adequate level of Data Protection. The U.S. had been listed in the second column as a country providing “adequate protection under certain conditions”, the condition being certification of U.S. data importers under the Swiss-U.S. Privacy Shield.

Subsequent to the CJEU ruling, the EDÖB decided to move the U.S. to the third column as a country providing “inadequate protection”, thereby also acting on his past annual reviews of the Swiss-U.S. Privacy Shield. In those reviews, the EDÖB had already criticised that data subjects in Switzerland lack access to the courts in the U.S. in the event of Data Protection violations and that the Ombudsman mechanism is ineffective in this regard.

Lastly, the EDÖB pointed out that the Swiss-U.S. Privacy Shield remains in effect, since there has not yet been a decision by Swiss courts comparable to the CJEU ruling, and that his assessment has the status of a recommendation. However, the EDÖB advises Swiss data exporters to always carry out a risk assessment when transferring Personal Data to countries with “inadequate protection” and, where possible, to apply technical measures (e.g. BYOK encryption, where the exporter brings and keeps its own key) in order to protect the data from access by foreign intelligence services.
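A minimal sketch of what such a technical measure could look like in practice, assuming the Python cryptography package and illustrative function names (this is not any specific provider’s BYOK API): the exporter encrypts the data with a key that never leaves its own control, so the foreign provider only ever stores ciphertext.

```python
# Client-side ("hold your own key") encryption sketch: data is encrypted before
# it is transferred to a provider in a country with "inadequate protection".
# Assumes the third-party "cryptography" package; names are illustrative.
from cryptography.fernet import Fernet

def generate_exporter_key() -> bytes:
    """The key is generated and kept exclusively by the Swiss data exporter."""
    return Fernet.generate_key()

def encrypt_before_export(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt locally; only the ciphertext is handed to the foreign provider."""
    return Fernet(key).encrypt(plaintext)

def decrypt_after_retrieval(ciphertext: bytes, key: bytes) -> bytes:
    """Decryption is only possible with the key held by the exporter."""
    return Fernet(key).decrypt(ciphertext)

key = generate_exporter_key()                      # stays with the exporter
blob = encrypt_before_export(b"personal data record", key)
# `blob` can be stored with a U.S. provider; without `key` it is unreadable.
assert decrypt_after_retrieval(blob, key) == b"personal data record"
```

The point of such a measure is that the provider (and any authority compelling it) only ever sees ciphertext, while the key remains under the exporter’s control.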

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regard to data retention, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends specifying, as soon as practicable, the criteria for determining when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server where matches are made with other contacts. Under the decentralised version, the data is kept on the mobile device of the user, giving users more control over their data. The EDPB does not recommend either approach; instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals’ rights.
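For illustration, here is a minimal sketch of the decentralised idea, loosely modelled on proposals such as DP-3T rather than on any specific national app; the key lengths, epoch size and function names are assumptions made for this example. Each phone derives short-lived pseudonymous identifiers from a secret key it keeps locally, remembers the identifiers it observes over Bluetooth, and performs the matching entirely on the device once diagnosed users voluntarily publish their daily keys.

```python
# Decentralised contact-tracing sketch: identifiers are derived locally and
# matching happens on the user's own device. All parameters are illustrative.
import hashlib
import hmac
import os

EPOCHS_PER_DAY = 96  # e.g. one rotating identifier per 15 minutes (assumption)

def new_daily_key() -> bytes:
    """Each device generates its own secret daily key; it only leaves the phone
    if the user is diagnosed and chooses to upload it."""
    return os.urandom(32)

def ephemeral_id(daily_key: bytes, epoch: int) -> bytes:
    """Pseudonymous identifier broadcast over Bluetooth during one epoch."""
    return hmac.new(daily_key, f"epoch-{epoch}".encode(), hashlib.sha256).digest()[:16]

def local_match(observed_ids: set, published_keys: list) -> bool:
    """Recompute identifiers from the published keys of diagnosed users and
    check for an intersection with what this device observed."""
    for key in published_keys:
        for epoch in range(EPOCHS_PER_DAY):
            if ephemeral_id(key, epoch) in observed_ids:
                return True
    return False

# Phone A observed one of phone B's identifiers; B later tests positive and
# uploads its daily key, so A detects the contact without any central matching.
key_b = new_daily_key()
observed_by_a = {ephemeral_id(key_b, 42)}
assert local_match(observed_by_a, [key_b])
```

In the centralised variant, the same matching step would instead run on a health authority’s server against uploaded contact logs, which is precisely the trade-off the EDPB asks policy makers to weigh.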

Before implementing contact tracing apps, the EDPB also suggests that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is considered likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, it strongly recommends publishing DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

The video-conferencing service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has faced a lot of public criticism because of weaknesses in data security and a lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it had wrongly advertised full end-to-end encryption for all shared content such as video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online tech magazine reported that Zoom leaked thousands of its users’ email addresses and photos to strangers.
  • Video conferences that users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data about its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom has since released an update that terminates this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. This plan includes freezing the introduction of new features, enlarging the cybersecurity team and engaging outside security advisors.

Uber to pay another fine for 2016 data breach

27. December 2018

Uber’s major data breach of 2016 still has consequences, as it has now also been addressed by the French Data Protection Authority, the CNIL.

As reported in November 2017 and September 2018, the company had tried to hide the fact that personal data of 50 million Uber customers had been stolen, choosing to pay the hackers instead of disclosing the incident to the public.

1.4 million French customers were also affected, which is why the CNIL has now fined Uber €400,000 (in addition to the settlement with the US authorities amounting to $148 million).

The CNIL found that the breach could have been avoided by implementing certain basic security measures, such as stronger authentication.

Great Britain and the Netherlands have also already imposed fines totalling around €1 million.

New and surprising password guidelines released by NIST

21. December 2017

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management guidance, has admitted to regretting the recommendations he published back in 2003, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements do not necessarily improve the level of security but might instead even make it easier for hackers to crack passwords, as people tend to make only minor changes when forced to change an already complex password – usually while pressed for time.

This is why the NIST now recommends letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasise the importance of password length. The NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

Provided the password is not a commonly used one, it takes computers years to figure out passwords with 20 or more characters: even a lowercase-only 20-character password allows 26^20 ≈ 2 × 10^28 combinations, so an exhaustive search at a trillion guesses per second would still take hundreds of millions of years.

The NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to, passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
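A minimal sketch of such a screening step, assuming a locally maintained blocklist of breached and commonly chosen passwords (the list contents, thresholds and function names are illustrative; SP 800-63B prescribes the checks, not an implementation):

```python
# NIST-style password screening sketch: check length and a blocklist instead of
# enforcing composition rules. The blocklist contents are illustrative only.
MIN_LENGTH = 8  # SP 800-63B minimum length for user-chosen memorized secrets

BLOCKLIST = {
    "password", "12345678", "qwerty123",   # commonly chosen passwords
    "aaaaaaaa", "1234abcd",                # repetitive / sequential patterns
}

def is_acceptable(password: str, username: str, service_name: str) -> bool:
    lowered = password.lower()
    if len(password) < MIN_LENGTH:
        return False
    if lowered in BLOCKLIST:
        return False
    # Reject context-specific words such as the service name or the username.
    if username.lower() in lowered or service_name.lower() in lowered:
        return False
    return True

print(is_acceptable("correct horse battery staple", "alice", "example"))  # True
print(is_acceptable("1234abcd", "alice", "example"))                      # False
```

In a real deployment the blocklist would be far larger, e.g. drawn from published breach corpuses rather than a handful of hard-coded strings.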

With this, the NIST completely abandons its own earlier suggestions, to the great relief of industries all over:

“Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.”

UK government to meet tech giants after Westminster attack

28. March 2017

In the wake of the Westminster Bridge attack in London, Home Secretary Amber Rudd announced that she wants to meet several tech giants in order to make sure law enforcement is able to access encrypted data in terrorism investigations.

The topic came up as the attacker reportedly used the messaging application WhatsApp shortly before his attack began. As WhatsApp uses end-to-end encryption, neither law enforcement nor WhatsApp itself can read messages. The same applies to Apple’s iMessage. While Rudd did not want to make public which tech companies she will meet in detail, Google confirmed that it will be meeting the UK government.

“We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other,” Rudd said. Labour leader Jeremy Corbyn, however, stated that law enforcement already had enough powers and that there needed to be a balance between the right to know and the right to privacy.

In the meantime, Microsoft confirmed that it had provided email information relating to the Westminster Bridge attack to the British authorities after it had received lawful orders.

CIA’s circumvention methods on WikiLeaks

10. March 2017

On Tuesday, 7 March, WikiLeaks released around 9,000 pages of documents on the U.S. Central Intelligence Agency’s hacking methods, called “Year Zero”, which reveal the CIA’s methods for circumventing the hardware and software of the world’s top technology products (including the exploitation of smartphone operating systems). These methods are believed to allow agents to circumvent encryption apps.

According to a Reuters report, law enforcement and U.S. intelligence officials suspect that U.S. government contractors likely handed the information over to WikiLeaks.

However, after the earlier cases of government contractor employees Harold Thomas Martin and Edward Snowden, leaks of sensitive government information no longer come as a surprise.

A Google director as well as Apple, Microsoft and Samsung stated that they continuously look into any identified vulnerabilities in order to implement the necessary protections.

Even though the authenticity of the leaks has yet to be confirmed, the CIA has expressed its concern about the matter.

Open Whisper Systems has confirmed that the Signal protocol’s encryption was not broken, even though the New York Times originally reported that the CIA could break the encryption of the WhatsApp, Signal and Telegram apps.


Hundreds of thousands of users affected by CloudPets data breach

2. March 2017

Yet another toy maker, Spiral Toys, has hit the headlines. The company suffered a large data breach involving its stuffed animals, called CloudPets, resulting in the disclosure of the personal data of some 800,000 users, such as email addresses, passwords and profile pictures, as well as 2 million voice recordings.

Spiral Toys’ CloudPets are able to connect to an app on a smartphone via Bluetooth so that parents can provide the toy with voice messages for their children.

The personal data were stored in an online database without authentication requirements, so that hackers could easily access the database. According to web security expert Troy Hunt, the passwords were stored as hashes rather than in plain text, but Spiral Toys set no requirements for password strength. That means hackers “could crack a large number of passwords, log on to accounts and pull down the voice recordings”.
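To illustrate Hunt’s point, here is a minimal sketch (using the third-party bcrypt package; the passwords and data are invented): even when passwords are stored as hashes rather than in plain text, an attacker who obtains the database can still recover weak, commonly chosen passwords simply by trying them, which is why strength requirements matter.

```python
# Sketch: hashed passwords cannot be read directly from a leaked database, but
# weak choices still fall to a small dictionary attack. All data is invented.
import bcrypt

# What a leaked database row might contain: a bcrypt hash of a weak password.
stored_hash = bcrypt.hashpw(b"qwerty", bcrypt.gensalt())

def dictionary_attack(leaked_hash: bytes, candidates: list) -> bytes:
    """Try a list of common passwords against one leaked hash."""
    for guess in candidates:
        if bcrypt.checkpw(guess, leaked_hash):
            return guess
    return b""

common_passwords = [b"123456", b"password", b"qwerty", b"cloudpets"]
print(dictionary_attack(stored_hash, common_passwords))  # b'qwerty'
```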

Spiral Toys’ Mark Meyers denied that voice recordings were stolen. Still, the company intends to tighten its password strength requirements now that the data breach has been made public.

Both the decision of the German Federal Network Agency to take the doll “My friend Cayla” off the market in Germany and the data breach suffered by Spiral Toys show that the privacy concerns smart toy producers are exposed to should be taken seriously.

Use of encryption app increases after US election

6. December 2016

BuzzFeed News reported that, after the election of Donald Trump, the app Signal saw a 400 percent rise in daily downloads.

The app is a secure communications tool and is therefore well known in technology, journalism and politics circles. It lets people text and speak with one another using end-to-end encryption, so that only the sender and the intended recipient can read or hear the respective message.
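The Signal protocol itself (with its Double Ratchet and forward secrecy) is far more involved, but the basic end-to-end idea can be sketched with public-key authenticated encryption, here using the PyNaCl library purely as an illustration:

```python
# End-to-end idea in miniature: the message is encrypted with keys held only by
# the two endpoints, so no server in between can read it. This is NOT the
# Signal protocol, just an illustration using PyNaCl authenticated encryption.
from nacl.public import PrivateKey, Box

sender_key = PrivateKey.generate()       # stays on the sender's device
recipient_key = PrivateKey.generate()    # stays on the recipient's device

# The sender encrypts with their private key and the recipient's public key.
sender_box = Box(sender_key, recipient_key.public_key)
ciphertext = sender_box.encrypt(b"meet at 6pm")

# Only the recipient, holding the matching private key, can decrypt.
recipient_box = Box(recipient_key, sender_key.public_key)
print(recipient_box.decrypt(ciphertext))  # b'meet at 6pm'
```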

Signal’s founder, Moxie Marlinspike, released a statement saying that “There has never been a single event that has resulted in this kind of sustained, day-over-day increase.” Marlinspike explained that “Trump is about to be put in control of the most pervasive, largest, and least accountable surveillance infrastructure in the world (…) People are maybe a bit uncomfortable with him.”


EU Member States address issues on encryption in criminal investigations

30. November 2016

Recently, Italy, Latvia, Poland, Hungary and Croatia proposed new legislation that would make it easier for police investigators to access different entities’ encrypted information and thus to crack open encryption technology.

According to the Polish officials, “One of the most crucial aspects will be adopting new legislation that allows acquisition of data stored in EU countries in the cloud”.

European countries were asked by the Slovakian government (which holds the current presidency of the EU Council) to describe how their law enforcement authorities deal with technology that prevents communications from being intercepted when they are not otherwise able to obtain the information.

Via a freedom of information request, twelve countries, among them Finland, Italy, Sweden and Poland, responded to the Dutch internet rights NGO Bits of Freedom that they frequently encounter encrypted data while carrying out criminal investigations. The UK and Latvia indicated that this happens ‘almost always’.

Ultimately, a dispute has arisen between Germany and the European Union over prohibiting or creating backdoors that weaken encryption for digital and telecommunication services.

Even though Germany has dismissed charges that the government is pushing companies to create encryption backdoors in their products, Angela Merkel has announced that investigators will pay more attention to tracing criminals who use the darknet and encryption, especially since the shooting in Munich in July.

So far, however, Europol, ENISA and the Commission’s vice president Andrus Ansip oppose creating backdoors that weaken encryption.
