Category: Cyber security

Transatlantic Data Transfers in light of the Two Year Anniversary of GDPR Application

7. July 2020

In the two years since the General Data Protection Regulation (GDPR) became applicable on May 25, 2018, it has received overall positive feedback and has shaped data protection culture not only in the European Union, but has also set an example for international privacy standards.

However, criticism has been constant, especially from the United States. Differing principles lead to differing opinions and priorities, and the effort to reconcile European data protection standards with American personal data business practices has been a challenge for both sides.

One of the main criticisms from the US government concerns the increasing obstacles the GDPR poses to cybercrime investigations and law enforcement. Not only are the restrictive implications of the GDPR an issue; government officials also see the divergent interpretations resulting from national adaptations of the GDPR as a problem.

In cases of cybercrime, the main issue for US critics is the now less effective database of domain name owners, WHOIS. The online directory, created in the 1970s, is an important tool for law enforcement combating cybercrime. Before the GDPR came into effect in 2018, requesting information on domain owners was straightforward; now, due to the restrictions of the GDPR, the process has become long and tedious.
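Since the GDPR took effect, registrars have replaced registrant fields in public WHOIS output with redaction notices, which is what makes lookups far less useful to investigators. As a rough illustration (the sample record and field names below are hypothetical; real registry output varies), a parsed response now looks like this:

```python
# Hypothetical, simplified WHOIS response; real registry output varies.
SAMPLE_RESPONSE = """\
Domain Name: example-domain.com
Registrant Name: REDACTED FOR PRIVACY
Registrant Email: REDACTED FOR PRIVACY
Registrar: Example Registrar, Inc.
"""

def parse_whois(raw: str) -> dict:
    """Split the 'Key: Value' lines of a WHOIS response into a dict."""
    record = {}
    for line in raw.splitlines():
        key, sep, value = line.partition(":")
        if sep:
            record[key.strip()] = value.strip()
    return record

record = parse_whois(SAMPLE_RESPONSE)
# The fields identifying the domain owner are no longer public.
redacted = [k for k, v in record.items() if v == "REDACTED FOR PRIVACY"]
print(redacted)  # ['Registrant Name', 'Registrant Email']
```

Where investigators once read a registrant's name and e-mail directly from such a record, they now have to file a disclosure request with the registrar, which is the "long and tedious" process critics refer to.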

But fighting cybercrime is not the only source of tension between the EU and the USA concerning data protection. In the Schrems II case, the European Court of Justice (ECJ) is expected to rule on July 16, 2020 on transatlantic data transfers and the current Privacy Shield, which is the basis for EU-US data flows under adequate data protection standards. If the Privacy Shield is deemed to provide insufficient protection, the ruling will have a major effect on EU-US business transactions.

However, these are issues the European Commission (EC) is well aware of. In its communication on the two-year review of the GDPR, the Commission stated that it plans to balance out diverging and fragmented interpretations of the GDPR at the national level and to build a common data protection culture within Europe.

In addition, the restrictions the GDPR poses to law enforcement are another point the European Commission intends to address. The plan for the future is a bilateral and multilateral framework that allows for simple data-sharing requests for law enforcement purposes and avoids conflicts of law, while keeping data protection safeguards intact.

The Commission is watching the upcoming ECJ judgement closely and will incorporate it into its upcoming adequacy decisions and re-evaluations, as well as into its development of a modern international transfer toolbox, which includes a modernised version of the standard contractual clauses.

Overall, the GDPR's two-year mark is regarded as a success, despite clear areas for future improvement. One of the big challenges ahead for transatlantic data transfers is without doubt the outcome of the Schrems judgement in mid-July, the implications of which cannot yet be assessed.

German State Data Protection Commissioner imposes 1.2 million € GDPR fine

1. July 2020

The German State Data Protection Commissioner of Baden-Württemberg (“LfDI Ba-Wü”) imposed a GDPR fine of €1,240,000 on the German statutory health insurance provider AOK Baden-Württemberg (“AOK”). The fine was the result of the health insurer’s lack of technical and organisational measures pursuant to Art. 32 GDPR. It is the highest fine the LfDI Ba-Wü has ever imposed.

Between 2015 and 2019, the AOK organised lotteries on various occasions and collected personal data of the participants, including their contact details and current health insurance affiliations. The AOK wanted to use the data of the lottery participants for advertising purposes, insofar as the participants had given their consent. To ensure the security of processing, the AOK implemented internal guidelines and data protection training of its staff as technical and organisational measures. However, these measures were not sufficient to comply with Art. 32 GDPR, because AOK staff used the personal data of more than 500 lottery participants for advertising purposes without their prior consent.
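The shortcoming described above amounts to a missing consent check before marketing use. A minimal sketch of such a consent gate, with purely illustrative field names, might look like this:

```python
# Illustrative participant records; field names are invented for this sketch.
participants = [
    {"name": "P1", "email": "p1@example.com", "marketing_consent": True},
    {"name": "P2", "email": "p2@example.com", "marketing_consent": False},
    {"name": "P3", "email": "p3@example.com"},  # no consent recorded at all
]

def advertising_audience(records):
    """Return only records with documented prior consent to marketing use."""
    return [r for r in records if r.get("marketing_consent") is True]

print([r["name"] for r in advertising_audience(participants)])  # ['P1']
```

The point of defaulting to exclusion (anyone without an explicit, documented opt-in is filtered out) is exactly what Art. 32 GDPR's "appropriate technical measures" would require here, rather than relying on staff to remember the rule.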

Following the investigation by the LfDI Ba-Wü, the AOK immediately stopped all marketing activities in order to review its internal policies and processes for GDPR compliance. The LfDI Ba-Wü explained that, in determining the amount of the fine, it considered the following mitigating factors:

  • the cooperation of the AOK with the Data Protection Authority,
  • the fact that the AOK as a statutory health insurance provider is an important part of the German healthcare system, and
  • the burdens of the current Corona-Pandemic on the healthcare system.

Finally, the Commissioner pointed out that technical and organisational measures must be regularly adjusted to the actual conditions of each processing activity, in order to ensure an adequate level of data protection in the long term.

Thailand postpones Enforcement of new Personal Data Protection Act

22. June 2020

In response to the European General Data Protection Regulation (“GDPR”) becoming applicable in 2018, Thailand adopted its first-ever Personal Data Protection Act (“PDPA”) into law on 28 May 2019. As it is fashioned after the GDPR, the PDPA is built around principles that largely align with the GDPR, especially in the areas of data protection principles, legal bases, and data subject rights. Originally, the PDPA was set to become applicable one year after its adoption, on 27 May 2020.

Now, the Thai Government has approved a draft decree by the Ministry of Digital Economy and Society (“MDES”) to postpone the enforcement of most sections of the PDPA to 31 May 2021. The MDES explained that the reasons for the delay are the current Corona pandemic and its strain on businesses, as well as the fact that many businesses are not yet prepared for PDPA compliance. Notably, Brazil also postponed the enforcement of its new Data Protection Law (“LGPD”) for similar reasons (we reported).

The only sections of the PDPA that will be enforced as originally planned concern the appointment of the Personal Data Protection Committee members and the establishment of the Office of the Personal Data Protection Committee. Whilst the delay allows companies more time to become PDPA compliant, the lack of enforcement of data subject rights in the meantime is a big concern for critics, especially in light of the recent adoption of Thailand’s controversial new cybersecurity law.

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regards to data retention mandates, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised approach, the anonymised data gathered by the app is uploaded to a remote server where matches with other contacts are made. Under the decentralised approach, the data is kept on the user's mobile device, giving users more control over their data. The EDPB does not recommend either approach; instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals' rights.
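As a rough sketch of the decentralised approach (the class and method names below are illustrative, not any official protocol): each phone broadcasts rotating random ephemeral IDs, remembers the IDs it hears nearby, and performs the exposure match locally against the IDs published by infected users, so no contact graph ever leaves the device.

```python
import secrets

class Phone:
    """A device in a decentralised contact tracing scheme (illustrative)."""

    def __init__(self):
        self.broadcast_ids = []   # ephemeral IDs this phone has sent out
        self.heard_ids = set()    # IDs observed from nearby phones

    def new_ephemeral_id(self) -> bytes:
        eid = secrets.token_bytes(16)  # rotating random identifier
        self.broadcast_ids.append(eid)
        return eid

    def hear(self, eid: bytes) -> None:
        self.heard_ids.add(eid)

    def check_exposure(self, published_infected_ids) -> bool:
        # Matching happens on the device; only infected users' own
        # broadcast IDs are ever published to the shared list.
        return bool(self.heard_ids & set(published_infected_ids))

alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.new_ephemeral_id())   # Bob was near Alice
carol.new_ephemeral_id()             # Carol had no contact with Alice

# Alice tests positive and publishes only the IDs she broadcast herself.
print(bob.check_exposure(alice.broadcast_ids))    # True
print(carol.check_exposure(alice.broadcast_ids))  # False
```

In the centralised variant, the `heard_ids` sets would instead be uploaded to a server that performs the matching, which is precisely the design trade-off the EDPB asks authorities to weigh.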

Before implementing contact tracing apps, the EDPB also insists that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is considered likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, it strongly recommends publishing the DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

easyJet Data Breach: 9 million customers affected

22. May 2020

The British airline ‘easyJet’ has been hacked. The hackers have been able to access personal data of approximately 9 million customers.

easyJet published a statement on the attack and announced that e-mail addresses and travel details were among the affected personal data. Exactly which personal data fall under ‘travel details’ was not disclosed. In some cases, the hackers were also able to access credit card data. easyJet stated that there is no evidence that the accessed personal data has been misused. easyJet now warns of fake e-mails sent in its name as well as in the name of ‘easyJet Holidays’.

easyJet noticed the hack in January, but only made it public this week. Upon becoming aware of the attack, easyJet took several measures and has since blocked the unauthorised access. easyJet is also in contact with the British Data Protection Authority, the ICO, and the National Cyber Security Centre.

At this time, easyJet has not yet been able to determine how the attack occurred, but it explained that this was no ordinary hacker attack, as it was highly sophisticated compared to others. It is suspected that the attack originated from a group that has already hacked other airlines, such as British Airways in 2018.

easyJet announced that it will contact the affected data subjects by May 26 to inform them about the breach and to explain further measures that should be taken in order to reduce the risk. easyJet customers who have not been contacted by then are not affected by the breach.

In connection with hacker attacks like these, the risk of phishing attacks is at its highest. In phishing attacks, criminals use fake e-mails, for example in the name of well-known companies or authorities, to try to persuade users to disclose personal data or to click on prepared e-mail attachments containing malware.
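One common phishing tell is a mismatch between the brand named in the sender's display name and the actual sender domain. A toy check for this pattern follows; the headers and the trusted-domain list are invented for illustration, and real phishing detection is far more involved:

```python
import re

TRUSTED_DOMAINS = {"easyjet.com"}  # illustrative allow-list

def looks_like_spoof(from_header: str) -> bool:
    """Flag mails that invoke the brand name but are sent from a domain
    outside the trusted list. A toy heuristic, not a real mail filter."""
    match = re.search(r"<[^@>]+@([^>]+)>", from_header)
    if not match:
        return False
    domain = match.group(1).lower()
    claims_brand = "easyjet" in from_header.lower()
    return claims_brand and domain not in TRUSTED_DOMAINS

print(looks_like_spoof("easyJet Support <refunds@easyjet-claims.net>"))  # True
print(looks_like_spoof("easyJet <noreply@easyjet.com>"))                 # False
```

This is why the advice after a breach is always to check the actual sender address, not just the displayed name, before clicking links or attachments.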

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom has gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has faced a great deal of public criticism over its weaknesses in data security and its lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it had falsely advertised full end-to-end encryption for all shared content such as video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online Tech Magazine reported that Zoom leaked thousands of their users’ email addresses and photos to strangers.
  • Video-conferences which users did not protect with a password, enabled “Zoombombing”, a phenomenon in which strangers hijacked videocalls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data of its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom released an update terminating this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom hosted a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away the meeting window for more than 30 seconds. In the meantime, Zoom disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. The plan includes freezing the introduction of new features, enlarging the cybersecurity team and engaging outside help from security advisors.

Apple and Google join forces during Corona Pandemic

17. April 2020

Apple and Google, two of the biggest internet companies, have announced that they will partner on the development of a COVID-19 contact tracing technology.

According to statements both companies published on their blogs, the aim of the partnership is to develop an app, or rather a technical tool, to support the protection of people and to help combat the virus. Furthermore, the tracing technology should help governments and health agencies reduce the spread of the virus.

Apple and Google want to develop a Bluetooth technology that can be used on iOS and Android devices and that can also be implemented in other providers’ apps via an API (Application Programming Interface), which is scheduled for release in May.

The tracing technology, using the Bluetooth function and encryption, is designed to detect the distance between two devices in order to identify potentially vulnerable people who have been in close contact with a person who tested positive for Corona. To this end, the devices exchange temporary ID numbers. If a person tests positive, he or she should change the status in the app in order to inform everyone the data subject has been in contact with over the past two weeks.

Both Apple and Google emphasise that they take data protection requirements seriously. According to the information provided, the data is first stored on the respective device and deleted automatically after two weeks. The data is only to be uploaded to a server after a change of status to ‘tested positive’ and with the consent of the data subject. The exchanged ID numbers are planned to be uploaded to a list anonymously. In order to increase trust, it is planned to publish the software source code. This would allow everyone to understand how the data is handled. In addition, this is intended to ensure that no data will be used for advertising purposes.
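The two-week retention described above boils down to pruning stored identifiers older than 14 days. A minimal sketch of such a retention rule follows; this illustrates the stated policy, not Apple's or Google's actual code, and the data layout is invented:

```python
from datetime import date, timedelta

RETENTION = timedelta(days=14)  # the two-week window stated above

def prune(stored_ids: dict, today: date) -> dict:
    """Keep only the IDs recorded within the retention window."""
    return {d: ids for d, ids in stored_ids.items() if today - d <= RETENTION}

# Illustrative store mapping recording date -> exchanged IDs for that day.
store = {
    date(2020, 4, 1): ["id-a"],   # older than two weeks -> dropped
    date(2020, 4, 10): ["id-b"],  # within the window -> kept
}
kept = prune(store, today=date(2020, 4, 17))
print(sorted(kept))  # [datetime.date(2020, 4, 10)]
```

Running such a prune automatically on the device, rather than relying on a server-side deletion policy, is what makes the retention limit verifiable once the source code is published.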

CNIL publishes new Guidance on Teleworking

14. April 2020

The French Data Protection Authority (CNIL) released guidance on teleworking on April 1st, which is intended to help employers master the new working situation. In particular, it is supposed to bring clarity to the IT requirements for ensuring a safe and well-functioning remote working environment.

In particular, the guidance touches on these following points to form a basis for coping with teleworking from an employer’s perspective:

  • It is recommended that employers formulate an IT charter or internal regulations on how to use the teleworking systems, to be followed by the employees,
  • Necessary measures have to be taken in case the systems have to be changed or adapted to the new situation,
  • It should be ensured that employee workstations meet the minimum requirements of a firewall, anti-virus software and a tool blocking access to malicious websites,
  • To avoid exposing services directly on the internet and to ensure security, the use of a VPN is recommended.

Furthermore, the CNIL has also given guidance for cases where an organization’s services are mainly performed over the internet. In such cases, it recommends following a few necessary requirements in order to make sure the services can be delivered safely and smoothly:

  • Using web protocols that guarantee confidentiality and authentication of the processes (such as https and sftp), and keeping them up to date,
  • Two-factor authentication,
  • No access to interfaces of non-secure servers,
  • Reviewing access logs of remotely accessible services to detect suspicious behaviour,
  • Ensuring that the equipment used has the latest security patches applied.
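The first point above, restricting services to confidentiality-preserving protocols such as https and sftp, can be sketched as a simple allow-list check on endpoint URLs (the scheme list and example URLs are illustrative, not part of the CNIL guidance):

```python
from urllib.parse import urlparse

SECURE_SCHEMES = {"https", "sftp"}  # protocols of the kind the guidance names

def is_secure_endpoint(url: str) -> bool:
    """Accept only endpoints using a confidentiality-preserving protocol."""
    return urlparse(url).scheme.lower() in SECURE_SCHEMES

print(is_secure_endpoint("https://portal.example.com/login"))  # True
print(is_secure_endpoint("http://portal.example.com/login"))   # False
```

An organisation might run a check like this over its inventory of remote-access endpoints to find services still exposed over plain http or ftp.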

The CNIL also offered some best practices for employees to follow in cases of working remotely, to give both sides pointers on how to deal with the changing situation.

Specifically, employees are recommended to ensure their Wi-Fi is secure by using encryption such as WPA2 or WPA3, along with a secure password. In addition, the CNIL recommends using work equipment provided by the employer, as well as a VPN provided by the company. When using personal devices, a firewall and anti-virus software are the minimum requirements for ensuring the security of the equipment, as is keeping the operating system and software updated with the newest patches.

Lastly, the CNIL warns of increased phishing attempts in relation to the COVID-19 outbreak.

Overall, the guidance and best practices the CNIL has published indicate a need for continuous and active vigilance in regards to teleworking, as well as the sharing of personal data in the process.

This guidance is in line with our past assessment of the remote working situation, which you are welcome to check out in the respective blogpost in our Series on Data Protection and Corona.

Germany: Large Data leak reveals Personal Data of more than 3 Million Customers

27. January 2020

The German car rental company Buchbinder is responsible for leaking the Personal Data of more than 3 million customers from all over Europe. The data leak exposed more than 10 terabytes of sensitive customer data over several weeks without the company noticing.

A German cybersecurity firm was executing routine network scans when it found the data leak. The firm reported it twice to Buchbinder via e-mail, but did not receive a reply. After that, the cybersecurity firm reported the leak to the Bavarian Data Protection Authority (DPA) and informed the German computer magazine c’t and newspaper DIE ZEIT.

According to c’t, a configuration error of a Backup-Server was the cause of the leak. The Personal Data exposed included customers’ names, private addresses, birth dates, telephone numbers, rental data, bank details, accident reports, legal documents, as well as Buchbinder employees’ e-mails and access data to internal networks.

The data leak is particularly serious because of the vast amount of leaked Personal Data, which could easily be abused for spam e-mails, fraud, phishing, or identity theft. It is therefore likely that the German DPA will impose a GDPR fine on the company.

Buchbinder released a press statement apologising for the data leak and promising to enhance the level of their defense and cybersecurity system.

Washington State Lawmakers Propose new Privacy Bill

23. January 2020

In January 2020, Washington lawmakers introduced a bill that would give state residents new privacy rights: the “Washington Privacy Act” (WPA).

If passed, the Privacy Act would enact a comprehensive data protection framework for Washington that includes individual rights very similar to, and in some respects going beyond, the rights in the California Consumer Privacy Act (CCPA), as well as a range of other obligations on businesses that do not yet exist in any U.S. privacy law.

Furthermore, the new draft bill contains strong provisions that largely align with the EU’s General Data Protection Regulation (GDPR), and commercial facial recognition provisions that start with a legal default of affirmative consent. Nonetheless, legislators must work within a remarkably short time-frame to pass a law that can be embraced by both House and Senate within the next six weeks of Washington’s legislative session. If passed, the bill would go into effect on July 31, 2021.

The current draft provides data protection to all Washington State residents and would apply to entities that conduct business in Washington or produce products or services targeted to Washington residents. Such entities must control or process data of at least 100,000 consumers, or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context). The draft bill would not apply to state and local governments or municipal corporations. The new bill would further provide all state residents, among other rights, the ability to opt out of targeted advertising.

The new draft bill would regulate companies that process “personal data,” defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person” (not including de-identified data or publicly available information, i.e. “information that is lawfully made available from federal, state, or local government records”), with specific provisions for pseudonymous data.
