France’s highest administrative court, the Conseil d’État, restricts the CNIL’s Cookie Guidelines

22. June 2020

On June 19th, 2020, the French Conseil d’État ordered the Commission Nationale de l’Informatique et des Libertés (CNIL), in a court decision, to withdraw particular provisions of its Guidelines on cookies and other tracers, which it had published in 2019.

The Conseil d’État had received several complaints from businesses and professional associations, which turned to the court in order to have the CNIL’s Guidelines overturned.

The main focus of the decision was the ban on cookie walls. Cookie walls are consent pages that deny users access to a website if they decline consent to the processing carried out by the cookies the website uses. In its 2019 Guidelines on cookies and other tracers, the CNIL had declared that such cookie walls were not in accordance with the principles of the General Data Protection Regulation (GDPR), prompting many businesses to challenge this provision before the Conseil d’État.

In its decision on the matter, the Conseil d’État declared that the CNIL, whose competence in such data protection matters is merely advisory and recommendatory, did not have the power to issue a general ban on cookie walls in its Guidelines.

However, the court did not address the question of whether a ban on cookie walls would in itself be lawful. The Conseil d’État refrained from making any substantive statement on the matter, leaving that question unanswered for the moment.

The Conseil d’État further stated that, for consent to be free and informed, the data subject must indeed be informed individually about each processing activity and its purpose before giving consent. However, businesses have the discretion to decide whether they collect the data subject’s consent through a single, global consent accompanied by specifically individualized privacy information, or through individual consent for each processing activity.

In the rest of its decision, the Conseil d’État confirmed the remainder of the CNIL’s guidelines and provisions on the matter as lawful and applicable, giving the complainants only limited reason to rejoice.

Thailand postpones Enforcement of new Personal Data Protection Act

In response to the European General Data Protection Regulation (“GDPR”) becoming applicable in 2018, Thailand adopted its first-ever Personal Data Protection Act (“PDPA”) into law on 28 May 2019. Modelled after the GDPR, the PDPA is built around principles that largely align with it, especially in the areas of data protection principles, legal bases, and data subject rights. Originally, the PDPA was to become applicable one year after its adoption, on 27 May 2020.

Now, the Thai Government has approved a draft decree by the Ministry of Digital Economy and Society (“MDES”) to postpone the enforcement of most sections of the PDPA to 31 May 2021. The MDES explained that the reasons for the delay are the current COVID-19 pandemic and its strain on businesses, as well as the fact that many businesses are not yet prepared for PDPA compliance. Notably, Brazil also postponed the enforcement of its new Data Protection Law (“LGPD”) for similar reasons (we reported).

The only sections of the PDPA that will be enforced as originally planned concern the appointment of the Personal Data Protection Committee members and the establishment of the Office of the Personal Data Protection Committee. While the delay gives companies more time to become PDPA compliant, the lack of enforcement of data subject rights in the meantime is a major concern among critics, especially in light of the recent adoption of Thailand’s controversial new cybersecurity law.

EDPB shares concerns over UK-US data deal in light of future UK adequacy decision

18. June 2020

On June 17th, 2020, the European Data Protection Board (EDPB) wrote an open letter to the Members of the European Parliament setting out its concerns regarding the Agreement between the United Kingdom (UK) and the USA on Access to Electronic Data for the Purpose of Countering Serious Crime in relation to a future UK adequacy decision after the country’s exit from the European Union.

In its letter, the EDPB states that it is concerned about the applicability of the safeguards in the Brexit withdrawal agreement with the EU once the transition period ends at the beginning of 2021. The Agreement between the UK and the US allows for broad data access for the prosecution of serious crimes and facilitates access requests to UK authorities and businesses under the US CLOUD Act, and it is unclear whether the safeguards agreed upon between the EU and the UK apply to such requests.

The EDPB also stresses that, in light of a potential data sharing agreement between the EU and the US, the European safeguards in such an agreement “must prevail over US domestic laws” in order for it to be “fully compatible with European laws”.

Furthermore, the letter states that “it is also essential that the safeguards include a mandatory prior judicial authorisation as an essential guarantee for access to metadata and content data”. In its preliminary assessment, the EDPB could not identify such a provision in the UK-US Agreement.

While the EDPB can currently only make a preliminary assessment based on the elements at its disposal, it states clearly that the Agreement between the UK and the US will have to be considered in any relevant adequacy decision in the future. This is especially important as there is a “requirement to ensure continuity of protection in cases of onwards transfers from the UK to another third country”.

In any case, the EDPB intends to release its own opinion on the matter if the European Commission should release a draft of the adequacy decision for the UK.

Germany Update: COVID-19 Tracing App launched in mid-June

On June 16th, 2020, Germany introduced its new COVID-19 tracing app, the “Corona-Warn-App”, and released it for download. Within the first day, over six million citizens downloaded the app, and the government hopes to see this number grow to increase the effectiveness of the approach.

Because the app was an open source project from the start, with unhindered access to its source code, security and data protection issues could be addressed throughout the seven weeks of its development, and the entire process remained transparent to future users.

Overall, the first impressions from a data protection perspective have been good, with the Federal Data Protection Commissioner (Bundesdatenschutzbeauftragter) Ulrich Kelber telling the Saarbrücker Zeitung that the app “gives a solid impression”, but that he would have liked to “have seen a Data Protection Impact Assessment before the launch”.

The data protection aspects

The German contact tracing app claims to place the highest importance on data protection and on transparency, so that users know what happens with their data.

Upon download, the app gives users the chance to read through a thorough privacy policy, providing all the information necessary to understand and consent to the use of their data. In effect, the personal data collected and stored remains minimal: consent to the use of the Exposure Notification Framework, TANs for test verification, and consent to a daily upload of the diagnosis keys, which are only stored for 14 days.

The app, developed by SAP and Deutsche Telekom, uses Bluetooth technology to judge exposure based on two criteria: the distance between two smartphones and the duration of the encounter. If the thresholds for both criteria are met, the phones exchange random key codes, which are stored on the devices for 14 days and checked there against positive test results. The app then tells the user whether their exposure risk is low or high and gives suggestions on how to act based on that level of risk. Because of this procedure, there is no need to collect personal information about the identity of the person. In particular, the exposure notification is not delivered in real time, which makes it impossible to reliably identify the coronavirus-positive person who was encountered.
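
To make this mechanism more concrete, the following minimal Python sketch illustrates how such a local check could work in principle. All names, thresholds and data structures here are simplified assumptions made for illustration; they do not reproduce the actual code of the Corona-Warn-App or of the Exposure Notification Framework.

```python
import secrets
from dataclasses import dataclass

@dataclass
class Encounter:
    rolling_id: bytes     # random identifier broadcast by the other phone
    distance_m: float     # estimated from Bluetooth signal attenuation
    duration_min: float   # how long the two devices stayed in range

# Illustrative thresholds; the real app derives risk from calibrated attenuation buckets.
MAX_DISTANCE_M = 2.0
MIN_DURATION_MIN = 10.0

def new_rolling_id() -> bytes:
    """Generate a random identifier; real implementations rotate these regularly."""
    return secrets.token_bytes(16)

def is_relevant(e: Encounter) -> bool:
    """Keep only encounters that meet both the distance and the duration criterion."""
    return e.distance_m <= MAX_DISTANCE_M and e.duration_min >= MIN_DURATION_MIN

def exposure_risk(local_log: list, published_positive_ids: set) -> str:
    """Match the locally stored encounter log against the published keys of
    confirmed cases; the whole check happens on the device."""
    matches = [e for e in local_log
               if is_relevant(e) and e.rolling_id in published_positive_ids]
    if not matches:
        return "no exposure"
    return "high risk" if len(matches) > 1 else "low risk"
```

The point of the sketch is that the matching and the risk classification never require the user’s identity: only random identifiers, distances and durations are involved.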

Furthermore, the app puts an emphasis not only on anonymity, but also on voluntariness. Whether and how to use the app is entirely up to the user. Users may disable the Exposure Notification Framework and decide for themselves whether they want to share test results with the app. This comes, of course, with limitations to the app’s effectiveness, but it gives users more control over the data they share.

One of the current deficits is that, due to a lack of hardware in the testing laboratories, test results cannot yet be verified by users scanning a QR code, as originally planned. In the meantime, a hotline has been set up, although this raises data protection concerns because it could be taken advantage of or abused.

Lastly, one of the major data protection aspects, which has caused a big stir in the case of tracing apps, is the storage of the information. The Corona-Warn-App stores user data in a decentralized manner, which means that there is no direct upload to a cloud; instead, the entire process happens on the users’ devices. This shields against potential misuse of the data by the parties involved in the development as well as by the government, and was recommended by the European Parliament and the European Data Protection Board as the safer storage option for these types of contact tracing apps.

Overview

While the app is only a few days past its launch, it has received a lot of praise for the way it handles the various data protection and IT security issues. It remains to be seen whether the 60% of citizens needed for maximum effectiveness can be mobilized to use the contact tracing app.

Future plans involve cross-border cooperation with other countries and their contact tracing apps in order to ensure the practicability and effectiveness of these apps, as containing the pandemic is an international venture.

Overall, the Corona-Warn-App seems to be a decent piece of work despite its hurried development. However, the app is only at the beginning of its life, and it remains to be seen how the developers will incorporate fixes for problems as they come up.

Hungary Update: EDPB publishes Statement on Art. 23 GDPR

17. June 2020

Since March 2020, Hungary has been in a “state of emergency” due to the COVID-19 pandemic. The country’s COVID-19-related emergency laws and state of emergency received worldwide criticism from constitutional experts, politicians and civil rights groups, because they allow the Prime Minister to rule by decree during the state of emergency and do not provide a predefined end date. During the state of emergency, Prime Minister Viktor Orbán made extensive use of his newly gained powers by passing more than a hundred decrees, including Decree No. 179/2020, which suspended the GDPR data subject rights in Art. 15-22 GDPR with respect to personal data processing for the purpose of preventing, understanding and detecting the coronavirus disease and impeding its further spread (we reported).

In response to this suspension of GDPR rights, the European Data Protection Board (“EDPB”) has recently published a Statement on restrictions on data subject rights pursuant to Art. 23 GDPR, which is the provision that Hungary’s measure was based on. This article allows the member states to restrict, by way of a legislative measure, the scope of the obligations and rights provided for in Articles 12 to 22 and Article 34, when such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard, inter alia, important objectives of general public interest of the Union or of a Member State such as public health.

In its Statement, the EDPB points out that any restriction must respect the essence of the right that is being restricted. If the essence of the right is compromised, the restriction must be considered unlawful. Since the data subject’s right of access and the right to rectification are fundamental rights according to Art. 8 para. 2 of the Charter of Fundamental Rights of the European Union, any restriction of those rights must be carefully weighed up by the member states in order to respect the essence of the rights. The EDPB considers that restrictions adopted in the context of a state of emergency which suspend or postpone the application of data subject rights without any clear limitation in time equate to a de facto blanket suspension and denial of those rights and are not compatible with the essence of the fundamental rights and freedoms.

The EDPB also recalls that restrictions under Art. 23 GDPR must be necessary and proportionate. It argues that restrictions which are imposed for a duration not precisely limited in time, which apply retroactively or which are subject to undefined conditions are not foreseeable for data subjects and are thus disproportionate.

Furthermore, the EDPB takes the view that in order to safeguard important objectives of general public interest such as public health (Art. 23 para. 1 lit. e GDPR), there must be a clearly established and demonstrated link between the foreseen restrictions and the objective pursued. The mere existence of a pandemic or any other emergency situation alone does not justify a restriction of data subject rights, especially if it is not clearly established how the restrictions help to deal with the emergency.

Following the international public backlash, the Parliament of Hungary passed legislation on 16 June 2020 to revoke the emergency laws as soon as the current state of emergency is terminated by the Government. Hungary’s Government announced in May that it intends to lift the state of emergency on 20 June 2020. After that, the restrictions on the GDPR rights shall be lifted as well, so that data subjects may exercise their rights under Art. 15-22 GDPR again.

Germany’s Constitutional Court curbs Federal Intelligence Service’s competence

16. June 2020

In a court ruling of May 19th, 2020 concerning the German Federal Intelligence Service (BND) and its manner of operation, the German Constitutional Court ruled that the BND is bound by fundamental rights when conducting surveillance of foreigners, even outside of Germany’s federal territory.

Background

The case, which was brought before the court as a constitutional complaint by a collective of foreign journalists, has its origins in the disclosures made by Edward Snowden in 2013, through which some of the BND’s practices in relation to strategic foreign surveillance came to light. In 2016, German legislators passed a new law intended to regulate surveillance carried out by the BND. However, that law mainly restricted surveillance of German citizens and of foreigners living in Germany; it was criticized for doing nothing to restrict and regulate the BND’s actions abroad, which thus did not have to abide by any legal provisions. The constitutional complaint deals with the BND’s strategic surveillance of foreign reporters and journalists and of the highly confidential data they need to perform their work, which risks being exchanged with the intelligence agencies of their own countries and could thereby expose them to state measures taken against them.

The key points

Territorial Scope. One of the central points of the ruling is the definition of the territorial scope of the fundamental rights at issue in this case. Since the complainants are journalists from outside German territory, the Constitutional Court had to determine whether the constitutional rights that would shield them from surveillance by the BND apply in this matter. The court ruled that the fundamental rights are not limited to German territory, but rather apply wherever German state authority acts. This is derived from Art. 1 III of the German Constitution (GG), which binds German state authority to conformity with the Constitution. Accordingly, since the fundamental rights under Art. 10 I and Art. 5 I GG are not limited to Germans, the Constitutional Court extended their scope of application to foreigners in foreign countries, giving them international reach.

Current legislation is unconstitutional. The Constitutional Court further analysed the new intelligence law of 2016 and ruled it unconstitutional in its current state. The main reason is that, because the legislators assumed that the fundamental rights did not apply, they did not comply with the requirements the Constitution sets out for such a law. As a result, the new law violates the privacy of telecommunications under Art. 10 I GG and, in addition, does not meet key requirements deriving from other fundamental rights, such as Art. 19 I GG. However, the Constitutional Court stated that the law can be amended to respect fundamental rights and comply with the Constitution. The court set out several points that must be implemented in the amended law, some of which we present further below.

Independent oversight. The Constitutional Court stated that, in order to ensure conformity with the Constitution and to regulate the BND in a way that protects the fundamental rights of the people under surveillance, it is necessary to establish a new, independent oversight regime for strategic surveillance. Its main purposes would be legal oversight of the BND and protection of the surveillance subjects, as well as control of the surveillance process, from the analysis of data to the transfer of information between agencies.

Legislative suggestions. In its ruling, the Constitutional Court also made several suggestions regarding potential statutory regulation in order to regulate the BND and its field of action better than in the past. Among those suggestions were the necessity of defining the purpose of surveillance measures with precision and clarity in order to ensure transparency, and the necessity for the legislator to set out an essential framework for the analysis of the collected data, such as ceasing the analysis as soon as it becomes clear that the surveillance has touched the core area of private life. The court also suggested that special requirements must apply to the protection of professional groups whose communications require increased confidentiality, and that surveillance in these cases must be tied to qualified thresholds. The court further addressed the storage and deletion of surveillance data, stating that the traffic data obtained should not be stored for longer than six months and that a systematic deletion policy needs to be established. With regard to the transfer of information to other (foreign) intelligence agencies, the Constitutional Court made it clear that such transfers will need an official statutory basis in order to be lawful.

The court has given the German government until the end of 2021 to amend the law and make the statutory changes needed to comply with the ruling and with the decision on the international scope of the fundamental rights. While this may seem like a major setback for the BND, it is a chance to show that intelligence agencies can work to a high constitutional standard while also being successful in their purpose.

USA: Multi-Billion Dollar Class Action lawsuit against Google

4. June 2020

Google users in the USA accuse Google of tracking their surfing behaviour even when they use incognito mode. The complaint was filed with the federal court in San Jose, California on Tuesday, June 2nd, 2020.

The background of the lawsuit is the accusation by three Google users that “Google tracks and collects users’ browsing history and other information about web activity, regardless of what measures they take to protect it”. In other words, the users accuse Google of tracking their behaviour through Google Analytics, plug-ins or apps, evaluating it and using it for advertising, despite their use of incognito mode.

The complaint is based on a violation of US wiretapping laws and California privacy laws. Each plaintiff is claiming $5,000.00 in damages. Since the three plaintiffs seek to represent millions of further users, the volume of the lawsuit could run into the billions.

Google spokesman Jose Castaneda denies the allegations, pointing out that when opening an incognito tab in Chrome, users are informed that websites may still be able to collect information about their surfing behaviour. Incognito mode merely ensures that the browser and the device used do not store this data. He announced that Google would defend itself against the accusations.

Series on COVID-19 Contact Tracing Apps Part 3: Data Protection Issues

28. May 2020

In today’s blogpost, we will finish the miniseries on COVID-19 contact tracing apps with a final part on the data protection and privacy issues these apps create. As we presented in the first part of this series, different approaches to contact tracing apps are in use or under development in different countries. These approaches raise different data protection issues, some of which can, in the European Union, be mitigated by following data protection regulations and the guidelines published by the European Data Protection Board, which we presented in the second part of this series.

The data protection issues that arise with COVID-19 contact tracing apps, and their impact, depend heavily on the design of the apps and their APIs. However, there are common points that may cause privacy problems in all contact tracing apps due to the sensitivity of the data processed.

The biggest risks of contact tracing apps

While contact tracing apps have the potential to pose risks to data protection and their users’ privacy across all aspects of data protection, the following are the risks politicians, scientists and users are most worried about:

  • The risk of loss of trust
  • The risk of unauthorized access
  • The risk of processing too much data
  • The risk of abuse of the personal data collected

The risk of loss of trust: In order to work properly and reach the effectiveness necessary to contain the spread of the virus and break the chain of transmission, scientists and researchers have determined that at least 60% of a country’s population has to use a contact tracing app properly. For this to happen, user satisfaction and trust in the app and its use of their personal data have to remain high. Much of the research on the issue shares the concern that a lack of transparency in the development of the apps, as well as regarding the data they collect and process, might make the population sceptical and distrustful of the technologies being developed. The European Data Protection Board (EDPB) as well as the European Parliament have stated that, for contact tracing apps to be data protection compliant, their development as well as their processing of data need to be transparent throughout the entire use of the apps.

The risk of unauthorized access: While the risk that the apps and the data they process will be hacked is relatively low, there is a concern that in some cases unauthorized access may result in a major privacy issue. Especially in contact tracing apps that use GPS location data, as well as apps that use a centralized approach to storing the processed data, the risk of unauthorized access is higher because the information is readily available. With GPS data it is easily possible to track users’ movements, allowing for a very detailed analysis of their behaviour. Centralized storage keeps all the collected data in one cloud space, which in the case of a hacking incident may give easy access not only to information about social behaviour and health details but also, in conjunction with GPS tracking data, to easily identifiable user behaviour profiles. It has therefore been recommended to conduct a Data Protection Impact Assessment before launching the apps and to ensure high encryption standards. The Bluetooth method, in which phones exchange anonymized IDs that change every 15 minutes whenever they come closer than about 10 feet, has been recommended as the ideal technology for minimizing the collection of location data. Furthermore, most scientists and researchers recommend a decentralized storage method as better suited to protect users’ data and prevent damage, as this method only stores the information on the users’ devices instead of in a central cloud.
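
As a rough illustration of that rotation scheme, the sketch below shows how a device might derive a fresh random identifier every 15 minutes and keep only a short-lived local log of observed identifiers. The interval, the retention period and all function names are assumptions chosen for illustration, not the specification of any real framework.

```python
import secrets
import time

ROTATION_INTERVAL_S = 15 * 60        # a new identifier every 15 minutes
RETENTION_S = 14 * 24 * 60 * 60      # keep observed identifiers for 14 days

def current_rolling_id(state: dict) -> bytes:
    """Return the identifier for the current window, rotating it when the window expires."""
    now = time.time()
    if now - state.get("issued_at", 0) >= ROTATION_INTERVAL_S:
        state["id"] = secrets.token_bytes(16)   # fresh, unlinkable to the previous one
        state["issued_at"] = now
    return state["id"]

def record_contact(log: list, observed_id: bytes) -> None:
    """Store an observed identifier with a timestamp and purge entries older than 14 days."""
    now = time.time()
    log.append((observed_id, now))
    log[:] = [(i, t) for (i, t) in log if now - t <= RETENTION_S]
```

Because each identifier is random and short-lived, an observer who records the broadcasts cannot link them into a long-term movement profile, which is exactly the property that makes this approach preferable to GPS tracking.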

The risk of processing too much data: One of the major risks of contact tracing apps is the processing of too much data. This applies to apps that use GPS location tracking, that collect sensitive health data beyond the COVID-19 infection status, or that collect transactional information, contacts, and so on. In general, contact tracing apps should not require much information beyond the user’s contact information, since it is only necessary to log the other devices their device has come in contact with. However, some countries use contact tracing apps based on GPS location tracking instead of a Bluetooth exchange of IDs, in which case the location data and movements of the user are automatically recorded. Other countries, such as India, have launched apps in which additional health data is processed, as well as other information unnecessary for following up on contact tracing. Contact tracing apps should follow the principle of data minimisation in order to ensure that only the personal data necessary for the purpose of contact tracing is processed. That is also one of the important ground rules the EDPB has set out in its guideline on the subject. However, different countries have different data protection laws, which makes a unified approach to handling personal data difficult in cases like these.

The risk of abuse of the personal data collected: One of the biggest fears of scientists and users regarding contact tracing apps is the potential abuse of the personal data collected once the pandemic is over. Especially with centralized storage, there are already apps that give the government access to the data, for example in India, Hong Kong and Singapore. A majority of critics are demanding regulation to ensure that the data cannot be used after the pandemic is over and the need for the apps has ceased. This risk is particularly high for tracing apps that locate the user through GPS location tracking rather than Bluetooth technology, since the devices’ movements lead to very detailed and easily analysable movement profiles of the users. This potential risk is one of the most prominent concerns regarding the joint Apple and Google contact tracing API, as both companies have faced severe data protection issues in the past. However, both companies have stated that they plan to completely discontinue the API once the pandemic is over, which would disable the apps built on it. Since the Bluetooth approach they are using stores the data on users’ devices, the data will be locked and inaccessible once the API can no longer read it. But there are still many other countries with their own APIs and apps, which may lead to a risk of government surveillance and even abuse by foreign powers. For Europe, the EDPB and the European Parliament have clearly stated that the data must be deleted and the apps dismantled once they are no longer necessary, as the purpose and legal basis for the processing will no longer apply once the pandemic is under control.

The bottom line

Needless to say, the pandemic has driven the need for new technologies and approaches to handle the spread of the virus. In a modern world, however, this brings risks to the personal data used to contain the pandemic and break the chain of transmission, especially because it is not only a nationwide but also an international effort. Users should keep in mind that their right to privacy is not entirely overridden by the public interest in containing the virus. To keep the balance, it is important that contact tracing apps face criticism and are developed in a way that complies with data protection regulations, in order to minimize the potential risks that come with the new technology. This is the only way to ensure that people’s personal freedom and private life can continue without taking a heavy toll from the potential attacks that could result from these risks. Transparency is the bottom line in these projects: it ensures that regulations are met and that people’s trust is maintained, which is needed for the tracing apps to reach the effectiveness required to fulfil their purpose.

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regards to data retention mandates, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also mentions that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server, where matches are made with other contacts. Under the decentralised version, the data is kept on the user’s mobile device, giving users more control over their data. The EDPB does not recommend either approach; instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals’ rights.
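
The difference between the two approaches can be sketched in a few lines of Python. The class and function names below are purely illustrative assumptions, not the API of any real framework: in the decentralised variant the device downloads the published identifiers of confirmed cases and performs the match itself, while in the centralised variant the device uploads its contact log to a server that performs the match.

```python
# Decentralised: the contact log never leaves the device; the device downloads
# the published identifiers of confirmed cases and checks them locally.
def decentralised_check(local_contact_log: set, published_case_ids: set) -> bool:
    return bool(local_contact_log & published_case_ids)

# Centralised: the device uploads its (pseudonymised) contact log to a server,
# which holds the identifiers of confirmed cases and performs the matching.
class CentralServer:
    def __init__(self) -> None:
        self.case_ids: set = set()

    def report_case(self, ids: set) -> None:
        self.case_ids |= ids

    def check_uploaded_log(self, uploaded_contact_log: set) -> bool:
        return bool(uploaded_contact_log & self.case_ids)
```

The privacy trade-off the EDPB describes is visible directly in the sketch: in the decentralised variant only the list of confirmed-case identifiers is distributed, whereas the centralised variant requires every user’s contact log to be uploaded.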

Before implementing contact tracing apps, the EDPB also suggests that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is considered likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of new technological solutions). Furthermore, it strongly recommends publishing the DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

easyJet Data Breach: 9 million customers affected

22. May 2020

The British airline ‘easyJet’ has been hacked. The hackers have been able to access personal data of approximately 9 million customers.

easyJet published a statement on the attack and announced that e-mail addresses and travel details were among the affected personal data of customers. Which personal data exactly count as ‘travel details’ was not disclosed. In some cases, the hackers were also able to access credit card data. easyJet stated that there is no evidence that the accessed personal data has been misused. easyJet now warns about fake e-mails sent in its name as well as in the name of ‘easyJet Holidays’.

The hack was noticed by easyJet in January but was only made public this week. Upon becoming aware of the attack, easyJet took several measures and has since blocked the unauthorized access. easyJet is also in contact with the British data protection authority, the ICO, and the National Cyber Security Centre.

At this time, easyJet has not yet been able to determine how the attack occurred, but it explained that this was not a ‘general’ hacker attack, since it was very sophisticated compared to other attacks. It is suspected that the attack originated from a group that has already hacked other airlines, such as British Airways in 2018.

easyJet announced that it will contact the affected data subjects by May 26th to inform them about the breach and to explain further measures that should be taken in order to reduce the risk. easyJet customers who do not receive a notification by then are not affected by the breach.

In connection with hacker attacks like these, the risk of phishing attacks is particularly high. In phishing attacks, criminals use fake e-mails, for example on behalf of well-known companies or authorities, to try to persuade users to pass on personal data or to click on prepared e-mail attachments containing malware.
