Category: COVID-19-Virus

Data Breach made 136,000 COVID-19 test results publicly accessible

18. March 2021

Personal health data are considered a special category of personal data under Art. 9 of the GDPR and are therefore given special protection. A group of IT experts, including members of the German Chaos Computer Club (CCC), has now revealed security gaps in test centre software through which more than 136,000 COVID-19 test results of more than 80,000 data subjects had apparently been accessible unprotected on the internet for weeks.

The IT security experts’ findings concern the software “SafePlay” of the Austrian company Medicus AI. Many test centres use this software to allocate appointments and to make test results digitally available to those tested. In fact, more than 100 test centres and mobile test teams in Germany and Austria are affected by the data breach, including public facilities in Munich, Berlin and Mannheim as well as fixed and temporary testing stations in companies, schools and daycare centres.

To view the test results unlawfully, one only needed to create an account for a COVID-19 test. The URL for the test result contained the sequential number of the test: by simply counting this number up or down, the “test certificates” of other people became freely accessible. In addition to the test result, each certificate also contained the name, date of birth, private address, nationality and ID number of the person concerned.
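The flaw described here is a classic insecure direct object reference (IDOR). A minimal sketch of the difference between guessable sequential identifiers and unguessable random tokens (all URLs and names below are hypothetical, not Medicus AI’s actual scheme):

```python
import secrets

# Vulnerable pattern: sequential test numbers make every result URL guessable.
def result_url_sequential(test_number: int) -> str:
    return f"https://testcentre.example/results/{test_number}"

# Knowing the number in one's own URL, an attacker can simply enumerate others:
neighbours = [result_url_sequential(n) for n in range(41370, 41373)]

# Safer pattern: a long random token per result, infeasible to guess.
def result_url_token() -> str:
    token = secrets.token_urlsafe(32)  # ~256 bits of randomness
    return f"https://testcentre.example/results/{token}"
```

Unguessable URLs alone are not a complete fix; the server should additionally check that the requesting account is actually authorized to see the result in question.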

It remains unclear whether the vulnerabilities were exploited before their discovery by the CCC. The CCC notified both Medicus AI and the data protection authorities about the leak, which led to a quick response by the company. Nevertheless, IT experts and privacy-focused NGOs commented that Medicus AI had acted irresponsibly and with gross negligence with respect to its security measures, leading to the potential disclosure of an enormous amount of sensitive personal health data.

Dutch data scandal: illegal trade of COVID-19 patient data

19. February 2021

In recent months, RTL Nieuws reporter Daniël Verlaan discovered widespread trade in the personal data of Dutch COVID-19 test subjects. He found ads consisting of photos of computer screens listing data of Dutch citizens. The data had apparently been offered for sale on various instant messaging apps such as Telegram, Snapchat and Wickr, at prices ranging from €30 to €50 per person. The data included home addresses, email addresses, telephone numbers, dates of birth and BSN identifiers (the Dutch social security number).

The personal data were registered in the two main IT systems of the Dutch Municipal Health Service (GGD) – CoronIT, containing details about citizens who took a COVID-19 test, and HPzone Light, a contact-tracing system, which contains the personal data of people infected with the coronavirus.

After becoming aware of the illegal trade, the GGD reported it to the Dutch Data Protection Authority and the police. The cybercrime team of the Midden-Nederland police immediately started an investigation. It showed that at least two GGD employees had maliciously stolen the data, as they had access to the official Dutch government COVID-19 systems and databases. Within 24 hours of the complaint, two men were arrested. Several days later, a third suspect was tracked down as well. The investigation continues, since the extent of the data theft remains unclear, as does whether the suspects actually managed to sell the data. Further arrests are therefore not ruled out.

Chair of the Dutch Institute for Vulnerability Disclosure, Victor Gevers, told ZDNet in an interview:

“Because people are working from home, they can easily take photos of their screens. This is one of the issues when your administrative staff is working from home.”

Many people expressed their disapproval of the insufficient security measures concerning the COVID-19 systems. Since the databases include very sensitive data, the government has a duty to protect these properly in order to prevent criminal misuse. People must be able to rely on their personal data being treated confidentially.

In a press release, the Dutch police also raised awareness of cybercrime risks such as scams or identity fraud. Moreover, they pointed out ways to protect oneself against such crimes and stressed the need to report them. Reporting helps prevent further victims and allows the police to track down suspects quickly and stop their criminal practices.

16 million Brazilian COVID-19 patients’ personal data exposed online

7. December 2020

In November 2020, personal and sensitive health data of about 16 million Brazilian COVID-19 patients were leaked on the online platform GitHub. The cause was a hospital employee who uploaded a spreadsheet with usernames, passwords, and access keys to sensitive government systems to the platform. Those affected included Brazilian President Jair Bolsonaro and his family as well as seven ministers and 17 provincial governors.

Among the exposed systems were two government databases used to store information on COVID-19 patients. The first, “E-SUS-VE”, was used for recording COVID-19 patients with mild symptoms, while the second, “Sivep-Gripe”, was used to keep track of hospitalized cases across the country.

Both systems contained highly sensitive personal information such as patient names, addresses, telephone numbers and individual taxpayer ID numbers, as well as healthcare records such as medical histories and medication regimes.

The leak was discovered after a GitHub user spotted the spreadsheet containing the password information on the personal GitHub account of an employee of the Albert Einstein Hospital in São Paulo. The user informed the Brazilian newspaper Estadão, which analysed the information shared on the platform before notifying the hospital and the health ministry of Brazil.

The spreadsheet was ultimately removed from GitHub, and government officials changed passwords and revoked access keys to secure their systems after the leak.

However, Estadão reporters confirmed that the leaked data included personal data of Brazilians across all 27 states.

Admonition for revealing a list of people quarantined in Poland

27. November 2020

The President of the Personal Data Protection Office in Poland (UODO) imposed an admonition on a waste management company liable for a data breach and ordered it to notify the data subjects concerned. The admonition is based on a breach of the personal data of data subjects under medical quarantine: the city name, street name, building/flat number and quarantine status of the affected data subjects had been provided by the company to unauthorized recipients. The recipients were supposed to verify whether, in a given period, waste was to be collected from the places on the above-mentioned list.

The incident itself occurred in April 2020. Back then, a list of data subjects was made public containing information on who had been quarantined by administrative decision of the District Sanitary-Epidemiological Station (PPIS) in Gniezno, as well as information on data subjects quarantined in connection with crossing the country’s border and on data subjects in home isolation due to a confirmed SARS-CoV-2 infection. After becoming aware of the disclosure, the Director of PPIS notified the relevant authorities – the District Prosecutor’s Office and the President of UODO – about the incident.

PPIS informed them that its explanatory activities had shown that PPIS itself was not the source of the disclosure. The data had been provided to the District Police Headquarters, the Head of the Polish Post Office, Social Welfare Centres and the Headquarters of the State Fire Service. Given that the data had been processed by these various parties, it was necessary to establish where the breach might have occurred.

UODO took steps to clarify the situation. In the course of the proceedings, it requested information from the waste management company as one of the recipients of the personal data. The company, acting as data controller, had to explain whether, when establishing its procedures for processing personal data, it had carried out an assessment of the impact of the envisaged processing operations on the protection of personal data according to Art. 35 GDPR. Such an assessment consists of an analysis of the distribution method, in electronic and paper form, in terms of the risks of loss of confidentiality. Furthermore, the data controller had to inform UODO about the result of this analysis.

The data controller stated that it had conducted an analysis considering the circumstances related to non-compliance with the procedures in force by data processors and circumstances related to theft or removal of data. Moreover, the data controller took the view that the list received from the District Police Headquarters only included administrative (police) addresses and did not contain names, surnames or other data allowing the identification of a natural person; the GDPR would therefore not apply, because the data had to be seen as anonymized. However, the list also revealed that the residents of these buildings/flats had been placed in quarantine, which made it possible to identify them. It emerged that the confidentiality of the processed data had been violated in the course of the performance of employee duties at the data processor, where an employee had left the printed list on a desk without proper supervision. During this time, another employee had photographed the list and shared the photo with another person.

Following the review of all the material collected in this case, UODO considered that the information regarding the city name, street name, building/flat number and the placing of a data subject in medical quarantine constitutes personal data within the meaning of Art. 4 (1) GDPR, while the latter constitutes a special category of personal data concerning health according to Art. 9 (1) GDPR. On this basis, it is possible to identify the data subjects, and the data controller is therefore bound by the obligations arising from the GDPR.

In the opinion of UODO, the protective measures indicated in the risk analysis are general formulations that do not refer to specific activities undertaken by authorized employees. The measures are insufficient and inadequate in view of the risks involved in processing special categories of data. In addition, the data controller should have considered factors such as recklessness and carelessness of employees and a lack of due diligence.

According to Art. 33 (1) GDPR, the data controller shall, without undue delay and, where feasible, not later than 72 hours after having become aware of a data breach, notify it to the competent supervisory authority. Moreover, where the breach results in a high risk to the rights and freedoms of the data subjects (which undoubtedly arose from this disclosure), the data controller is obliged to inform the data subjects without undue delay in accordance with Art. 34 (1) GDPR. Despite this, the company reported the infringement neither to the President of UODO nor to the data subjects concerned.

First judicial application of Schrems II in France

20. October 2020

France’s highest administrative court (Conseil d’État) issued a summary judgment that rejected a request for the suspension of France’s centralized health data platform – Health Data Hub (HDH) – on October 13th, 2020. The Conseil d’État further recognized that there is a risk of U.S. intelligence services requesting the data and called for additional guarantees.

For background, France’s HDH is a data hub intended to consolidate all health data of people receiving medical care in France in order to facilitate data sharing and promote medical research. The French Government initially chose to partner with Microsoft and its cloud platform Azure, and on April 15th, 2020, the HDH signed a contract with Microsoft’s Irish affiliate to host the health data in data centers in the EU. On September 28th, 2020, several associations, unions and individual applicants appealed to the summary proceedings judge of the Conseil d’État, asking for the suspension of the processing of health data related to the COVID-19 pandemic in the HDH. The concern was that the hosting of data by a company subject to U.S. law entails data protection risks due to potential surveillance under U.S. national surveillance laws, as highlighted in the Schrems II case.

On October 8th, 2020, the Commission Nationale de l’Informatique et des Libertés (CNIL) submitted comments on the summary proceeding before the Conseil d’État. The CNIL considered that, despite all of the technical measures implemented by Microsoft (including data encryption), Microsoft could still be able to access the data it processes on behalf of the HDH and could in theory be subject to requests from U.S. intelligence services under FISA (or even EO 12333) that would require Microsoft to transfer personal data stored and processed in the EU.
Further, the CNIL recognized that the Court of Justice of the European Union (CJEU) in the Schrems II case only examined the situation where an operator transfers, on its own initiative, personal data to the U.S. However, according to the CNIL, the reasons for the CJEU’s decision also require examining the lawfulness of a situation in which an operator processes personal data in the EU but faces the possibility of having to transfer the data following an administrative or judicial order or request from U.S. intelligence services, which was not clearly stated in the Schrems II ruling. In that case, the CNIL considered that U.S. laws (FISA and EO 12333) also apply to personal data stored outside of the U.S.

In its decision, the Conseil d’État agreed with the CNIL that it cannot be totally discounted that U.S. public authorities could request Microsoft and its Irish affiliate to access some of the data held in the HDH. However, the summary proceedings judge did not consider the CJEU’s ruling in the Schrems II case to also require examination of the conditions under which personal data may be processed in the EU by U.S. companies or their affiliates as data processors; EU law does not prohibit subcontracting U.S. companies to process personal data in the EU. In addition, the Conseil d’État considered the alleged violation of the GDPR purely hypothetical, because it presupposes that U.S. authorities are interested in accessing the health data held in the HDH. Further, the summary proceedings judge noted that the health data is pseudonymized before being shared within the HDH and is then further encrypted by Microsoft.

In the end, the judge highlighted that, in light of the COVID-19 pandemic, there is an important public interest in continuing the processing of health data as enabled by the HDH. The Conseil d’État concluded that there is no adequate justification for suspending the data processing activities conducted by the HDH, but the judge ordered the HDH to work with Microsoft to further strengthen privacy rights.

Contact Tracing Apps: U.K. Update and EDPB Interoperability Statement

23. June 2020

In another update on contact tracing apps, we look at the new path of contact tracing in the United Kingdom (UK), as well as the European Data Protection Board’s (EDPB) statement on the cross-border interoperability of the contact tracing apps being deployed in the European Union.

UK Contact Tracing App Update

Since starting the field tests of the NHS COVID-19 App on the Isle of Wight, the UK government has decided to change its approach to the contact tracing model: it has abandoned the centralized app model in favour of the decentralized Google/Apple alternative.

The change was brought on by technical issues and privacy challenges which surfaced during the trial period on the Isle of Wight, which were direct consequences of the centralized model and important enough to motivate the change of approach.

The technical problems included issues with background Bluetooth access as well as operational problems with regard to cross-border interoperability. Further, the data protection risks of mission creep and a lack of transparency only hastened the abandonment of the centralized app.

The new model is widely used throughout the European Union and provides better data protection as well as better technical support. Its only deficit compared with the centralized model is that epidemiologists get less access to data, a trade-off the UK government seems willing to accept in exchange for improved data protection and technical compatibility.

EDPB statement on cross-border interoperability

On June 17th, 2020, the EDPB released a statement on the cross-border interoperability of contact tracing apps. The statement builds on the EDPB Guidelines 04/2020 on data protection aspects of contact tracing apps, emphasising the importance of the issues presented there.

The statement stems from an agreement between the EU Member States and the European Commission, formed in May 2020, on the basic guidelines for cross-border interoperability of contact tracing apps, as well as the newly settled technical specifications for achieving such interoperability.

The EDPB sets out key aspects that must be kept in mind throughout the project, namely transparency, legal basis, controllership, data subjects’ rights, and data retention and minimisation rules.

Further, the statement emphasises that the sharing of data about individuals who have been diagnosed or have tested positive should only be triggered by a voluntary action of the users themselves. Above all, the goal of interoperability should not be used as an argument to extend the collection of personal data further than necessary.

Overall, this type of sharing of personal data can pose an increased risk to the personal data of the users, which is why it must be ensured that the principles set down by the GDPR are upheld and that no less intrusive method is available for the matter at hand.

Thailand postpones Enforcement of new Personal Data Protection Act

22. June 2020

In response to the European General Data Protection Regulation (“GDPR”) becoming applicable in 2018, Thailand adopted its first-ever Personal Data Protection Act (“PDPA”) into law on 28 May 2019. As it is fashioned after the GDPR, the PDPA is built around principles that largely align with the GDPR, especially in the areas of data protection principles, legal bases, and data subject rights. Originally, the PDPA was to become applicable one year after its adoption, on 27 May 2020.

Now, the Thai Government has approved a draft decree by the Ministry of Digital Economy and Society (“MDES”) postponing the enforcement of most sections of the PDPA to 31 May 2021. The MDES explained that the reasons for the delay are the current corona pandemic and its strain on businesses, as well as many businesses not being prepared for PDPA compliance. Notably, Brazil also postponed the enforcement of its new Data Protection Law (“LGPD”) for similar reasons (we reported).

The only sections of the PDPA that will be enforced as originally planned concern the appointment of the Personal Data Protection Committee members and the establishment of the Office of the Personal Data Protection Committee. Whilst the delay allows companies more time to become PDPA compliant, the lack of enforcement of data subject rights in the meantime is a major concern of critics, especially in light of the recent adoption of Thailand’s controversial new cybersecurity law.

Germany Update: COVID-19 Tracing App launched in mid-June

18. June 2020

On June 16th, 2020, Germany introduced its new COVID-19 tracing app, called “Corona-Warn-App”, and released it for download. Within the first day, over six million citizens downloaded the app, and the government hopes to see that number grow, as the method becomes more effective with wider adoption.

As an open-source project from the start, with unhindered access to the programming code, the developers were able to work on security and data protection issues throughout the seven weeks of development, as well as keep the entire process transparent to future users.

Overall, the first impressions on the data protection side have been good, with the Federal Data Protection Commissioner (Bundesdatenschutzbeauftragter) Ulrich Kelber telling the Saarbrücker Zeitung that the app “gives a solid impression”, though he would have liked to “have seen a Data Protection Impact Assessment before the launch”.

The data protection aspects

The German contact tracing app claims to place the highest importance on data protection and on transparency, so that users know what happens with their data.

Upon download, the app offers the chance to read through a thorough privacy policy, giving users all the information necessary to understand and consent to the use of their data. In effect, the personal data collected and stored remain minimal: the consent to the use of the Exposure Notification Framework, TANs for test verification, and the consent to a daily upload of the diagnosis keys, which are stored for only 14 days.

The app, developed by SAP and Telekom, uses Bluetooth technology to judge exposure based on two criteria: the distance between two smartphones and the duration of the encounter. If the threshold requirements of those two criteria are met, the phones exchange random key codes, which are stored for 14 days on the devices and checked there against positive test results. The app then tells you whether your exposure is low or high risk and gives you suggestions on how to act based on that risk level. Thanks to this procedure, there is no need to collect personal information about the identity of the person. In particular, exposure notifications are not delivered in real time, making it impossible to reliably identify the coronavirus-positive person who was encountered.
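The matching logic described above can be sketched roughly as follows; the thresholds, key format and function names are illustrative assumptions, not the Corona-Warn-App’s actual parameters:

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    key: str           # random rolling key received from the other phone
    distance_m: float  # estimated from Bluetooth signal strength
    duration_min: int  # how long the phones stayed in range

# Illustrative thresholds: close enough and long enough to matter.
MAX_DISTANCE_M = 2.0
MIN_DURATION_MIN = 15

def is_relevant(e: Encounter) -> bool:
    return e.distance_m <= MAX_DISTANCE_M and e.duration_min >= MIN_DURATION_MIN

def local_risk_check(encounters, positive_keys):
    # Matching happens on the device: the phone downloads only the published
    # keys of confirmed-positive users; its own contact log never leaves it.
    hits = [e for e in encounters if is_relevant(e) and e.key in positive_keys]
    return "high risk" if hits else "low risk"

log = [Encounter("k1", 1.2, 20), Encounter("k2", 5.0, 45)]
print(local_risk_check(log, {"k1"}))  # prints "high risk"; "k2" was too far away
```

The design point this illustrates is that only the anonymous keys of positive users travel over the network, while the sensitive contact history stays on the phone.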

Furthermore, the app puts an emphasis not only on anonymity but also on voluntariness. Whether and how to use the app is entirely up to the user. The user may disable the Exposure Notification Framework and decide for themselves whether to share the results of a test with the app. This comes, of course, with limitations to the effectiveness of the app, but it gives users more control over the data they share.

One of the current deficits is that, due to a lack of hardware systems, the testing laboratories cannot yet verify test results via users scanning a QR code, as originally planned. In the meantime, a notification hotline has been set up, although this raises data protection concerns because it could be taken advantage of or abused.

Lastly, one of the big data protection aspects, which has caused a major stir in the debate over tracing apps, is the storage of the information. The Corona-Warn-App stores users’ data in a decentralized manner, meaning there is no direct upload to a cloud; instead, the entire process happens on the users’ devices. This shields against potential misuse of the data by parties involved in the development as well as by the government, and was recommended by the European Parliament and the European Data Protection Board as the safer storage option for this type of contact tracing app.

Overview

While the app has only been live for a few days, it has received a lot of praise for the way it handles the various data protection and IT security problems. It remains to be seen whether the 60% of citizens considered necessary for maximum effectiveness can be mobilized to use the contact tracing app.

Future plans involve cross-border cooperation with other countries and their own contact tracing apps in order to ensure the practicability and effectiveness of these apps, as containing the pandemic is an international venture.

Overall, the Corona-Warn-App seems to be a decent product despite its hurried development. However, this is only the beginning for the contact tracing app, and it remains to be seen how the developers incorporate fixes for upcoming problems.

Hungary Update: EDPB publishes Statement on Art. 23 GDPR

17. June 2020

Since March 2020, Hungary has been in a “state of emergency” following the COVID-19 pandemic. The country’s COVID-19-related emergency laws and state of emergency received worldwide criticism from constitutional experts, politicians and civil rights groups, because they allow the Prime Minister to rule by decree during the state of emergency and do not provide a predefined end date. During the state of emergency, Prime Minister Viktor Orbán made extensive use of his newly gained powers by passing more than a hundred decrees, including Decree No. 179/2020, which suspended the GDPR data subject rights in Art. 15-22 GDPR with respect to personal data processing for the purpose of preventing, understanding and detecting the coronavirus disease and impeding its further spread (we reported).

In response to this suspension of GDPR rights, the European Data Protection Board (“EDPB”) has recently published a Statement on restrictions on data subject rights pursuant to Art. 23 GDPR, which is the provision that Hungary’s measure was based on. This article allows the member states to restrict, by way of a legislative measure, the scope of the obligations and rights provided for in Articles 12 to 22 and Article 34, when such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard, inter alia, important objectives of general public interest of the Union or of a Member State such as public health.

In its Statement, the EDPB points out that any restriction must respect the essence of the right that is being restricted. If the essence of the right is compromised, the restriction must be considered unlawful. Since the data subject’s right of access and right to rectification are fundamental rights under Art. 8 para. 2 of the Charter of Fundamental Rights of the European Union, any restriction of those rights must be carefully weighed by the member states in order to respect the essence of the rights. The EDPB considers that restrictions adopted in the context of a state of emergency, suspending or postponing the application of data subject rights without any clear limitation in time, equate to a de facto blanket suspension and denial of those rights and are not compatible with the essence of the fundamental rights and freedoms.

The EDPB also recalls that the restrictions under Art. 23 GDPR must be necessary and proportionate. It argues that restrictions that are imposed for a duration not precisely limited in time or which apply retroactively or are subject to undefined conditions, are not foreseeable to data subjects and thus disproportionate.

Furthermore, the EDPB takes the view that in order to safeguard important objectives of general public interest such as public health (Art. 23 para. 1 lit. e GDPR), there must be a clearly established and demonstrated link between the foreseen restrictions and the objective pursued. The mere existence of a pandemic or any other emergency situation alone does not justify a restriction of data subject rights, especially if it is not clearly established how the restrictions help to deal with the emergency.

Following the international public backlash, the Parliament of Hungary passed legislation on 16 June 2020 to revoke the emergency laws as soon as the current state of emergency is terminated by the Government. Hungary’s Government announced in May that it intends to lift the state of emergency on 20 June 2020. After that, the restrictions on the GDPR rights shall be lifted as well, so that data subjects may exercise their rights under Art. 15-22 GDPR again.

Series on COVID-19 Contact Tracing Apps Part 3: Data Protection Issues

28. May 2020

In today’s blogpost, we finish the miniseries on COVID-19 contact tracing apps with a final part on the data protection and privacy issues they create for their users. As presented in the first part of this series, different approaches to contact tracing apps are in use or under development in different countries. Each approach raises different data protection issues, some of which can, in the European Union, be mitigated by following data protection regulations and the guidelines published by the European Data Protection Board, which we presented in the second part of this series.

The data protection issues that arise with COVID-19 contact tracing apps, and their impact, depend heavily on the design of the apps and APIs used. However, there are common points that can cause privacy problems across all contact tracing apps due to the sensitivity of the data processed.

The biggest risks of contact tracing apps

While contact tracing apps can pose risks to data protection and their users’ privacy across all aspects of data protection, the following are the risks politicians, scientists and users are most worried about:

  • The risk of loss of trust
  • The risk of unauthorized access
  • The risk of processing too much data
  • The risk of abuse of the personal data collected

The risk of loss of trust: To work properly and reach the effectiveness necessary to contain the spread of the virus and break the chain of transmission, scientists and researchers estimate that at least 60% of a country’s population has to use a contact tracing app properly. For this to happen, user satisfaction and trust in the app and its use of personal data have to remain high. Much of the research on the issue shares the concern that a lack of transparency in the development of the apps, as well as in the data they collect and process, might make the population sceptical and distrustful of the technologies being developed. The European Data Protection Board (EDPB) as well as the European Parliament have stated that, for contact tracing apps to be data protection compliant, their development as well as their processing of data need to be transparent throughout the entire period of the apps’ use.

The risk of unauthorized access: While the risk that the apps and the data they process can be hacked is relatively low, there is a concern that in some cases unauthorized access may result in a major privacy issue. Especially for contact tracing apps that use GPS location data, and for apps that use a centralized approach to storing the processed data, the risk of unauthorized access is higher because the information is readily available. With GPS data, it is easy to track users’ movements, allowing their behaviour to be analysed in great detail. Centralized storage keeps all the collected data in one cloud space, which in the event of a hacking incident may give easy access not only to information about social behaviour and health details but also, combined with GPS tracking data, to an easily identifiable analysis of user behaviour. It has therefore been recommended to conduct a Data Protection Impact Assessment before launching the apps and to ensure that encryption standards are high. The Bluetooth method, in which phones ping each other anonymized IDs that change every 15 minutes when they come closer than 10 feet, has been recommended as the ideal technology to minimize the collection of location data. Furthermore, most scientists and researchers recommend a decentralized storage method as better suited to protect users’ data, as it only stores the information on the users’ devices instead of in a central cloud.
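The rotating-ID scheme mentioned above can be modelled in a few lines. This is a simplified sketch: the real Google/Apple Exposure Notification protocol derives its rolling proximity identifiers cryptographically from a daily key using AES, not the hash shown here, and the names below are illustrative:

```python
import hashlib
import secrets

INTERVALS_PER_DAY = 96  # a new broadcast ID every 15 minutes

def new_daily_key() -> bytes:
    # Random per-day secret that never leaves the device
    # unless the user reports a positive test.
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> str:
    # Derive the ID broadcast during one 15-minute interval; without the
    # day key, consecutive IDs cannot be linked to the same device.
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).hexdigest()[:16]

key = new_daily_key()
ids = [rolling_id(key, i) for i in range(INTERVALS_PER_DAY)]
```

A phone that later receives the published day key of an infected user can regenerate all 96 IDs for that day and compare them against its own contact log, which is how matching stays on the device.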

The risk of processing too much data: One of the big risks of contact tracing apps is the processing of more data than necessary: GPS location tracking, the collection of sensitive health data beyond the COVID-19 infection status, transactional information, contact lists, and so on. In general, a contact tracing app should not require much more than the user’s contact information, since it only needs to log the other devices its host device has come into contact with. However, some countries use contact tracing apps based on GPS location tracking instead of a Bluetooth exchange of IDs, in which case the location data and movements of the user are automatically recorded. Other countries, for example India, have launched apps that process additional health data as well as other information unnecessary for following up on the contact tracing. Contact tracing apps should follow the principle of data minimisation to ensure that only the personal data necessary for their purpose are processed; this is also one of the important ground rules the EDPB has set out in its guidelines on the subject. However, different countries have different data protection laws, which makes a unified approach to handling personal data difficult in cases like these.
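Data minimisation can be made concrete with a small sketch. The record type below is hypothetical and only illustrates the principle: a decentralized, Bluetooth-based app needs to store little more than which anonymized ID was observed and roughly when, and everything else the paragraph above criticises (name, location, extra health data) is deliberately absent.

```python
from dataclasses import dataclass, fields


@dataclass(frozen=True)
class ContactEvent:
    """Minimal contact log entry kept on the user's own device.

    Deliberately absent: name, address, GPS coordinates, health
    details. None of these are needed to decide whether an exposure
    notification is due after another user reports a positive test."""

    observed_id: bytes  # the other device's rotating anonymized ID
    day: int            # coarse day number; exact time and place are not stored
```

Keeping the schema this narrow means a breach of the contact log reveals only opaque IDs and coarse dates, which is the practical payoff of the data minimisation principle.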

The risk of abuse of the personal data collected: One of the biggest fears of scientists and users regarding contact tracing apps is that the personal data collected could be abused once the pandemic is over. Especially with centralized storage, some apps already give governments access to the data, as in India, Hong Kong and Singapore. Many critics demand regulation ensuring that the data cannot be used after the pandemic is over and the need for the apps has ceased. The risk is particularly high for tracing apps that locate the user through GPS rather than Bluetooth, since the recorded device movements allow very detailed and easily analysed tracking of the users. This is one of the most prominent concerns regarding the Apple and Google project for a joint contact tracing API, as both companies have faced severe data protection issues in the past. However, both companies have stated that they plan to discontinue the API entirely once the pandemic is over, which would stop the apps built on it from working. Since the Bluetooth approach they use stores the data on users’ devices, the data will remain locked and inaccessible once the API can no longer read it. But many other countries run their own APIs and apps, which carries a risk of government surveillance and even abuse by foreign powers. For Europe, the EDPB and the European Parliament have clearly stated that the data must be deleted and the apps dismantled once they are no longer necessary, as the purpose and legal basis for the processing will no longer apply once the pandemic is under control.

The bottom line

Needless to say, the pandemic has driven the need for new technologies and approaches to handle the spread of viruses. In a modern world, however, this brings risks to the personal data used to contain the pandemic and break the chain of transmission, especially since the effort is not only nationwide but international. Users should keep in mind that their right to privacy is not entirely overridden by the public interest in containing the virus. To keep that balance, contact tracing apps must face criticism and be developed in compliance with data protection regulations so as to minimize the potential risks that come with the new technology. That is the only way to ensure that people’s personal freedom and private life can continue without taking a heavy toll from the attacks these risks could enable. Transparency is the bottom line in these projects: it ensures that regulations are met and that people’s trust is maintained, which in turn is needed for the tracing apps to reach the effectiveness required to fulfil their purpose.
