Category: Personal Data

Series on COVID-19 Contact Tracing Apps Part 3: Data Protection Issues

28. May 2020

In today’s blogpost, we will finish the miniseries on COVID-19 contact tracing apps with a final part on the data protection and privacy issues these apps create. As we presented in the first part of this series, different approaches to contact tracing apps are in use or under development in different countries. These approaches raise different data protection issues, some of which can, in the European Union, be mitigated by following data protection regulations and the guidelines the European Data Protection Board has published, which we presented in the second part of this series.

The data protection issues arising from COVID-19 contact tracing apps, and their impact, depend heavily on the technical design of each app. However, due to the sensitivity of the data processed, there are common points that can cause privacy problems across all contact tracing apps.

The biggest risks of contact tracing apps

While contact tracing apps have the potential to pose risks across all aspects of data protection and their users’ privacy, the following are the risks politicians, scientists and users are most worried about:

  • The risk of loss of trust
  • The risk of unauthorized access
  • The risk of processing too much data
  • The risk of abuse of the personal data collected

The risk of loss of trust: In order to work properly and reach the effectiveness necessary to contain the spread of the virus and break the chain of transmission, scientists and researchers estimate that at least 60% of a country’s population has to use the contact tracing apps properly. For this to happen, user satisfaction and trust in the app and its use of their personal data have to remain high. Much of the research on the issue shares the concern that a lack of transparency in the development of the apps, as well as in the data they collect and process, might make the population sceptical and distrustful of the technologies being developed. The European Data Protection Board (EDPB) as well as the European Parliament have stated that in order for contact tracing apps to be data protection compliant, their development as well as their processing of data need to be transparent throughout the entire use of the apps.

The risk of unauthorized access: While the risk that the apps and the data they process can be hacked is relatively low, there is the concern that in some cases unauthorized access may result in a major privacy issue. Especially in contact tracing apps that use GPS location data, as well as apps that use a centralized approach to storing the data processed, the risk of unauthorized access is higher because the information is readily available. In the case of GPS data, it is easily possible to track users’ movements, allowing for a very detailed analysis of their behaviour. Centralized storage keeps all the collected data in one cloud space, which in the case of a hacking incident may expose not only information about social behaviour and health details, but, if combined with GPS tracking data, also an easily identifiable analysis of user behaviour. It has therefore been recommended to conduct a Data Protection Impact Assessment before launching the apps and to ensure that encryption standards are high. The Bluetooth method, in which phones exchange anonymized IDs that change every 15 minutes whenever they come within roughly 10 feet of each other, has been recommended as the ideal technology to minimize the collection of location data. Furthermore, most scientists and researchers recommend a decentralized storage method as better suited to protect users’ data, as this method stores the information only on the users’ devices instead of in a central cloud.

The risk of processing too much data: One of the big risks of contact tracing apps is the processing of too much data. This issue can arise from GPS location tracking, from collecting sensitive health data beyond the COVID-19 infection status, or from gathering transactional information, contacts, and the like. In general, contact tracing apps should not require much information beyond the user’s contact details, since it is only necessary to log the other devices their device has come into contact with. However, some countries use contact tracing apps based on GPS location tracking instead of a Bluetooth exchange of IDs, in which case the location data and movements of the user are automatically recorded. Other countries, for example India, have launched apps that process additional health data as well as other information unnecessary for following up on contact tracing. Contact tracing apps should follow the principle of data minimisation in order to ensure that only personal data necessary for the purpose of contact tracing is processed. That is also one of the important ground rules the EDPB has set out in its guideline on the subject. However, different countries have different data protection laws, which makes a unified approach to handling personal data difficult in cases like these.

The risk of abuse of the personal data collected: One of the biggest fears of scientists and users regarding contact tracing apps is the potential abuse of the personal data collected once the pandemic is over. Especially with centralized storage, there are already apps that give governments access to the data, for example in India, Hong Kong and Singapore. Many critics are demanding regulation to ensure that the data cannot be used after the pandemic is over and the need for the apps has ceased. This risk is especially high for tracing apps that locate the user through GPS rather than Bluetooth technology, since GPS data yields a very detailed and easily analysed record of users’ movements. This potential risk is one of the most prominent concerns regarding the Apple and Google project for a joint contact tracing API, as both companies have faced severe data protection issues in the past. However, both companies have stated that they plan to completely discontinue the API once the pandemic is over, which would disable the apps working with it. Since the Bluetooth approach they are using stores the data on users’ devices, the data will be locked and inaccessible once the API can no longer read it. But many other countries run their own APIs and apps, which may lead to a risk of government surveillance and even abuse by foreign powers. For Europe, the EDPB and the European Parliament have clearly stated that the data must be deleted and the apps dismantled once they are no longer necessary, as the purpose and legal basis for processing will no longer apply once the pandemic is under control.

The bottom line

Needless to say, the pandemic has driven the need for new technologies and approaches to handle the spread of viruses. However, in a modern world this brings risks to the personal data used to contain the pandemic and break the chain of transmission, especially because it is not only a nationwide but also an international effort. Users should keep in mind that their right to privacy is not entirely overridden by the public interest in containing the virus. To keep this balance, it is important for contact tracing apps to face criticism and to be developed in compliance with data protection regulations in order to minimize the potential risks that come with the new technology. This is the only way to ensure that people’s personal freedom and private life can continue without suffering from the potential attacks that could result from these risks. Transparency is the bottom line in these projects: it ensures that regulations are being met and that people’s trust is kept, so that the tracing apps can reach the effectiveness needed to be successful in their purpose.

Series on COVID-19 Contact Tracing Apps Part 2: The EDPB Guideline on the Use of Contact Tracing Tools

25. May 2020

Today we are continuing our miniseries on contact tracing apps and data protection with Part 2 of the series: The EDPB Guideline on the Use of Contact Tracing Tools. As mentioned in Part 1 of our miniseries, many Member States of the European Union have started to discuss using modern technologies to combat the spread of the Coronavirus. Now, the European Data Protection Board (“EDPB”) has issued a new guideline on the use of contact tracing tools in order to give European policy makers guidance on Data Protection concerns before implementing these tools.

The Legal Basis for Processing

In its guideline, the EDPB proposes that the most relevant legal basis for the processing of personal data using contact tracing apps will probably be the necessity for the performance of a task in the public interest, i.e. Art. 6 para. 1 lit. e) GDPR. In this context, Art. 6 para. 3 GDPR clarifies that the basis for the processing referred to in Art. 6 para. 1 lit. e) GDPR shall be laid down by Union or Member State law.

Another possible legal basis for processing could be consent pursuant to Art. 6 para. 1 lit. a) GDPR. However, the controller will have to ensure that the strict requirements for consent to be valid are met.

If the contact tracing application is specifically processing sensitive data, like health data, processing could be based on Art. 9 para. 2 lit. i) GDPR for reasons of public interest in the area of public health or on Art. 9 para. 2 lit. h) GDPR for health care purposes. Otherwise, processing may also be based on explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR.

Compliance with General Data Protection Principles

The guideline is a prime example of the EDPB upholding that any data processing technology must comply with the general data protection principles stipulated in Art. 5 GDPR. Contact tracing technology will not be an exception to this general rule. Thus, the guideline contains recommendations on what national governments and health agencies will need to be aware of in order to observe the data protection principles.

Principle of Lawfulness, fairness and transparency, Art. 5 para. 1 lit. a) GDPR: First and foremost, the EDPB points out that the contact tracing technology must ensure compliance with GDPR and Directive 2002/58/EC (the “ePrivacy Directive”). Also, the application’s algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available.

Principle of Purpose limitation, Art. 5 para. 1 lit. b) GDPR: The national authorities’ purposes of processing personal data must be specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis.

Principles of Data minimisation and Data Protection by Design and by Default, Art. 5 para. 1 lit. c) and Art. 25 GDPR:

  • Data processed should be reduced to the strict minimum. The application should not collect unrelated or unnecessary information, which may include civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.;
  • Contact tracing apps do not require tracking the location of individual users. Instead, proximity data should be used;
  • Appropriate measures should be put in place to prevent re-identification;
  • The collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary.

Principle of Accuracy, Art. 5 para. 1 lit. d) GDPR: The EDPB advises that procedures and processes including respective algorithms implemented by the contact tracing apps should work under the strict supervision of qualified personnel in order to limit the occurrence of any false positives and negatives. Moreover, the applications should include the ability to correct data and subsequent analysis results.

Principle of Storage limitation, Art. 5 para. 1 lit. e) GDPR: With regard to data retention, personal data should be kept only for the duration of the COVID-19 crisis. The EDPB also recommends including, as soon as practicable, the criteria to determine when the application shall be dismantled and which entity shall be responsible and accountable for making that determination.

Principle of Integrity and confidentiality, Art. 5 para. 1 lit. f) GDPR: Contact tracing apps should incorporate appropriate technical and organisational measures to ensure the security of processing. The EDPB places special emphasis on state-of-the-art cryptographic techniques which should be implemented to secure the data stored in servers and applications.

Principle of Accountability, Art. 5 para. 2 GDPR: To ensure accountability, the controller of any contact tracing application should be clearly defined. The EDPB suggests that national health authorities could be the controllers. Because contact tracing technology involves different actors in order to work effectively, their roles and responsibilities must be clearly established from the outset and be explained to the users.

Functional Requirements and Implementation

The EDPB also notes that implementations of contact tracing apps may follow a centralised or a decentralised approach. Generally, both systems use Bluetooth signals to log when smartphone owners are close to each other. If one owner is confirmed to have contracted COVID-19, an alert can be sent to other owners they may have infected. Under the centralised version, the anonymised data gathered by the app is uploaded to a remote server where matches are made with other contacts. Under the decentralised version, the data is kept on the user’s mobile device, giving users more control over their data. The EDPB does not recommend either approach. Instead, national authorities may consider both concepts and carefully weigh up the respective effects on privacy and the possible impacts on individuals’ rights.
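To make the decentralised model more concrete, the following is a minimal, purely illustrative Python sketch, not any official protocol: each device broadcasts short-lived random identifiers, keeps a local log of identifiers it hears nearby, and performs the exposure matching on the device itself once confirmed cases publish their own identifiers. All names and the matching logic are assumptions for illustration.

```python
import secrets

class Device:
    """Illustrative phone in a decentralised contact tracing scheme."""

    def __init__(self):
        self.own_ids = []       # ephemeral IDs this device has broadcast
        self.heard_ids = set()  # IDs received from nearby devices

    def rotate_id(self):
        """Generate a fresh random identifier (e.g. every 15 minutes)."""
        new_id = secrets.token_hex(16)
        self.own_ids.append(new_id)
        return new_id

    def hear(self, other_id):
        """Record an identifier broadcast by a nearby device."""
        self.heard_ids.add(other_id)

    def check_exposure(self, published_infected_ids):
        """Local matching: compare the published IDs of confirmed cases
        against this device's locally stored contact log."""
        return bool(self.heard_ids & set(published_infected_ids))

alice, bob = Device(), Device()
bob.hear(alice.rotate_id())  # Bob's phone logs Alice's ephemeral ID

# Alice later tests positive and publishes only her own broadcast IDs;
# Bob's phone detects the match locally, without a central server.
print(bob.check_exposure(alice.own_ids))  # True
```

The point of the design, as the EDPB describes it, is that the contact log never leaves the device; only the identifiers of confirmed cases are published for everyone to check against locally.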

Before implementing contact tracing apps, the EDPB also states that a Data Protection Impact Assessment (DPIA) must be carried out, as the processing is likely to result in a high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, it strongly recommends publishing DPIAs to ensure transparency.

Lastly, the EDPB proposes that the use of contact tracing applications should be voluntary and reiterates that it should not rely on tracing individual movements but rather on proximity information regarding users.

Outlook

The EDPB acknowledges that the systematic and large scale monitoring of contacts between natural persons is a grave intrusion into their privacy. Therefore, Data Protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures. It further underlines that public authorities should not have to choose between an efficient response to the current pandemic and the protection of fundamental rights, but that both can be achieved at the same time.

In the third part of the series regarding COVID-19 contact tracing apps, we will take a closer look into the privacy issues that countries are facing when implementing contact tracing technologies.

Series on COVID-19 Contact Tracing Apps Part 1: Different Countries, Different Apps

20. May 2020

As more and more countries phase out of lockdowns, attention is turning to contact tracing apps to help break the chain of transmission of COVID-19. Contact tracing apps promise a safer way to combat the spread of the pandemic and to enable people to return to a life closer to their previous normal. In this miniseries, we would like to present different contact tracing apps, as well as European guidelines and the data protection problems arising from the technology.

Contact tracing apps mostly rely on locating users’ phones and tracing their whereabouts to determine whether they have come into contact with someone who later tested positive for the coronavirus. Individuals who have been in close proximity to a confirmed carrier of the virus will then be notified and asked to self-isolate for a certain period of time.

Due to this functionality, however, privacy is a major concern for many users. To be effective, the apps involve not only the processing of personal data, but also the tracing of movements and the collection of health data.

It is also important to note that there are different approaches to the purpose and use of anti-coronavirus apps all over the world. While this post focuses on contact tracing apps, there are also technologies with a different purpose. For example, there are apps that use mobile location data to track movement streams and identify potential future outbreak areas. Another option, currently in use in Taiwan, is using the location data of mobile devices to ensure that lockdown and quarantine measures are being followed. In Hong Kong, the mobile app is paired with a wristband to track the user’s movement and alert officials if they leave their dwelling.

However, there are many contact tracing apps in use in different countries, with varying technology and varying data protection issues. While many countries immediately developed and released COVID-19 tracing apps, some are still trying to develop or test the technology with a commitment to data protection. To show the variety of approaches to the matter, we are going to present some of the countries and the apps they are using or developing.

The following are some of the countries that have already implemented a contact tracing app in order to counteract the spread of the virus quickly:

  • Austria – As one of the first European countries to jump to action, Austria has implemented the tracing app project DP3T, which European scientists back as the best choice in terms of data protection. The handling of the data is transparent, minimal and voluntary. The technology is based on Bluetooth identifiers, similar in concept to the Google and Apple technology, and the data is stored in a decentralized manner.
  • India – The Aarogya Setu app was downloaded over 13 million times within the first week of its release. It uses Bluetooth as well as GPS signals to trace devices, and collects a lot of sensitive data such as names, birthdates and biometric information. Following a backlash over data protection, it has been stated that the technology uses unique IDs to keep the data anonymized, that there is no third-party access, and that the data is only stored securely in case of a positive COVID-19 test.
  • Singapore – In Singapore, the TraceTogether app is a voluntary tracing app that uses Bluetooth and the mobile number of users in order to track their proximity to other devices. It does not use location data, however, and exchanges temporary encrypted user IDs in order to know who a device came into contact with. The encrypted IDs can only be decoded by the Ministry of Health, which holds the only decryption key.
  • South Korea – In South Korea, two apps are used in conjunction, though the focus is rather on keeping away from areas with infected people. One app, Corona 100m, was made by a private developer and notifies users if they come within 100 metres of a person who has tested positive for the virus. The app collects data such as diagnosis date, nationality, age, gender and location. The other app, Corona Maps, shows the locations of diagnosed patients so that users can avoid those areas.

On the other hand, some of the countries still working on the development include the following:

  • France – The StopCovid app under development in France is supposed to be ready by June, and is being criticized by many French politicians for the lack of regulation regarding what happens to the data after the pandemic. France has also declined Google and Apple’s help with the development of the app, stating that the risks of misuse of the data are too high.
    Update: In the meantime, the French Data Protection Authority (CNIL) released its second review of the contact tracing app on May 26, 2020, giving it a green light to continue after finding no major issues with the data protection concept. Although the app uses a centralized system that relies on pseudonymized rather than anonymized data, the CNIL noted the government’s assurances that users will face no disadvantages and that the data can be deleted from the app.
  • Germany – Germany, much like France and other EU countries, has abandoned the joint PEPP-PT project in favour of coming up with their own national tracing app. As opposed to other countries, Germany sets much more hope in the joint venture with Google and Apple in an attempt to develop a privacy regulated app which is up to EU standards.
  • United Kingdom – The UK is currently planning to test its contact tracing app on the Isle of Wight before rolling it out more widely later in May. The app uses a more centralized approach to data storage, which has been criticized by data protection lawyers. However, some have conceded that in such a situation, a “greater justification” for the use of the data exists in the public interest and the health of citizens.
  • USA – As announced by the tech giants Apple and Google, the joint development of a tracing app is underway. The app will operate over Bluetooth and exchange identifiers when two devices are near each other for 10 minutes. These identifiers change every 15 minutes to minimize extended tracing, and in the case of a positive test the Public Health Authority may broadcast an alert with the consent of the infected person. For more detailed information, please see our previous blog post on the joint announcement.
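As a rough illustration of how such time-limited identifiers might work, the sketch below derives an identifier from a device secret and the current 15-minute window, so that broadcasts cannot be linked across windows without the secret. The derivation, window handling and names are assumptions for illustration only and do not reflect the actual Apple/Google specification.

```python
import hashlib
from datetime import datetime, timezone

ROTATION_MINUTES = 15  # identifiers rotate every 15 minutes

def rolling_identifier(device_secret: bytes, now: datetime) -> str:
    """Derive a short-lived identifier from a device secret and the
    current 15-minute window (hypothetical scheme, not the real one)."""
    window = int(now.timestamp() // (ROTATION_MINUTES * 60))
    digest = hashlib.sha256(device_secret + window.to_bytes(8, "big"))
    return digest.hexdigest()[:16]

secret = b"example-device-secret"
t1 = datetime(2020, 5, 20, 12, 0, tzinfo=timezone.utc)
t2 = datetime(2020, 5, 20, 12, 20, tzinfo=timezone.utc)  # next window

# Identifiers differ across windows, so an observer cannot link them.
print(rolling_identifier(secret, t1) != rolling_identifier(secret, t2))  # True
```

Within one window the identifier stays stable (so a nearby device can log it), while across windows it changes, which is what limits long-term tracking of a device by passive observers.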

While the use of contact tracing apps increases, the data protection issues do as well. Most of them deal with the question of governmental access and misuse of the data, as well as transparency and voluntary use of the apps. The European Parliament and the European Data Protection Board (EDPB) have published guidelines for location tracing apps to conform with data protection laws and regulations, which we will be presenting in an upcoming blogpost as part of this miniseries.

Overall, tracing apps seem to be becoming a focal point of pandemic containment. Users should remember that, while the pandemic is becoming a new state of normal, many countries will still try to counteract the spread of the virus, and location tracking technology is one of the most effective ways to do so. In this light, users need to remain conscious of their country’s approach to tracing apps and the privacy issues they may cause.

In the second part of the series regarding COVID-19 contact tracing apps, we will be going further into detail on the EDPB’s Guideline on location tracing apps, and focus on the European expectations and regulation in regards to data protection on the issue.

Zoom agrees on security and privacy measures with NY Attorney General

13. May 2020

Due to the COVID-19 pandemic, Zoom has seen an exponential surge in new users over the past two months. As we have mentioned in a previous blog post, this increase in activity highlighted a range of different issues and concerns both on the security and on the privacy side of the teleconference platform.

In light of these issues, which prompted many companies, schools, religious institutions and governmental departments to caution against or stop using the platform, Zoom has agreed to enhance its security measures and privacy standards.

In the Agreement struck on May 7th with the New York Attorney General Letitia James, Zoom agreed to several new measures it will implement over the course of the next weeks. However, most of these enhancements were already planned in CEO Eric Yuan’s “90-day plan” published on April 1st, and have gradually been put into effect.

These measures include:

  • a new data security program,
  • regular risk assessment reviews,
  • enhanced encryption protocols,
  • a default password for every meeting,
  • a halt to sharing user data with Facebook.

In response to the Agreement being struck, Attorney General James stated: “Our lives have inexorably changed over the past two months, and while Zoom has provided an invaluable service, it unacceptably did so without critical security protections. This agreement puts protections in place so that Zoom users have control over their privacy and security, and so that workplaces, schools, religious institutions, and consumers don’t have to worry while participating in a video call.”

A day prior, Zoom was also reinstated for the use of online classes by the New York City Department of Education. In order to ensure the privacy of the students and counteract “Zoombombing”, Zoom has agreed to enhanced privacy controls for free accounts, as well as kindergarten through 12th grade education accounts. Hosts, even those with free accounts, will, by default, be able to control access to their video conferences by requiring a password or the placement of users in a digital waiting room before a meeting can be accessed.

This is not the only new addition to the controls that hosts will be able to access: they will also be able to control access to private messages in a Zoom chat, control access to email domains in a Zoom directory, decide who can share screens, and more.

Overall, Zoom stated that it was happy to have reached a resolution with the Attorney General quickly. It remains to be seen how the measures it is implementing will hold up to the still-growing audience, and how fast they can be rolled out for worldwide use.

Hungarian Government suspends GDPR rights for COVID-19 related Data Processing

12. May 2020

In the face of the Corona pandemic, Hungary is currently in an indefinite “state of emergency”. Originally, Prime Minister Viktor Orbán decreed the state of emergency on 11 March 2020 for a period of 15 days. However, on 30 March 2020, the Hungarian Parliament passed emergency legislation (the Bill on Protection against Coronavirus, or Bill T/9790) extending the state of emergency until terminated by the Prime Minister and allowing the Prime Minister to rule by decree during the state of emergency. The Bill was passed thanks to the two-thirds majority of Orbán’s Fidesz Party in the Hungarian Parliament.

On 4 May 2020, Prime Minister Orbán issued Decree No. 179/2020 which contains several provisions affecting Data Protection in Hungary extensively for the time of the state of emergency.

Most importantly, the decree suspends the individual data subject’s rights pursuant to Art. 15 to 22 of the European GDPR when processing personal data for the purpose of preventing, recognising, and stopping the spread of the Coronavirus. It also stipulates that the one month time limit for Controllers to provide the necessary information (Art. 12 para. 3 GDPR) will only begin after the termination of the state of emergency for any Coronavirus related data subject requests. Furthermore, the data collection information requirements for Controllers pursuant to Art. 13 and 14 GDPR will be satisfied by publishing an electronic privacy notice providing the purpose and the legal basis of data processing which the data subjects may take notice of.

The emergency decree received much criticism from various European Data Protection authorities and civil rights groups. The head of the European Data Protection Board (“EDPB”) Andrea Jelinek stated that she is “personally very worried” about the developments, and described the Hungarian government’s decision as “unnecessary [and] detrimental”. In its most recent plenary session, the EDPB also specifically discussed Hungary’s emergency measures in light of European Data Protection Law.

Enforcement of Brazil’s new Data Protection Law postponed due to COVID-19

8. May 2020

The Coronavirus is affecting South America, like the rest of the world, and it is spreading rapidly in its largest country: Brazil. Brazil’s Government and Legislators are trying to handle both the public health crisis and the economic crisis that the country is facing. Both branches have now adopted emergency measures to alleviate the effects of the virus, even impacting the enforcement of the country’s new national Data Protection Law (“Lei Geral de Proteção de Dados Pessoais” or “LGPD”).

The National Congress of Brazil only passed the LGPD in August 2018. It was originally scheduled to come into effect on 15 August 2020 (we reported). As the effects of the Coronavirus began to impact Brazilian businesses, many companies called for the postponement of the LGPD’s effective date due to the difficult economic environment and due to the fact that Brazil’s national Data Protection Authority (“ANPD”) is still not fully functional.

On 3 April 2020, the Senate of Brazil unanimously approved the Law Bill “PL 1179/2020”, which includes a provision to delay the effective date of the LGPD until 1 January 2021. Furthermore, the Bill sets forth that non-compliance with the LGPD shall not be sanctioned by the Data Protection Authorities until 1 August 2021.

The second chamber of Brazil’s National Congress, the House of Representatives, debated “PL 1179/2020” all throughout April 2020 and considered the implications of the LGPD’s postponement for the privacy rights of individuals, especially with many emergency measures on the way that were increasingly restrictive on privacy rights. A vote on “PL 1179/2020” by the House of Representatives was still pending by the end of the month.

On 29 April 2020, the President of Brazil took matters into his own hands when he issued Provisional Measure #959/2020. The measure postponed the effective date of the LGPD to 3 May 2021, without segmenting the postponement into two stages like the Senate’s Law Bill “PL 1179/2020” stipulated.

Provisional Measures issued by the President of Brazil serve as temporary law and are valid for a period of 60 days, which the President may extend for another 60 days. During this period, both chambers of the National Congress must approve the Provisional Measure in order for it to become permanent law. If Congress disapproves, the measure will be invalidated.

Dutch DPA administers record €725 000 fine for GDPR violation

6. May 2020

The Dutch Data Protection Authority, the Autoriteit Persoonsgegevens (Dutch DPA), issued a EUR 725,000 fine on April 30th against a company for scanning the fingerprints of its employees in order to record attendance.

As fingerprints are biometric data that can easily identify a data subject, they qualify as sensitive data under Art. 9 GDPR. The Dutch DPA therefore addressed two possible exceptions in the present case: explicit consent pursuant to Art. 9 para. 2 lit. a) GDPR, and the necessity of the processing for security reasons, based on Art. 9 para. 2 lit. g) GDPR.

According to the Dutch DPA, neither of the two exceptions applies.

Regarding the first exception, the Dutch DPA states that the employer has shown no proof of valid explicit consent from the employees. Rather, the Dutch DPA is of the opinion that in an employment relationship, consent generally cannot be given freely. While it is difficult to ensure freely given consent where one side is dependent on the other, it is possible by offering an alternative form of processing, allowing the employee to choose between two options according to their own judgement. In the case brought before the Dutch DPA, this had not happened. Rather, employees felt obligated to give their consent, especially since refusal resulted in a personal meeting with the director. The company did not offer an alternative to scanning their fingerprints.

The second exception, the necessity of the processing for security reasons, was also rejected by the Dutch DPA. It reasoned that this exception only applies in cases where the security of the systems or the building depends on biometric data and cannot be ensured by a less invasive method. While the company’s activities remain confidential, the Dutch DPA denied that they are of such importance that security can only be achieved through biometrics. The fingerprint scanning was therefore unnecessary and disproportionate to the invasion of the employees’ privacy.

As this case shows, companies should be careful when processing biometric data. In particular, they should ensure they have valid explicit consent before processing sensitive data, in order to mitigate the risk of a fine.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has been facing a lot of public criticism because of its weaknesses in data security and its lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it was wrongly advertising to provide full end-to-end encryption for all shared contents like video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online Tech Magazine reported that Zoom leaked thousands of their users’ email addresses and photos to strangers.
  • Video conferences that users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images and spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data of its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom released an update that terminates this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies internationally to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. This plan includes freezing the introduction of new features, enlarging its cybersecurity team and engaging outside security advisors.

Greek Data Protection Authority releases Guidance on Cookies

16. March 2020

On 25 February 2020, the Hellenic Data Protection Authority (DPA) published guidance on cookies and other tracking tools. Previously, the Authority had found that Greek websites and service providers had largely failed to comply with the rules on the use of cookies and other trackers set out by the ePrivacy Directive and the GDPR, and reaffirmed by the European Court of Justice’s Planet49 ruling.

The guidance states that it covers HTTP/S cookies, Flash cookies, HTML5 local storage, device fingerprinting, OS identifiers, and material identifiers.

The Greek DPA reiterated that providers are generally obliged to obtain the user’s consent if they use any tracking tools, irrespective of whether personal data is being processed. It also outlined that technically necessary trackers are exempt from the consent requirement. Furthermore, the guidance goes into detail on how information and consent can specifically be presented on websites.
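The distinction the guidance draws can be illustrated in code: strictly necessary cookies may be set immediately, while all other trackers must be held back until the user opts in. The following is a minimal, hypothetical sketch (the cookie names and API are illustrative assumptions, not taken from the guidance):

```javascript
// Illustrative sketch of consent-gated cookie setting (hypothetical API).
// Strictly necessary cookies are exempt from the consent requirement;
// everything else needs a prior opt-in.
const STRICTLY_NECESSARY = new Set(["session_id", "csrf_token"]);

function makeConsentManager() {
  let consented = false;
  const cookies = {};
  return {
    grantConsent() { consented = true; },
    setCookie(name, value) {
      // Non-essential cookies are dropped unless the user has opted in.
      if (!STRICTLY_NECESSARY.has(name) && !consented) return false;
      cookies[name] = value;
      return true;
    },
    getCookies() { return { ...cookies }; },
  };
}

const cm = makeConsentManager();
cm.setCookie("session_id", "abc");   // allowed: strictly necessary
cm.setCookie("analytics_id", "xyz"); // blocked: no consent yet
cm.grantConsent();
cm.setCookie("analytics_id", "xyz"); // allowed after opt-in
console.log(Object.keys(cm.getCookies())); // [ 'session_id', 'analytics_id' ]
```

In a real deployment the same gate would sit in front of any third-party tracking script, not just first-party cookies, since the consent obligation applies to tracking tools in general.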

Lastly, the Authority has given Greek website providers a grace period of two months to implement the provisions of this guidance and thereby become compliant with the European rules on tracking tools.

Dutch DPA fines Tennis Association

12. March 2020

The Dutch Data Protection Authority has fined the Royal Dutch Tennis Association (“KNLTB”) EUR 525,000 for selling the personal data of more than 350,000 of its members to sponsors, who then contacted some of the members by mail and telephone for direct marketing purposes.

In 2018, the KNLTB illegally provided personal data of its members to two sponsors for a fee. One sponsor received the personal data of 50,000 members, the other of more than 300,000 members. It turned out that the KNLTB had sold personal data such as names, gender and addresses to third parties without obtaining the data subjects’ consent.

The KNLTB argued that it had a legitimate interest in selling the data. However, the data protection authority rejected the existence of a legitimate interest in the sale and therefore decided that there was no legal basis for transferring the personal data to the sponsors. The KNLTB has objected to the fine decision, and the Dutch Data Protection Authority will assess the objection.
