Tag: data protection

Dutch data scandal: illegal trade of COVID-19 patient data

19. February 2021

In recent months, RTL Nieuws reporter Daniël Verlaan discovered a widespread trade in the personal data of Dutch citizens who had been tested for COVID-19. He found ads consisting of photos of computer screens listing data of Dutch citizens. Apparently, the data had been offered for sale on various instant messaging apps such as Telegram, Snapchat and Wickr, at prices ranging from €30 to €50 per person. The data included home addresses, email addresses, telephone numbers, dates of birth and BSN identifiers (the Dutch social security number).

The personal data were registered in the two main IT systems of the Dutch Municipal Health Service (GGD) – CoronIT, containing details about citizens who took a COVID-19 test, and HPzone Light, a contact-tracing system, which contains the personal data of people infected with the coronavirus.

After becoming aware of the illegal trade, the GGD reported it to the Dutch Data Protection Authority and the police. The cybercrime team of the Midden-Nederland police immediately started an investigation, which showed that at least two GGD employees had stolen the data; both had access to the official Dutch government COVID-19 systems and databases. Within 24 hours of the complaint, two men were arrested. Several days later, a third suspect was tracked down as well. The investigation continues, since it is still unclear how much data was stolen and whether the suspects in fact managed to sell it. More arrests are therefore certainly not excluded.

Chair of the Dutch Institute for Vulnerability Disclosure, Victor Gevers, told ZDNet in an interview:

Because people are working from home, they can easily take photos of their screens. This is one of the issues when your administrative staff is working from home.

Many people expressed their disapproval of the insufficient security measures concerning the COVID-19 systems. Since the databases include very sensitive data, the government has a duty to protect these properly in order to prevent criminal misuse. People must be able to rely on their personal data being treated confidentially.

In a press release, the Dutch police also raised awareness of cybercrime risks such as scams and identity fraud. Moreover, they explained how people can protect themselves against such crimes and stressed the need to report them: reporting helps prevent further victims and allows the police to track down suspects quickly and stop their criminal practices.

Giant database leak exposes data on 220 million Brazilians

28. January 2021

On January 19th, 2021, the dfndr lab, PSafe’s cybersecurity laboratory, reported a leak in a Brazilian database that may have exposed the CPF number and other confidential information of millions of people.

According to the cybersecurity experts, who use artificial intelligence techniques to identify malicious links and fake news, the leaked data they found contains detailed information on 104 million vehicles and about 40 million companies. Overall, the leak puts close to 220 million Brazilians at risk.

The personal data contained in the affected database includes names, birthdates and individual taxpayer registry identification (CPF) numbers, along with detailed vehicle information including license plate number, municipality, colour, make, model, year of manufacture, engine capacity and even the type of fuel used. The breach affects almost all Brazilian citizens as well as authorities.

In a press release, the director of the dfndr lab, Emilio Simoni, explained that the biggest risk following this data leak is that this data will be used in phishing scams, in which a person is induced to provide more personal information on a fake page.

In their statement, PSafe disclosed neither the name of the company involved nor how the information was leaked – whether through a security breach, a hacker attack or simply easy access. However, regardless of the cause of the leak, the new Brazilian data protection law (LGPD) provides for fines of up to R$50 million for an infraction of this type.

Clubhouse Data Protection issues

Clubhouse is a new social networking app by the US company Alpha Exploration Co., available for iOS devices. Registered users can open rooms for others to talk about various topics. Participation is possible both as a speaker and as a mere listener. These rooms can be open to the public or set up as closed groups. The moderators speak live in the rooms, and listeners can then join the virtual room. Participants are initially muted and can be unmuted by the moderators to talk. In addition, the moderators can also mute participants or exclude them from the respective room. As of now, new users need to be invited by existing users. These invitations became increasingly sought after in autumn 2020, when US celebrities started to use the app. With increasing popularity also in the EU, Clubhouse has come under criticism from a data protection perspective.

As mentioned, Clubhouse can only be used upon invitation. To use the option to invite friends, users must share their address book with Clubhouse. In this way, Alpha Exploration can collect personal data from contacts who have not consented to the processing of their data and who do not use the app. Not only Alpha Exploration, but also users may be acting unlawfully when they give the app access to their contacts, as users may share responsibility for the data processing associated with the sharing of address books. It is therefore not only up to Alpha Exploration, but also up to each user to ensure that consent has been obtained from the contacts whose personal data is being processed. From a data protection perspective, it is advisable not to grant the Clubhouse app access to this data unless the consent of the respective data subjects has been obtained and, ideally, documented. Currently, this data is transferred to US servers without the consent of the data subjects in the said address books. Furthermore, it is not apparent in what form and for what purposes the collected contact and account information of third parties is processed in the USA.

Under Clubhouse’s Terms of Service, and in many cases under national law, users are prohibited from recording or otherwise storing conversations without the consent of all parties involved. Nevertheless, the same Terms of Service include the sentence “By using the service, you consent to having your audio temporarily recorded when you speak in a room.” According to Clubhouse’s Privacy Policy, these recordings are used to punish violations of the Terms of Service, the Community Guidelines and legal regulations. The data is said to be deleted when the room in question is closed without any violations having been reported. Again, consent to data processing should be treated as the general rule, and such consent must be so-called informed consent. Given that the scope and purpose of the storage are vaguely formulated and not readily apparent, there are doubts about this. Checking one’s own platform for legal violations is in principle, if not a legal obligation in individual cases, at least a so-called legitimate interest (Art. 6 (1) (f) GDPR) of the platform operator. As long as recordings are limited to this purpose, they are compliant with the GDPR. The platform operator who records the conversations is primarily responsible for this data processing. However, users who use Clubhouse for conversations with third parties may be jointly responsible, even if they do not record anything themselves. This is unlikely to play a major role in the private sphere, but all the more so if the app is used in a business context.

It is suspected that Clubhouse creates shadow profiles in its own network – profiles for people who appear in the address books of Clubhouse users but are not themselves registered with Clubhouse. As a result, Clubhouse reportedly even treats frequently stored service numbers, such as voicemail numbers saved as “Mobile-Box”, as well-connected potential users. So far, there is no easy way to object to Clubhouse’s creation of shadow profiles that include name, number, and potential contacts.

Clubhouse’s Terms of Use and Privacy Policy do not mention the GDPR. There is also no contact address for data protection requests in the EU, although this is mandatory, as personal data of EU citizens is processed. In addition, according to Art. 14 GDPR, EU data subjects must be informed about how their data is processed, and this information must be provided before their personal data is processed – that is, before a data subject is invited via Clubhouse and their personal data is thereby stored on Alpha Exploration’s servers. This information is not provided. There must also be a simple opt-out option; it is questionable whether one exists. Furthermore, according to the GDPR, companies that process data of European citizens must designate responsible persons for this in Europe. So far, it is not apparent that Clubhouse has such representatives in Europe.

The German “Verbraucherzentrale Bundesverband” (VZBV), the federation of German consumer organisations, has issued a written warning (in German) to Alpha Exploration, complaining that Clubhouse is operated without the required imprint and that the terms of use and privacy policy are only available in English, not in German as required. The warning includes a penalty-based cease-and-desist declaration relating to Alpha Exploration’s claim of the right to extensive use of the uploaded contact information. Official responses from European data protection authorities regarding Clubhouse are currently not available. The lead data protection authority in this case is the Irish Data Protection Commission.

So far, it appears that Clubhouse’s data protection is based solely on the CCPA and not on the GDPR. Business use of Clubhouse within the scope of the GDPR should therefore only be considered with extreme caution, if at all.

16 million Brazilian COVID-19 patients’ personal data exposed online

7. December 2020

In November 2020, personal and sensitive health data of about 16 million Brazilian COVID-19 patients was leaked on the online platform GitHub. The cause was a hospital employee who uploaded a spreadsheet with usernames, passwords, and access keys to sensitive government systems to the platform. Among those affected were the Brazilian President Jair Bolsonaro and his family as well as seven ministers and 17 provincial governors.

Among the exposed systems were two government databases used to store information on COVID-19 patients. The first, “E-SUS-VE”, was used for recording COVID-19 patients with mild symptoms, while the second, “Sivep-Gripe”, was used to keep track of hospitalized cases across the country.

Both systems contained highly sensitive personal information such as patient names, addresses, telephone numbers and individual taxpayer ID numbers, but also healthcare records such as medical histories and medication regimes.

The leak was discovered after a GitHub user spotted the spreadsheet containing the password information on the personal GitHub account of an employee of the Albert Einstein Hospital in São Paulo. The user informed the Brazilian newspaper Estadão, which analysed the information shared on the platform before notifying the hospital and the Brazilian health ministry.

The spreadsheet was ultimately removed from GitHub, while government officials changed passwords and revoked access keys to secure their systems after the leak.

However, Estadão reporters confirmed that the leaked data included personal data of Brazilians across all 27 states.

EU offers new alliance with the USA on data protection

4. December 2020

The European Commission and the High Representative of the Union for Foreign Affairs and Security Policy outlined a new EU-US agenda for global change, which was published on December 2nd, 2020. It constitutes a proposal for a new, forward-looking transatlantic cooperation covering a variety of matters, including data protection.

The draft plan states the following guiding principles:

  • Advancing global common goods, providing a solid base for stronger multilateral action and institutions that all like-minded partners can join.
  • Pursuing common interests and leveraging collective strength to deliver results on strategic priorities.
  • Looking for solutions that respect common values of fairness, openness and competition – including where there are bilateral differences.

According to the draft plan, this is a “once-in-a-generation” opportunity to forge a new global alliance. It includes an appeal for the EU and US to bury the hatchet on persistent sources of transatlantic tension and join forces to shape the digital regulatory environment. The proposal aims to create a shared approach to enforcing data protection law and combating cybersecurity threats, which could also include possible restrictive measures against attributed attackers from third countries. Moreover, a transatlantic agreement concerning Artificial Intelligence forms a part of the recommendation, the purpose being to set a blueprint for regional and global standards. The EU also wants to openly discuss diverging views on data governance and facilitate free data flow with trust on the basis of high safeguards. Furthermore, the proposal includes the creation of a specific dialogue with the US on the responsibility of online platforms and Big Tech, as well as the development of a common approach to protecting critical technologies.

The draft plan is expected to be submitted for endorsement by the European Council at a meeting on December 10-11th, 2020. It suggests an EU-US Summit in the first half of 2021 as the moment to launch the new transatlantic agenda.

China issued new Draft for Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime, which appears to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection regime will consist of China’s Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation addresses issues raised by new technologies and applications, all in around 70 articles. The fines provided for non-compliance are quite high and will significantly impact companies operating in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, the obligations of data processors, the authority in charge of personal information, and legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalents under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China, and to report the relevant information of that organization or representative to the Chinese regulators.

In comparison to the GDPR, the draft PIPL extends the term “sensitive data” to also include nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may cause damage to personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information, which shall be possible if it is certified by recognized institutions, or the data processor executes a cross-border transfer agreement with the recipient located outside of China, to ensure that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator or the volume of data processed by the data processor exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.
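The transfer conditions described above can be summarized as a simple decision rule. The sketch below is only a reading aid for the paragraph, not legal advice or an official test; the function and parameter names are our own, and the legal text of the draft PIPL controls.

```python
def pipl_cross_border_basis(is_ciio: bool,
                            exceeds_cac_volume_threshold: bool,
                            certified_by_recognized_institution: bool,
                            has_cross_border_agreement: bool) -> str:
    """Rough decision aid for the draft PIPL's cross-border transfer rules.

    Critical information infrastructure operators (CIIOs) and processors
    handling data volumes above the CAC-stipulated level must pass a CAC
    security assessment; other processors need either certification by a
    recognized institution or a cross-border transfer agreement with the
    overseas recipient ensuring PIPL-level protection.
    """
    if is_ciio or exceeds_cac_volume_threshold:
        return "CAC security assessment required"
    if certified_by_recognized_institution or has_cross_border_agreement:
        return "transfer permitted on this basis"
    return "no valid transfer basis"


# A non-CIIO processor with a transfer agreement in place:
print(pipl_cross_border_basis(False, False, False, True))
# → transfer permitted on this basis
```

Note that the two branches are not alternatives a CIIO can choose between: once the CAC assessment is triggered, certification or an agreement alone does not suffice.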

It should further be kept in mind that the draft PIPL enlarges the range of penalties beyond those provided in the Cybersecurity Law, which will significantly increase the liability exposure of controllers operating in China.

The period established to receive open comments on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet clear when the legislation will come into full effect.

EDPB issues guidance on data transfers following Schrems II

17. November 2020

Following the recent judgment C-311/18 (Schrems II) by the Court of Justice of the European Union (CJEU), the European Data Protection Board (EDPB) published “Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data” on November 11th. These measures are to be considered when assessing transfers of personal data to countries outside the European Economic Area (EEA), so-called third countries. The recommendations are subject to public consultation until the end of November. Complementing them, the EDPB published “Recommendations on the European Essential Guarantees for surveillance measures”. Together, the two documents provide guidance for assessing whether measures are sufficient to meet the standards of the General Data Protection Regulation (GDPR), even if data is transferred to a country lacking protection comparable to that of the GDPR.

The EDPB sets out a six-step plan to follow when checking whether a data transfer to a third country meets the standards set forth by the GDPR.

The first step is to map all transfers of personal data undertaken, especially transfers to third countries. The transferred data must be adequate, relevant and limited to what is necessary in relation to the purpose. A major factor to consider is the storage of data in clouds. Onward transfers made by processors should also be included. In a second step, the transfer tool used needs to be verified and matched against those listed in Chapter V of the GDPR. The third step is assessing whether anything in the law or practice of the third country can impinge on the effectiveness of the safeguards of the transfer tool. The aforementioned Recommendations on the European Essential Guarantees are intended to help evaluate a third country’s laws regarding access to data by public authorities for surveillance purposes.

If the conclusion from the previous steps is that the third country’s legislation impinges on the effectiveness of the Article 46 GDPR transfer tool, the fourth step is to identify and adopt the supplementary measures necessary to bring the level of protection of the data transfer up to the EU standard, or at least an equivalent. Recommendations for such measures are listed in Annex 2 of the EDPB Schrems II Recommendations; they may be contractual, technical or organizational in nature. In Annex 2, the EDPB describes and evaluates seven technical use cases. Five were deemed scenarios for which effective measures can be found. These are:

1. Data storage in a third country that does not require access to the data in the clear.
2. Transfer of pseudonymized data.
3. Encrypted data merely transiting third countries.
4. Transfer of data to recipients specially protected by law.
5. Split or multi-party processing.

Perhaps even more relevant are the two scenarios for which the EDPB found no effective measures and which it therefore deemed not compliant with GDPR standards:

6. Transfer of data in the clear to cloud services or other processors.
7. Remote access from third countries to data in the clear for business purposes, for example in human resources.

These two scenarios are frequently used in practice. Still, the EDPB recommends not to carry out such transfers for the time being.

Examples of contractual measures are the obligation to implement necessary technical measures, transparency obligations regarding (requested) access by government authorities, and measures to be taken against such requests. Examples of organizational measures are internal policies and clearly assigned responsibilities for handling government interventions. Accompanying this, the European Commission published draft standard contractual clauses for transferring personal data to non-EU countries.
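As an illustration of a technical supplementary measure, scenario 2 above (transfer of pseudonymized data) can be sketched minimally: before a record is exported to a third country, direct identifiers are replaced with keyed hashes, and the secret key needed for re-identification stays with the exporter in the EU. The field names, the key handling and the choice of HMAC-SHA256 are illustrative assumptions, not requirements taken from the EDPB recommendations.

```python
import hmac
import hashlib

# Secret key retained ONLY by the EU data exporter; without it, the
# importer in the third country cannot re-identify data subjects.
PSEUDONYMIZATION_KEY = b"example-secret-key-kept-in-the-eu"


def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYMIZATION_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()


def prepare_record_for_export(record: dict) -> dict:
    """Strip or pseudonymize direct identifiers before a third-country transfer."""
    return {
        # Stable pseudonym: the same person always maps to the same ID,
        # so the importer can still link records without knowing who it is.
        "subject_id": pseudonymize(record["email"]),
        "country": record["country"],  # non-identifying attribute
        # name, email, phone etc. are deliberately not exported
    }


record = {"email": "jane.doe@example.com", "country": "NL", "name": "Jane Doe"}
exported = prepare_record_for_export(record)
assert "name" not in exported and "email" not in exported
```

Whether such keyed hashing amounts to effective pseudonymization in the EDPB's sense depends, among other things, on the key never leaving the exporter's control and on the remaining attributes not allowing individuals to be singled out.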

The last two steps are taking the formal procedural steps required to adopt the supplementary measures and re-evaluating the previous steps at appropriate intervals.

Even though these recommendations are not (yet) binding, companies should take a closer look at them and check whether their data transfers comply with the new situation.

EU looking to increase Enforcement Powers over Tech Giants

24. September 2020

In an interview with The Financial Times on Sunday, EU-Commissioner Thierry Breton stated that the European Union is considering plans to increase its enforcement powers regarding tech giants.

This empowerment is supposed to include punitive measures such as forcing tech firms to break up and sell their EU operations if their market dominance becomes too great. It is further being considered to enable the EU to boot tech companies from the EU single market entirely. Breton stated these measures would of course only be used in extreme circumstances, but did not elaborate on what would qualify as extreme.

“There is a feeling from end-users of these platforms that they are too big to care,” Thierry Breton told The Financial Times. In the interview, he compared tech giants’ market power to the big banks before the financial crisis. “We need better supervision for these big platforms, as we had again in the banking system,” he stated.

In addition, the European Union is considering a rating system in which companies would be given scores in different categories, such as tax compliance and action taken against illegal content. However, Breton said that the intent is not to make companies liable for their users’ content.

Breton further said that the first drafts of the new law will be ready by the end of the year.

Once the final draft is in place, it will require approval both by the European Parliament as well as the European Council, before it can be enacted.

Apple to delay iOS 14 Ad Tracking Changes

9. September 2020

In an update on Thursday, 3rd of September 2020, Apple announced that some of the features planned for the new iOS 14 update are being delayed. The new requirement that iOS developers request permission from app users before collecting their data for ad tracking is being pushed back to the beginning of 2021.

This and other features are seen as a big step towards users’ privacy, which you can read up on in our previous blog post, but they have been criticised by app developers and big tech giants alike.

The permission feature is supposed to change the way users’ privacy is handled, from the current opt-out method to an opt-in one. “When enabled, a system prompt will give users the ability to allow or reject that tracking on an app-by-app basis,” stated Apple.

However, this will be delayed until early next year because the changes would affect a large number of the platform’s publishers, which rely strongly on ad-tracking revenue. Facebook criticized the changes, announcing that some of its tools may lose effectiveness and hence cause problems for smaller app developers. To address this, Apple said: “We want to give developers the time they need to make the necessary changes, and as a result, the requirement to use this tracking permission will go into effect early next year.”

In recent years, Apple has taken its users’ privacy more seriously, launching new adjustments to ensure their right to privacy is being integrated in their devices.

“We believe technology should protect users’ fundamental right to privacy, and that means giving users tools to understand which apps and websites may be sharing their data with other companies for advertising or advertising measurement purposes, as well as the tools to revoke permission for this tracking,” Apple emphasized.


Apple’s new iOS Update will enhance Privacy Features

31. August 2020

At its Worldwide Developers Conference 2020 back in June, Apple announced new privacy features coming in a future iOS 14 update for its devices. These updates, coming in the fall, are supposed to include more control over sharing location data and indicators when an app is using the microphone or camera.

The updates mean that it will be possible to further limit how much location information is shared with apps, allowing them to receive only approximate data rather than the device’s precise location. Apple also introduced labels for app permissions to inform people how much data an app requests, before they even download it. The feature will show these labels in two categories: “Data Linked To You” and “Data Used to Track You”. However, the information will have to be provided by the app developers themselves, leaving grey areas open.

“For food, you have nutrition labels,” said Erik Neuenschwander, Apple’s user privacy manager. “So we thought it would be great to have something similar for apps. We’re going to require each developer to self-report their practices.”

Further, the privacy updates also extend to the Safari browser, which gains a “privacy report” button for reporting on privacy while surfing the internet. One click gives an overview of all third-party trackers and allows the user to block them directly.

Apple also moved from an opt-out to an opt-in scheme for apps using the user’s personal data, requiring the active consent of users before their data may be used.

While this is a positive development for all Apple users, Facebook says it sees issues for small developers that have to adapt to these new privacy settings.

In a blog post, Facebook said it was making a change to its own apps – which in addition to its flagship app include WhatsApp and Instagram – that would likely spare them from having to ask iPhone users for data-tracking permissions that many advertising industry insiders believe users will refuse. Facebook also stated it was making changes because Apple’s new privacy rules could hurt smaller developers that use a Facebook tool for serving ads in third-party apps.

Overall, Apple’s new privacy rules are a welcome change for its users, handing them further control over their own personal data.
