China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. With these provisions, China has enacted its first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions set a higher standard of consent than China’s Cybersecurity Law. To obtain a guardian’s consent, a network operator must offer the guardian the option to refuse and expressly inform the guardian of the following (a minimal data-structure sketch of these disclosures follows the list):

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.
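
Taken together, these items amount to a disclosure record the operator presents to the guardian before requesting consent. The following is a minimal sketch only; the field names are illustrative assumptions, not terms taken from the provisions.

```typescript
// Illustrative sketch of the guardian disclosures listed above.
// All names are assumptions for the example, not from the provisions.
interface GuardianDisclosure {
  purpose: string;                // purpose of collection, storage, use, transfer and disclosure
  meansAndScope: string;          // how, and to what extent, the data are handled
  storageLocation: string;        // where the children's data are stored
  retentionPeriod: string;        // e.g. "12 months"
  postExpiryHandling: string;     // what happens to the data after the retention period
  safeguards: string[];           // measures protecting the children's data
  consequencesOfRefusal: string;  // what follows if the guardian refuses
  complaintChannels: string[];    // where complaints can be filed or reported
  correctionAndDeletion: string;  // how the data can be corrected and deleted
}

interface GuardianConsentRequest {
  disclosure: GuardianDisclosure;
  refusalOffered: true;           // the option to refuse must always be presented
}
```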

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain the approval of the person responsible for the protection of children’s personal data or of an authorised administrator.
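
As an illustration of such an internal access gate, here is a minimal sketch; the roles, types and function names are assumptions for the example, not anything prescribed by the provisions.

```typescript
// Hypothetical gate: children's data can only be read with a prior approval
// issued by the data protection officer or an authorised administrator.
type ApproverRole = "dpo" | "authorised-admin";

interface AccessApproval {
  approvedBy: string;
  role: ApproverRole;
  grantedAt: Date;
}

function readChildRecord(recordId: string, approval?: AccessApproval): string {
  if (!approval) {
    // Without an approval object, access is refused outright.
    throw new Error(`access to ${recordId} denied: approval by DPO or authorised administrator required`);
  }
  // In practice the access would also be logged here for accountability.
  return `record ${recordId}`; // placeholder for the actual data lookup
}
```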

If children’s personal data are processed by a third-party processor, the network operator is obliged to carry out a security assessment of the data processor commissioned to process the children’s personal data and to conclude an entrustment agreement with it. The data processor is obliged to support the network operator in fulfilling a guardian’s request to delete a child’s data after termination of the service. Sub-delegation or subcontracting by the data processor is prohibited.

If personal data of children are transferred to a third party, the network operator shall carry out a security assessment of the recipient or commission a third party to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children’s personal data under certain circumstances. In any case, they have the right to demand the correction of children’s personal data that a network operator has collected, stored, used or disclosed. In addition, legal guardians have the right to withdraw their consent in its entirety.

In the event of an actual or potential data breach, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the breach has or may have serious consequences, the network operator must immediately report it to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is difficult to notify each data subject individually, the network operator shall take appropriate and effective measures to make the notification public. The rules do not, however, contain a precise definition of “serious consequences”.

In the event that a data breach is caused or detected by a data processor, the data processor is obliged to inform the network operator in a timely manner.

USA and UK sign Cross-Border Data Access Agreement for Criminal Electronic Data

The United States and the United Kingdom have entered into a first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries’ law enforcement authorities to demand authorized access to electronic data relating to serious crime. Each country’s authorities are permitted to request electronic data directly from tech companies based in the other country, without legal barriers.

The bilateral Agreement is based on the U.S. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018. It aims to improve procedures for U.S. and foreign investigators to obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current process for access may take up to two years. The Data Access Agreement can reduce that time considerably by allowing more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.

The CLOUD Act focuses on updating legal frameworks to keep pace with evolving technology in electronic communications and service systems. It further enables the U.S. and other countries to enter into mutual executive Agreements in order to use their own legal authorities to access electronic evidence in the respective other country. An Agreement of this kind can only be signed by rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws provide robust substantive and procedural protections for privacy and civil liberties.

The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with data protection laws in both respective countries.

In addition to the Agreement with the United Kingdom, the United States and Australia announced on Monday that they are negotiating a similar Agreement. Negotiations have also been held between the U.S. and the European Commission, representing the European Union, with regard to a Data Access Agreement.

Category: General · UK · USA

Belgian DPA announces GDPR fine

7. October 2019

The Belgian data protection authority (Gegevensbeschermingsautoriteit) has recently imposed a fine of €10,000 for a violation of the General Data Protection Regulation (GDPR). The case concerns a Belgian shop that offered customers only one way to obtain a customer card: via the electronic identity card (eID). The eID is a national identification card containing various information about the cardholder, which is why the authority considers the use of this information without the customer’s valid consent to be disproportionate to the service offered.

The authority had learnt of the case following a complaint from a customer who was denied a customer card because he did not want to provide his electronic identity card. Instead, he had offered to provide his data to the shop in writing.

According to the Belgian data protection authority, this practice violates the GDPR in several respects. First, the principle of data minimisation is not respected. This principle requires the controller to limit both the amount of data processed and the duration of its storage to what is strictly necessary for the purpose pursued.

In order to create the customer card, the controller has access to all the data stored on the eID, including name, address, a photograph and the barcode associated with the national registration number. The Authority therefore believes that the use of all eID data is disproportionate to the creation of a customer card.

The DPA also considers that there is no valid consent to serve as a legal basis. According to the GDPR, consent must be freely given, specific and informed. However, consent is not freely given in this case, since no alternative is offered to the customer: a customer who refuses to use his electronic ID card does not receive a customer card and therefore cannot benefit from the shop’s discounts and advantages.

In view of these violations, the authority has imposed a fine of €10,000.

Category: Belgian DPA · Belgium · GDPR · General

CJEU rules pre-checked Cookie consent invalid

2. October 2019

The Court of Justice of the European Union (CJEU) ruled on Tuesday, 1 October, that storing Cookies on internet users’ devices requires active consent. The decision concerns the widespread practice of pre-checked consent boxes, which the Court found insufficient to fulfill the requirements for lawful consent under the General Data Protection Regulation (GDPR).

The case concerned a lottery for advertising purposes run by Planet49 GmbH. During the participation process, internet users were confronted with two information texts and corresponding checkboxes. In the first information text, users were asked to agree to be contacted by other companies for promotional offers by ticking the respective checkbox. The second information text required users to consent to the installation of Cookies on their devices, but the respective checkbox was already pre-checked. Users would therefore have needed to uncheck the box if they did not wish to give their consent (opt-out).

The German Federal Court of Justice referred questions to the CJEU on whether such a process of obtaining consent could be lawful under the relevant EU law, in particular whether valid consent could be obtained for the storage of information and Cookies on users’ devices through such mechanisms.

In its answer, the CJEU decided, referring to the relevant provisions of Directive 95/46 and the GDPR that require active behaviour by the user, that pre-ticked boxes cannot constitute valid consent. Furthermore, in a statement following the decision, the CJEU clarified that consent must be specific, and that users must be informed about the storage period of the Cookies as well as about third parties accessing their information. The Court also said that the “decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data.”

As a consequence of the decision, it is very likely that at least half of all websites within the scope of the GDPR will need to consider adjusting their Cookie banners and, where applicable, their procedures for obtaining consent for performance-related, marketing and advertising Cookies, in order to comply with the CJEU’s view of how Cookie usage is to be handled under current data protection law.

Cookies, in general, are small files which are sent to and stored in the browser of a terminal device when the user visits a website. In the case of performance-related, marketing and advertising Cookies, the website provider can access the information such Cookies have collected about the user on a later visit, in order to, e.g., facilitate navigation or transactions, or to collect information about user behaviour.
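
As an illustration of this mechanism, here is a minimal browser-side sketch using the standard document.cookie API; the cookie name “analytics_opt_in” is an assumption for the example.

```typescript
// Store a value in the browser that the site can read back on a later visit.
function setCookie(name: string, value: string, days: number): void {
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000).toUTCString();
  document.cookie = `${name}=${encodeURIComponent(value)}; expires=${expires}; path=/`;
}

// Read a previously stored cookie value, if any.
function getCookie(name: string): string | undefined {
  const entry = document.cookie.split("; ").find((c) => c.startsWith(`${name}=`));
  return entry ? decodeURIComponent(entry.split("=")[1]) : undefined;
}

setCookie("analytics_opt_in", "true", 180);
console.log(getCookie("analytics_opt_in")); // "true" on this and later visits
```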

Following the new CJEU decision, there are multiple ways to obtain users’ active consent in a GDPR-compliant manner. In any case, it is absolutely necessary to give users the possibility of actively checking the boxes themselves, which means that pre-ticked boxes are no longer an option.

With regard to the website controller’s obligation to provide users with specific information about the storage period and third-party access, one possible approach is to include a passage with Cookie information in the website’s Privacy Policy. Another is to place all the necessary information under a separate tab on the website containing a Cookie Policy. In either case, this information needs to be easily accessible to the user prior to giving consent, either by including it directly within the Cookie banner or by providing a link there, as in the sketch below.
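
A minimal sketch of such an opt-in banner follows, assuming plain DOM APIs and illustrative element IDs; it is not a complete consent-management implementation. The boxes start unchecked, the Cookie Policy is linked before consent is given, and only actively ticked categories are recorded.

```typescript
const categories = ["performance", "marketing"] as const;

function renderBanner(container: HTMLElement): void {
  // Checkboxes are rendered without the "checked" attribute: opt-in, not opt-out.
  container.innerHTML = `
    <p>We use Cookies. Details on storage periods and third-party access are
       set out in our <a href="/cookie-policy">Cookie Policy</a>.</p>
    ${categories
      .map((c) => `<label><input type="checkbox" id="consent-${c}"> ${c} Cookies</label>`)
      .join("")}
    <button id="consent-save">Save choices</button>`;

  container.querySelector("#consent-save")!.addEventListener("click", () => {
    // Only categories the user actively ticked are treated as consented.
    const granted = categories.filter(
      (c) => (container.querySelector(`#consent-${c}`) as HTMLInputElement).checked
    );
    console.log("consent recorded for:", granted); // persist this record in practice
  });
}
```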

As there are various options depending on the types of Cookies used, and in view of the clarifications made by the CJEU, it is recommended to review the Cookie activities on websites and the corresponding procedures for informing users about those activities and obtaining consent via the Cookie banner.

CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google will not have to apply the General Data Protection Regulation’s (GDPR) “Right to be Forgotten” to its search engines outside of the European Union. The ruling is a victory for Google in a case against a fine imposed by the French Commission nationale de l’informatique et des libertés (CNIL) in 2015 in an effort to force the company and other search engines to take down links globally.

As the internet has grown into a worldwide, borderless medium, this case is viewed as a test of whether people can demand a blanket removal of information about themselves from searches without encroaching on the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also expressed in the judgement that the protection of personal data is not an absolute right.

While companies are thus not forced to delete sensitive information from their search engines outside of the EU upon request, they must take precautions to seriously discourage internet users from going onto non-EU versions of their pages. Furthermore, companies operating search engines within the EU will have to continue to weigh freedom of speech against the protection of privacy, keeping the currently common case-by-case basis for deletion requests.

Since the Right to be Forgotten was first established by the CJEU in 2014, Google has received over 3.3 million deletion requests and has complied with the delisting of links from its search engine in 45% of the cases. As it stands, even where deletion requests are honoured, links delisted from EU search engines can still be accessed by using a VPN to reach non-EU search engines, thereby circumventing the geoblocking. A solution to this issue has not yet been found.

CNIL updates its FAQs for the case of a No-Deal Brexit

24. September 2019

The French data protection authority “CNIL” updated its existing catalogue of questions and answers (“FAQs”) to explain the impact of a no-deal Brexit and how controllers should prepare for the transfer of data from the EU to the UK.

As things stand, the United Kingdom will leave the European Union on 1 November 2019. The UK will then be considered a third country for the purposes of the European General Data Protection Regulation (“GDPR”). For this reason, data transfer mechanisms will become necessary after the exit in order to transfer personal data from the EU to the UK.

The FAQs recommend five steps that entities should take when transferring data to a controller or processor in the UK to ensure compliance with GDPR:

1. Identify processing activities that involve the transfer of personal data to the United Kingdom.
2. Determine the most appropriate transfer mechanism to implement for these processing activities.
3. Implement the chosen transfer mechanism so that it is applicable and effective as of 1 November 2019.
4. Update your internal documents to include transfers to the United Kingdom as of 1 November 2019.
5. If necessary, update relevant privacy notices to indicate the existence of transfers of data outside the EU and EEA where the United Kingdom is concerned.

CNIL also discusses the GDPR-compliant data transfer mechanisms (e.g., standard contractual clauses, binding corporate rules, codes of conduct) and points out that, whichever one is chosen, it must take effect on 1 November 2019. If controllers instead choose a derogation admissible under the GDPR, CNIL stresses that it must strictly comply with the requirements of Art. 49 GDPR.

Data Breach: Millions of patient data available on the Internet

20. September 2019

As reported by the US investigative journalism organisation ProPublica and the German broadcaster Bayerischer Rundfunk, millions of highly sensitive patient records were discovered freely accessible on the Internet.

Among the data sets are high-resolution X-ray images, breast cancer screenings, CT scans and other medical images. Most of them are accompanied by personal data such as birth dates, names and information about the patient’s doctor and medical treatment. The data had been accessible for years on unprotected servers.

In Germany, around 13,000 data records are affected, and more than 16 million worldwide, including more than 5 million patients in the USA.

When X-ray or MRI images of patients are taken, they are stored on “Picture Archiving and Communication System” (PACS) servers. If these servers are not sufficiently secured, it is easy to access the data. In 2016, Oleg Pianykh, Professor of Radiology at Harvard Medical School, published a study on unsecured PACS servers. He was able to locate more than 2,700 open systems, but the study did not prompt anyone in the industry to act.
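
To illustrate how such open systems can be located, here is a minimal sketch, assuming Node.js with its built-in net module, that checks whether a host accepts TCP connections on port 104, the standard DICOM port used by PACS servers. A completed handshake only shows that the port is open; actual studies additionally speak the DICOM protocol, and such probes should only ever be run against systems one is authorised to test.

```typescript
import * as net from "net";

// Resolve to true if the host completes a TCP handshake on the given port.
function isPortOpen(host: string, port = 104, timeoutMs = 3000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = net.connect({ host, port, timeout: timeoutMs });
    socket.once("connect", () => { socket.destroy(); resolve(true); });  // handshake succeeded
    socket.once("timeout", () => { socket.destroy(); resolve(false); }); // no answer in time
    socket.once("error", () => { socket.destroy(); resolve(false); });   // refused or unreachable
  });
}

// Hypothetical host name, for illustration only.
isPortOpen("pacs.example.org").then((open) => console.log(open ? "port 104 open" : "closed"));
```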

The German Federal Office for Information Security (BSI) has now informed authorities in 46 countries. It remains to be seen how they will react to the incident.

Ecuadorian Data Breach reveals Data of over 20 Million People

19. September 2019

On Monday, 16 September, it was revealed that detailed information on potentially every citizen of Ecuador had been freely available online as part of a massive data breach resulting from an incorrectly configured database. The leak, detected by security researchers of vpnMentor during a routine large-scale web mapping project, exposed more than 20 million individuals, including close to 7 million children, giving access to 18 GB of data.

Ecuador has close to 17 million citizens, which means that almost every citizen may have had some data compromised. Those affected also include government officials, high-profile persons such as Julian Assange, and the Ecuadorian President.

In its report, vpnMentor states that it was able to track the server back to its owner, an Ecuadorian company named Novaestrat, a consulting firm providing services in data analytics, strategic marketing and software development.

The report also mentions several examples of the entries found in the database, including the types of data that were leaked: full names, gender and birth information, home and e-mail addresses, telephone numbers, financial information, family members and employment information.

Access to the data has been cut off by the Ecuadorian Computer Emergency Response Team, but the highly private and sensitive nature of the leaked information could create long-lasting privacy issues for the citizens of the country.

In a Twitter post, Telecommunications Minister Andres Michelena announced that the data protection bill, which had been in the works for months, would be submitted to the National Assembly within 72 hours. In addition, an investigation has been opened into a possible violation of personal privacy by Novaestrat.

High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was deployed at certain events and in public places where the commission of crimes was considered likely. The system can detect up to 50 faces per second. The detected faces are then compared, by biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
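
The match-then-delete logic described by the court can be pictured with a minimal sketch. This is an illustration under assumed names (a face already turned into a numeric feature vector, a cosine-similarity threshold), not the actual AFR Locate implementation.

```typescript
type Vec = number[];

// Cosine similarity between two equally sized feature vectors.
function cosine(a: Vec, b: Vec): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: Vec) => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Returns true if the captured face matches any watchlist entry; the caller
// deletes the captured image immediately when this returns false, mirroring
// the automatic deletion described in the judgment.
function screenFace(face: Vec, watchlist: Vec[], threshold = 0.8): boolean {
  return watchlist.some((entry) => cosine(face, entry) >= threshold);
}
```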

An individual initiated judicial review proceedings after he, while not identified as a wanted person, was likely captured by AFR Locate. He considered this to be unlawful, in particular due to a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles. In particular, the approach would violate Sec. 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court stated that the use of AFR affected individuals’ rights under Article 8 of the ECHR and that this type of biometric data has a private character in itself. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is temporarily stored.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system is proportionate and that the technology was used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was only used for a limited period and a specific purpose, and its deployment was publicised in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considers that the captured images of individuals constitute personal data even where they do not correspond to the watchlists of persons sought, because the technology has singled those individuals out and distinguished them from others. Nevertheless, the Court held that there was no violation of data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The final requirement under Sec. 35 (5) of the DPA 2018 is that a suitable policy document be in place to govern the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate and stated that it would leave that judgment to the Information Commissioner’s Office (“ICO”), which is to publish more detailed guidelines.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of DPA 2018.

The ICO stated that it would take into account the High Court ruling when finalising its recommendations and guidelines for the use of live face recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

As initially reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Argent, the site’s developer, confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office spaces with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analysing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent has defended the technology, it has not yet explained what exactly the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images for a database used to carry out facial scans, though it claims not to have done so since March 2018.

Despite this explanation and repeated statements that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and for public safety. In her statement, Information Commissioner Elizabeth Denham said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and about future-proofing against the as yet unknown privacy issues they may cause.

Category: GDPR · General · UK