
High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was deployed at certain events and in public places where crimes were likely to be committed. The system can detect up to 50 faces per second; the captured faces are then compared, by means of biometric analysis, against watchlists of wanted persons held in police databases. If no match is found, the images are deleted immediately and automatically.
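For illustration only, here is a minimal sketch of how such a detect, compare and delete pipeline could look. The embedding size, similarity measure and match threshold are assumptions made for this sketch; the judgment does not disclose implementation details of AFR Locate.

```python
import numpy as np

# Assumed similarity threshold; the real system's value is not public.
MATCH_THRESHOLD = 0.8

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def process_frame(face_embeddings, watchlist):
    """Compare each detected face against the watchlist and keep matches only.

    Non-matching embeddings simply go out of scope when the function returns,
    mirroring the immediate, automatic deletion described in the ruling.
    """
    alerts = []
    for emb in face_embeddings:
        best_id, best_score = None, 0.0
        for person_id, ref in watchlist.items():
            score = cosine_similarity(emb, ref)
            if score > best_score:
                best_id, best_score = person_id, score
        if best_score >= MATCH_THRESHOLD:
            alerts.append(best_id)  # possible match: flag for human review
    return alerts

# Toy usage: random vectors stand in for real biometric templates.
rng = np.random.default_rng(0)
watchlist = {"suspect-1": rng.normal(size=128)}
frame = [rng.normal(size=128),
         watchlist["suspect-1"] + 0.01 * rng.normal(size=128)]
print(process_frame(frame, watchlist))  # ['suspect-1']
```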

An individual initiated judicial review proceedings after he was likely captured by AFR Locate without being identified as a wanted person. He considered this unlawful, in particular as a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police had not respected the data protection principles. In particular, the approach would violate Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court held that the use of AFR engaged rights under Article 8 of the ECHR, as biometric data of this kind is inherently private in character. Even though the images were erased immediately, the procedure constituted an interference with Article 8 ECHR, since the temporary storage of the data suffices.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found the use of the AFR system proportionate: the technology was used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was deployed only for a limited period and a specific purpose, and each deployment was announced in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considered that the captured images of individuals constitute personal data even where they do not match the watchlists, because the technology singles individuals out and distinguishes them from others. Nevertheless, the Court held that there was no violation of the data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The processing satisfied the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The final requirement under Sec. 35 (5) DPA 2018 is that an appropriate policy document is in place to govern the processing. The Court considered the relevant policy document in this case short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, leaving that assessment to the Information Commissioner’s Office (“ICO”), which intends to publish more detailed guidance.

Finally, the Court found that the data protection impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 DPA 2018.

The ICO stated that it would take the High Court ruling into account when finalising its recommendations and guidance on the use of live facial recognition systems.

Google strives to reconcile advertising and privacy

27. August 2019

While other browser developers are taking a critical stance on tracking, Google wants to introduce new standards that keep personalized advertising possible. With the implementation of the “Privacy Sandbox” and the introduction of a new identity management system, the developer of the Chrome browser wants to bring browsers to a uniform level in the processing of user data and to protect users’ privacy more effectively.

The proposals are the first steps of the privacy initiative Google announced in May. Google has published five ideas. For example, browsers are to manage a “Privacy Budget” that gives websites only limited access to user data, so that users can be sorted into an advertising target group without being personally identified. Google also plans to establish central identity service providers that offer limited access to user data via an application programming interface (API) and inform users about the information they have passed on.
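As a purely illustrative reading of the “Privacy Budget” idea, the following toy sketch lets each website origin consume only a fixed budget of identifying information before further requests are refused. Google has not published a specification, so the class, the budget size and the per-request costs below are invented:

```python
# Toy model: each origin may learn only a limited number of "identifying bits"
# before further requests are refused. All names and numbers are assumptions;
# Google's proposal is an early-stage concept without a published spec.

class PrivacyBudget:
    def __init__(self, max_bits: float = 10.0):
        self.max_bits = max_bits  # total identifying bits allowed per origin
        self.spent = {}           # bits already revealed, keyed by origin

    def request(self, origin: str, cost_bits: float) -> bool:
        """Grant access only while the origin's budget is not exhausted."""
        spent = self.spent.get(origin, 0.0)
        if spent + cost_bits > self.max_bits:
            return False          # deny, or serve only coarse-grained data
        self.spent[origin] = spent + cost_bits
        return True

budget = PrivacyBudget()
print(budget.request("ads.example", 2.0))  # True: within budget
print(budget.request("ads.example", 9.0))  # False: would exceed the budget
```

The design intent, as described, is that a site can learn enough to place a user in a coarse advertising segment but is cut off before the accumulated data would identify the user individually.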

Measures like Apple’s Intelligent Tracking Prevention are not in Google’s interest, as Google generates much of its revenue from personalized advertising. In a blog post, Google also argued that blocking cookies encourages less transparent techniques such as fingerprinting. Moreover, without the ability to display personalized advertising, the future of publishers, whose costs are covered by advertising, would be jeopardized. Recent studies have shown that publishers’ funding decreases by an average of 52% when advertising loses relevance due to the removal of cookies.

Based on these ideas, a discussion among developers about the future of web browsers and the handling of users’ privacy is now set to begin. Google’s long-term goal is a standardization process to which all major browser developers would adhere. So far, Google has had only limited success with similar initiatives.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority, the Agencia Española de Protección de Datos (AEPD), has imposed a fine of EUR 250,000 on the organisers of the two Spanish professional football leagues for data protection infringements.

The organisers, the Liga Nacional de Fútbol Profesional (LFP), operate an app called “La Liga”, which aims to uncover unlicensed screenings of matches broadcast on pay-TV. For this purpose, the app recorded samples of ambient sound during match times to detect live match transmissions and combined this with location data. Privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 (1) GDPR. Users must explicitly approve the use, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD is of the opinion that La Liga must warn users again before each instance of data processing via the microphone. In its resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember, each time they use the La Liga application, what they did and did not agree to.

Furthermore, the AEPD is of the opinion that La Liga has violated Art. 7 (3) GDPR, under which users must be able to withdraw their consent to the use of their personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary effort to understand how the technology works. According to La Liga, the technology is designed to produce only a particular acoustic fingerprint: this fingerprint contains only 0.75% of the information, while the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is then converted into an alphanumeric code (hash) from which the original sound cannot be reconstructed. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of 30 June.
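The mechanism La Liga describes (discarding almost all of the audio, then hashing the remainder) resembles standard acoustic fingerprinting. The following minimal sketch illustrates the principle; the window size, peak extraction and hash choice are assumptions for this sketch, not La Liga’s actual implementation:

```python
import hashlib
import numpy as np

def acoustic_fingerprint(samples: np.ndarray) -> str:
    """Reduce an audio snippet to an irreversible alphanumeric code.

    Keeps only the index of the strongest frequency band per short window
    (a tiny fraction of the original information, in the spirit of the
    0.75% figure cited by La Liga) and hashes the result, so the original
    sound, including any speech, cannot be reconstructed from the code.
    """
    window = 1024
    peaks = []
    for start in range(0, len(samples) - window, window):
        spectrum = np.abs(np.fft.rfft(samples[start:start + window]))
        peaks.append(int(np.argmax(spectrum)))  # dominant band only
    return hashlib.sha256(str(peaks).encode("utf-8")).hexdigest()

# Toy usage: one second of synthetic noise stands in for a microphone sample.
rng = np.random.default_rng(1)
audio = rng.normal(size=8000).astype(np.float32)
print(acoustic_fingerprint(audio))
```

Because only the dominant band indices survive the reduction and the hash is one-way, such a code can match a known broadcast without revealing speech content, which is the core of La Liga’s defence.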

Google Introduces Automatic Deletion for Web Tracking History

7. May 2019

Google has announced on its blog that it will introduce an auto-delete feature for web tracking history.

Until now, users have had the option to delete data from Google products such as YouTube or Maps manually. After numerous requests, however, Google is following other technology giants and revising its privacy settings. “We work to keep your data private and secure, and we’ve heard your feedback that we need to provide simple ways for you to manage or delete it,” Google writes on its blog.

Users will be able to choose a period for which the data should remain stored, from a minimum of 3 months to a maximum of 18 months. Data older than the selected period will then be deleted automatically on a rolling basis. This option will initially be introduced for Location History and Web & App Activity data and will be available over the next few weeks, according to Google.
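To make “rolling deletion” concrete, here is a minimal sketch of retention-based purging. The field names and the month approximation are assumptions for illustration; Google’s implementation is not public.

```python
from datetime import datetime, timedelta, timezone

RETENTION_MONTHS = 3  # user-selected period (between 3 and 18 months)

def purge_expired(records, now):
    """Drop every record older than the selected retention period."""
    cutoff = now - timedelta(days=30 * RETENTION_MONTHS)  # months approximated
    return [r for r in records if r["timestamp"] >= cutoff]

now = datetime.now(timezone.utc)
history = [
    {"query": "old search", "timestamp": now - timedelta(days=200)},
    {"query": "recent search", "timestamp": now - timedelta(days=10)},
]
print(purge_expired(history, now))  # only the 10-day-old entry survives
```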

Google’s announcement came the day after Microsoft unveiled a set of features designed to strengthen privacy controls for Microsoft 365 users and to simplify its privacy policies.

On the same day, during Facebook’s annual developer conference, F8, Mark Zuckerberg announced a privacy roadmap for the social network.

European Commission adopts adequacy decision on Japan

28. January 2019

The European Commission adopted an adequacy decision for Japan on 23 January 2019, enabling personal data to flow freely and safely between the two jurisdictions. The exchange of personal data is based on strong safeguards that Japan put in place ahead of the adequacy decision to ensure that data transfers comply with EU standards.

The additional safeguards include:

– A set of rules (the Supplementary Rules) bridging the differences between the two data protection systems. These strengthen the protection of sensitive data, the exercise of individual rights and the conditions under which EU data may be transferred onward to another third country. The Supplementary Rules are binding in particular on Japanese companies importing data from the EU and can be enforced by the independent Japanese data protection authority (PPC) as well as by the courts.

– Safeguards concerning access to personal data by Japanese authorities for law enforcement and national security purposes. In this regard, the Japanese government has given the Commission assurances that the use of personal data is limited to what is necessary and proportionate and is subject to independent supervision and redress mechanisms.

– A complaint handling mechanism to investigate and resolve complaints from Europeans regarding Japanese authorities’ access to their data. This new mechanism will be managed and monitored by Japan’s independent data protection authority.

The adequacy decision has been in force since 23 January 2019. The functioning of the framework will be reviewed for the first time after two years; subsequent reviews will take place at least every four years.

The adequacy decision also complements the EU-Japan Economic Partnership Agreement, which enters into force in February 2019. European companies will benefit from free data flows as well as privileged access to a market of 127 million Japanese consumers.


Turkey – Starting dates for data controllers’ registration obligation announced

3. September 2018

In its decision 2018/88, the Turkish data protection authority announced the starting dates from which data controllers must register on VERBIS prior to processing personal data. The online registration system VERBIS can be found on the homepage of the Turkish data protection authority.

The earliest starting date for the registration process is the 1st of October 2018.


The following registration periods have been announced:

a) 1st of October 2018 – 30th of September 2019, for data controllers that employ more than 50 employees or whose annual financial statement exceeds TRY 25 million

b) 1st of October 2018 – 30th of September 2019, for data controllers established outside of Turkey

c) 1st of January 2019 – 31st of March 2019, for data controllers that employ fewer than 50 employees and whose annual financial statement does not exceed TRY 25 million, but whose core business includes the processing of sensitive data

d) 1st of April – 30th of June, for public institutions and organizations that act as data controllers


Data controllers should take the necessary action and register with VERBIS during the applicable period.

Authorization by the French DPA to process personal data for litigation purposes

26. February 2016

In February 2016, the French DPA (CNIL) published a single authorization (AU-046) covering data processing activities by public bodies and private organizations for the purpose of managing and enforcing court actions.

The CNIL states that organizations may process certain categories of personal data in this context, such as data relating to criminal convictions, offences or security measures, in order to defend their interests in court. Art. 25 I 3° of the French Data Protection Act regulates the processing of these categories of personal data, for which prior authorization from the CNIL is required. The prevention of criminal offences also falls within the scope of this article. However, the article does not apply where the offences and criminal convictions are unrelated to the criminal sphere.

AU-046 aims to accelerate and simplify the process of obtaining the CNIL’s authorization for the processing of these categories of personal data. Its scope covers processing relating to offences, convictions and security measures in order to prepare, conduct and follow up disciplinary or judicial proceedings and, where necessary, to have the resulting decision enforced.

This authorization concerns all sectors and all types of litigation.
