High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was deployed at certain events and in public places where the commission of crimes was considered likely. The system can detect up to 50 faces per second. The detected faces are then compared, by means of biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
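As described, the system reduces to a match-or-delete rule: extract a biometric template from every detected face, compare it against the watchlist, keep only the matches and discard everything else on the spot. Below is a minimal Python sketch of that logic; the function names, data structures and similarity threshold are all hypothetical, since the internals of AFR Locate are not public.

```python
import math
from dataclasses import dataclass

@dataclass
class DetectedFace:
    frame_id: int
    embedding: list[float]  # biometric template extracted from the frame

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def process_frame(faces: list[DetectedFace],
                  watchlist: dict[str, list[float]],
                  threshold: float = 0.6) -> list[tuple[int, str]]:
    """Compare each detected face with the watchlist; keep matches only."""
    alerts = []
    for face in faces:
        best_id, best_score = None, 0.0
        for person_id, template in watchlist.items():
            score = cosine_similarity(face.embedding, template)
            if score > best_score:
                best_id, best_score = person_id, score
        if best_id is not None and best_score >= threshold:
            alerts.append((face.frame_id, best_id))
        # Non-matching faces are simply not retained: the image and its
        # template go out of scope here and are never stored.
    return alerts
```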

An individual initiated judicial review proceedings after he was not identified as a wanted person but was likely to have been captured by AFR Locate. He considered this to be unlawful, in particular as a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles. In particular, their approach violated Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court stated that the use of AFR affected a person’s rights under Article 8 of the ECHR and that this type of biometric data has a private character in itself. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since temporary storage of the data suffices.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system was proportionate and that the technology was used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was only used for a limited period, for a specific purpose, and its deployment was announced in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considered that the captured images of individuals constitute personal data, even if they do not match the watchlists of persons sought, because the technology has singled them out and distinguished them from others. Nevertheless, the Court held that there was no violation of data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that an appropriate policy document be in place to govern the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, leaving that judgment to the Information Commissioner’s Office (“ICO”), which is due to publish more detailed guidance.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of the DPA 2018.

The ICO stated that it would take the High Court ruling into account when finalising its recommendations and guidance on the use of live facial recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

As first reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Argent, the site’s developer, confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office space with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent defended the technology, it has not yet explained what the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached London Mayor Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. Unmatched footage is blurred out and deleted; in case of a match, the footage is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images to the system’s database for facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the repeated assurances that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham added that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and how to future-proof against the still unknown privacy issues they may cause.


Phone numbers of 420 million Facebook users in online database

5. September 2019

A database with more than 400 million phone numbers of Facebook users was publicly accessible online. Most of the records belong to American Facebook users (133 million), 50 million to users from Vietnam and 18 million to users from the UK. In each case the phone number was linked to the user’s Facebook ID, a long, unique and public number associated with the account.

As a result of the publicly accessible data, the affected users are put at risk of spam calls and SIM-swapping attacks. With a hijacked phone number, an attacker can intercept verification codes and change the account password, so that the user can no longer access his own Facebook profile.

IT expert Sanyam Jain found the database and contacted TechCrunch after he was unable to identify its owner. TechCrunch verified the authenticity of the data and then also tried to determine the owner, without success. It then contacted the web host, who took the database offline.

The database is not accessible at the moment, but it is still unknown how the data was collected and who uploaded it. It is possible that Facebook’s feature for finding friends by phone number, which Facebook disabled in April 2018, was misused to create the database. In connection with this incident, Facebook has announced that there is no evidence of a hacking attack.

Update: On Friday, 6 September 2019, a copy of the database appeared on the internet, so the data is currently publicly accessible again.

Portugal’s new data protection law

3. September 2019

Portugal’s new data protection law, the “Lei de Execução do Regulamento Geral sobre a Proteção de Dados”, was finally published and entered into force last month, following its approval in June. This makes Portugal one of the last EU member states to adapt its national law to the GDPR. The new law regulates, among other things, the following points:

Consent:

Persons aged 13 and over can give effective consent. In an employment relationship, an employee’s consent is considered a legitimate legal basis only if it leads to a legal or economic advantage for the employee or if it is necessary to fulfil a contract.

Data Protection Officer:

In addition to the tasks defined in the GDPR, a Data Protection Officer in Portugal must ensure that audits are carried out, make controllers aware of the importance of detecting data protection incidents early, and manage relations with data subjects in data protection matters.

Video surveillance:

The law stipulates that in some areas, such as bathrooms or changing rooms, video surveillance is prohibited. ATMs may also only be filmed in such a way that the customer’s keyboard and the associated PIN entry cannot be seen.

Retention periods:

If no retention period is specified, the period necessary to achieve the purpose of the processing is decisive. However, the right to be forgotten can only be exercised at the end of the retention period. In contrast to the GDPR, the Portuguese data protection law permits indefinite storage of certain data: this applies only to data on social security contributions relevant to retirement pensions, and only if suitable technical and organisational measures are taken.

Invitation to datenschutzticker.live on October 30th 2019 in Cologne

30. August 2019

The entry into force of the General Data Protection Regulation (GDPR) was a milestone in data protection law and attracted worldwide attention. In day-to-day business, questions of interpretation continue to shape the work of everyone responsible for data protection. For eight years, datenschutzticker.de, the blog of KINAST Attorneys at Law, has been reporting on practical questions regarding data protection. After approximately 2,000 blog posts and countless pieces of feedback from its readership, datenschutzticker.de is now going live.


We cordially invite you to this event!


datenschutzticker.live offers a platform for exchange between authorities and companies. We are pleased to welcome the Federal Commissioner for Data Protection and Freedom of Information, Prof. Ulrich Kelber, as well as the State Data Protection Commissioners for Hesse, Prof. Michael Ronellenfitsch, and for Saxony-Anhalt, Dr. Harald von Bose, as speakers at our event. Top-class speakers from the corporate side will also give talks on data protection issues from their corporate practice.

Register today for datenschutzticker.live. The event will be held in German and will take place all day on Wednesday, 30 October 2019, at the Wolkenburg in Cologne (city centre, near the main railway station). datenschutzticker.live is open to everyone; participation is free of charge and includes catering.

As capacity is limited, we ask you to register by email at veranstaltung@datenschutzticker.live, stating your name and, unless you are attending as a private individual, your organisation. We look forward to meeting you live!

Your team from
datenschutzticker.live

Greek Parliament passes bill to adopt GDPR into National Law

29. August 2019

On Monday, August 26th, the Greek Parliament passed a bill that incorporates the European Union’s General Data Protection Regulation (GDPR) into national law. Originally, the adaptation of national law to the EU rules was supposed to take place by May 6, 2018. Greece failed to meet the deadline.

The now fast-paced implementation of the regulation may have come as a result of the European Commission (EC) referring Greece and Spain to the European Court of Justice on July 25th. Since it had failed to adopt the required rules into national law by then, Greece could have faced a fine of €5,287.50 for every day passed since May 6, in addition to a stiff fine of €1.3 million. In its statement, the EC declared that “the lack of transposition by Spain and Greece creates a different level of protection of peoples’ rights and freedoms, and hampers data exchanges between Greece and Spain on one side and other Member States, who transposed the Directive, on the other side”.
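As a purely illustrative back-of-the-envelope calculation, assuming (as the wording above suggests) that the daily penalty had accrued from May 6, 2018 until the referral on July 25, 2019 — in reality such penalties only start running once the Court of Justice imposes them:

```python
from datetime import date

# Hypothetical worst case sketched in the text: daily penalty plus lump sum.
days = (date(2019, 7, 25) - date(2018, 5, 6)).days  # 445 days
total = days * 5_287.50 + 1_300_000
print(days, f"EUR {total:,.2f}")  # 445 days -> EUR 3,652,937.50
```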

The EU countries are allowed to adopt certain derogations, exceptions and specifications under the GDPR. Greece has done so in the approved bill, with adjusted provisions regarding the age of consent, the process of appointing a Data Protection Officer, sensitive data processing, data repurposing, data deletion, certifications and criminal sanctions.

The legislation was approved by an overwhelming majority, with votes from New Democracy, the main opposition SYRIZA, the centre-left Movement for Change and the leftist MeRA25. The GDPR itself has been in effect since May 25th, 2018; its main aim is to give individuals more control over the personal data they provide to companies and services.


Google strives to reconcile advertising and privacy

27. August 2019

While other browser developers take a critical stance on tracking, Google wants to introduce new standards that keep personalized advertising possible. With the implementation of the “Privacy Sandbox” and the introduction of a new identity management system, the developer of the Chrome browser wants to bring browsers to a uniform level in the processing of user data and protect users’ privacy more effectively.

The proposals are the first steps of the privacy initiative Google announced in May; Google has published five ideas. For example, browsers are to manage a “Privacy Budget” that gives websites only limited access to user data, so that users can be sorted into an advertising target group without being personally identified. Google also plans to establish central identity service providers that offer websites limited access to user data via an application programming interface (API) and inform users about the information they have passed on.
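Google’s published ideas are design documents rather than a shipped API, so any concrete interface is speculation. The hypothetical Python sketch below only illustrates the core of the budget idea: each origin may consume a fixed amount of identifying information per user, after which the browser stops answering with real data. All names and numbers are invented for illustration.

```python
class PrivacyBudget:
    """Per-user ledger of identifying bits revealed to each origin."""

    def __init__(self, limit_bits: float = 10.0):
        self.limit_bits = limit_bits       # total identifying bits allowed
        self.spent: dict[str, float] = {}  # bits already revealed per origin

    def request(self, origin: str, cost_bits: float) -> bool:
        """Grant access to a fingerprintable surface only while the origin
        stays under budget; on denial the browser would return noised or
        generic data instead of the real values."""
        used = self.spent.get(origin, 0.0)
        if used + cost_bits > self.limit_bits:
            return False
        self.spent[origin] = used + cost_bits
        return True

budget = PrivacyBudget()
print(budget.request("https://ads.example", 4.0))  # True: coarse device info
print(budget.request("https://ads.example", 8.0))  # False: would exceed budget
```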

Measures like Apple’s Intelligent Tracking Prevention are not in Google’s interest, as Google generates much of its revenue from personalized advertising. In a blog post, Google also argued that blocking cookies encourages non-transparent techniques such as fingerprinting. Moreover, without the ability to display personalized advertising, the future of publishers, whose costs are covered by advertising, would be jeopardized. Recent studies have shown that publishers’ financing decreases by an average of 52% if advertising loses relevance due to the removal of cookies.

Based on these ideas, the discussion among developers about the future of web browsers and how to handle users’ privacy is now meant to begin. Google’s long-term goal is a standardization process to which all major browser developers adhere. So far, Google has had only limited success with similar initiatives.

Swedish DPA imposes its first GDPR fine

23. August 2019

The Swedish Data Protection Authority “datainspektionen” has imposed its first fine since the General Data Protection Regulation (GDPR) entered into force.

Affected is a high school in Skellefteå in the north of Sweden, where 22 pupils took part in a pilot programme that monitored attendance using facial recognition.

In January 2019, the IT company Tieto announced that it was testing automatic registration of students’ attendance at the school using tags, smartphone apps and facial recognition software. In Sweden, teachers must report the attendance of all students in each lesson to the supervisors. According to Tieto, teachers at the school in Skellefteå spend around 18,000 hours a year on this registration. A class was therefore selected to test registration by facial recognition for eight weeks, and parents and students were asked to give their consent.

However, the Swedish data protection authority has now held that the way in which consent was obtained violates the GDPR because of the clear imbalance between controller and data subject. Additionally, the school failed to conduct an impact assessment and to seek prior consultation with datainspektionen.

The DPA therefore imposed a fine of SEK 200,000 (approximately EUR 20,000). In Sweden, public authorities can be fined up to SEK 10,000,000 (approximately EUR 1,000,000).

Millions of unencrypted biometric data discovered on the internet

19. August 2019

The Israeli security researchers Noam Rotem and Ran Locar discovered the unprotected and mostly unencrypted database of Biostar 2 during an Internet search.

Biostar 2 is a web-based biometric locking system that provides centralized control of access to secure facilities such as warehouses and office buildings. The researchers gained access to over 27.8 million records and 23 gigabytes of data, including fingerprint data, facial recognition data, facial photos of users, user names and passwords, and logs of access to facilities. Among others, the system is used by the British Metropolitan Police, insurance companies and banks.

Rotem told the Guardian: “The access allows first of all seeing millions of users are using this system to access different locations and see in real time which user enters which facility or which room in each facility, even.”
He also stated that they were able to change data and add new users, meaning they could have added their own photo and fingerprint to an existing user account to gain access to whatever buildings that user could enter, or created a new user with their own photo and fingerprints.

The impact of this data breach is particularly severe because Biostar 2 is used in 1.5 million locations around the world and because fingerprints, unlike passwords, cannot be changed.
Before Rotem and Locar turned to the Guardian, they made several attempts to contact Suprema, the security company responsible for Biostar 2. The vulnerability has since been closed.

To the Guardian, Suprema’s marketing director said they had conducted an “in-depth evaluation” of the information provided: “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets.”

Rotem said that such problems are not unique to Suprema and that he contacts three or four companies a week about similar issues.

Irish DPC releases guide on Data Breach Notifications

15. August 2019

On Monday, the Irish Data Protection Commission (IDPC) released a quick guide on Data Breach Notifications. It is intended to help controllers understand their notification and communication obligations, both towards the responsible DPC and towards the data subjects.

The guide, which is meant as a quick overview of the requirements and obligations that fall on data controllers, refers to the much more in-depth and detailed guidance of the Article 29 Working Party (now the European Data Protection Board, EDPB) in its guidelines on Data Breach Notifications.

In summary, the IDPC categorizes a Data Breach as a “security incident that negatively impacts the confidentiality, integrity or availability of personal data; meaning that the controller is unable to ensure compliance with the principles relating to the processing of personal data as outlined in Art. 5 GDPR”. In this case, the controller has two primary obligations: (1) to notify the responsible DPC of the data breach, unless it is unlikely to result in a risk for the data subjects, and (2) to communicate the data breach to the affected data subjects when it is likely to result in a high risk.
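Those two obligations reduce to a small decision rule. The following Python sketch is a simplified illustration of that rule only — the risk assessment itself is case-by-case, and none of this is an IDPC artefact or legal advice:

```python
def notification_duties(risk: str) -> tuple[bool, bool]:
    """Map an assessed risk level to (notify_dpc, inform_data_subjects)."""
    duties = {
        "unlikely": (False, False),  # document internally, no notification
        "risk":     (True,  False),  # notify the DPC within 72 hours
        "high":     (True,  True),   # notify the DPC and the data subjects
    }
    try:
        return duties[risk]
    except KeyError:
        raise ValueError("risk must be 'unlikely', 'risk' or 'high'")
```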

The IDPC seeks to help controllers by providing a list of requirements for notifications to the DPC and to data subjects, especially given the tight timeframe: notifications must be filed within 72 hours of becoming aware of the breach. It hopes to eliminate confusion arising in the process, as well as problems that companies have had with filing Data Breach Notifications in the past.
