Tag: Facial recognition software

Italian DPA imposes a 20 Million Euro Fine on Clearview AI

29. March 2022

The Italian data protection authority “Garante” has fined Clearview AI 20 million Euros for data protection violations regarding its facial recognition technology. Clearview AI’s facial recognition system draws on over 10 billion images from the internet, and the company prides itself on having the largest biometric image database in the world. The data protection authority found Clearview AI to be in breach of numerous GDPR requirements: the processing was neither fair nor lawful, there was no legal basis for the collection of the data, and the company lacked appropriate transparency and data retention policies.

Last November, the UK ICO warned of a potential 17 million pound fine against Clearview and, in this context, also ordered Clearview to stop processing data.

Then, in December, the French CNIL ordered Clearview to stop processing citizens’ data and gave it two months to delete all the data it had stored, but did not mention any explicit financial sanction.

In Italy, Clearview AI must now, in addition to paying the 20 million Euro fine, delete not only all images of Italian citizens from its database but also the biometric information needed to search for a specific face. Furthermore, the company must designate an EU representative as a point of contact for EU data subjects and the supervisory authority.

Data protection authorities around the world are taking action against the facial recognition software Clearview AI

25. February 2021

The business model of the US company Clearview AI is coming under increasing pressure worldwide. The company collected billions of facial photos from publicly available sources, especially from social networks such as Facebook, Instagram, YouTube and similar services. Data subjects were not informed of the collection and use of their facial photos. Using the photos, Clearview AI created a comprehensive database and used it to develop an automated facial recognition system. The system’s customers are primarily law enforcement agencies and other prosecuting authorities in the US, but companies can also make use of it. In total, Clearview AI has around 2000 customers worldwide and a database of around 3 billion images.

After a comprehensive investigation by the New York Times drew attention to the company in January 2020, opposition to this business practice is now also being voiced by the data protection authorities of various countries.

The Hamburg Data Protection Commissioner had already issued an order against Clearview AI in January 2021. According to the order, the company was to delete the biometric data of a Hamburg citizen who had complained to the authority about the storage. The decision reasoned that there was no legal basis for processing sensitive data and that the company was engaged in profiling by collecting photos over an extended period of time.

Now, several Canadian data protection authorities have also deemed Clearview AI’s actions illegal. In a statement, the Canadian Privacy Commissioner describes the activities as mass surveillance and an affront to the privacy rights of data subjects. The Canadian federal authority published a final report on its investigation into the Clearview AI case, in which the company was found to have violated several Canadian federal privacy laws.

Notably, the Canadian authorities consider the data collection unlawful even if Clearview AI were to obtain consent from the data subjects. They argue that the very purpose of the data processing is already unlawful. They demand that Clearview AI cease its service in Canada and delete the data already collected from Canadian citizens.

The pressure on Clearview AI is also growing because the companies from whose platforms the data was collected are opposing the practice as well. In addition, the association “noyb” around the data protection activist Max Schrems has taken up the Clearview AI case, and various European data protection authorities have announced that they will take action against the facial recognition system.

Facial recognition on the rise

4. August 2017

At Australian airports, new technology will be rolled out to help process passengers by means of facial recognition. Peter Dutton, Minister for Immigration and Border Protection, said that 105 smart gates will be provided for this purpose as part of an AU$22.5 million contract with Vision-Box Australia. Vision-Box has already implemented a facial recognition system at New York’s JFK airport.

The Australian government’s goal is to automate 90% of air traveller processing by 2020. After the implementation, passengers will no longer have to show their passports but will be processed by biometric recognition of their faces, irises and/or fingerprints.

Meanwhile, at Berlin’s Südkreuz station, testing of a facial recognition system has begun. The software can recognise known suspects and alert the police. Currently, the software is only scanning the faces of 250 volunteers. Thomas de Maizière, the German interior minister, aims to improve security in Germany after several terrorist attacks.

However, concerns over this technology have been raised by privacy activists as well as by well-respected lawyers. They fear that Germany could be heading towards a surveillance state. Moreover, they argue that there is no constitutional basis for the use of these methods.

Thomas de Maiziere aims to introduce facial recognition software at train stations and airports in Germany

22. August 2016

Thomas de Maiziere, Germany’s Interior Minister, aims to introduce facial recognition software at train stations and airports in order to support the identification of terror suspects. The suggestion was prompted by two Islamist attacks in Germany last month.

Referring to internet software that can determine whether individuals shown in photographs are celebrities or politicians, Thomas de Maiziere commented: “I would like to use this kind of facial recognition technology in video cameras at airports and train stations. Then, if a suspect appears and is recognized, it will show up in the system”. He went on to explain that such a system is already being tested for the identification of unattended luggage, so that the camera reports the respective luggage to an authority after a certain number of minutes.

However, although other countries are also testing similar technology, Germany has been sceptical and has shown caution towards the introduction of surveillance due to historical experience, such as the abuses by the Stasi secret police in East Germany and the Gestapo under the Nazis.