Tag: ICO

High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project, AFR Locate, was deployed at certain events and in public places where crimes were likely to be committed. The system can detect up to 50 faces per second; the detected faces are then compared, by biometric data analysis, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
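To make the described pipeline concrete, here is a minimal sketch in Python of a watchlist-matching loop that discards non-matches. Every name, the similarity measure and the threshold are assumptions for illustration; the actual AFR Locate implementation is not publicly documented.

```python
# Minimal sketch of the watchlist pipeline described in the judgment:
# detect faces, compare biometric templates against a watchlist, and
# discard non-matching images immediately. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Detection:
    image: bytes          # cropped face from a video frame
    template: tuple       # biometric feature vector extracted from the face

def similarity(a: tuple, b: tuple) -> float:
    """Placeholder biometric comparison (cosine similarity)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def process_frame(detections, watchlist: dict, threshold: float = 0.9) -> list:
    """Return the watchlist IDs of any matches in one frame."""
    alerts = []
    for det in detections:
        for person_id, ref_template in watchlist.items():
            if similarity(det.template, ref_template) >= threshold:
                alerts.append(person_id)   # flagged for human review
                break
        # No match: the detection goes out of scope without ever being
        # written to storage, mirroring the immediate, automatic deletion
        # the court relied on in its Article 8 analysis.
    return alerts
```

The deletion the court emphasised corresponds to the final branch: a detection that produces no match is never persisted anywhere.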

An individual initiated judicial review proceedings after he, although never identified as a wanted person, was likely to have been captured by AFR Locate. He considered this to be unlawful, in particular due to a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles. In particular, this approach would violate Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court held that the use of AFR affected individuals’ rights under Article 8 of the ECHR and that biometric data of this kind is private in itself. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is stored temporarily.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system was proportionate and that the technology was used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was only used for a limited period, for a specific purpose, and its deployment was publicised in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considered that the captured images of individuals constitute personal data even if they do not match the watchlists of persons sought, because the technology has singled them out and distinguished them from others. Nevertheless, the Court held that there was no violation of the data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that an appropriate policy document be in place to govern the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, stating that it would leave that judgment to the Information Commissioner’s Office (“ICO”), which would be publishing more detailed guidance.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of DPA 2018.

The ICO stated that it would take the High Court ruling into account when finalising its recommendations and guidance on the use of live facial recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

First reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. The site’s developer, Argent, confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office space for tenants such as Google.

The controversy surrounding the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent defended the technology, it has not yet explained what exactly the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. Unmatched footage is blurred out and deleted; in case of a match, the footage is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images to the system’s database for facial scans, though it claims not to have done so since March 2018.
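Based on Evans’ description, the footage-handling rule can be sketched as follows. This is an illustrative reading of the letter, not King’s Cross’s actual code, and every name in it is invented.

```python
# Hypothetical sketch of the handling rules described in Evans' letter:
# matches are shared only with law enforcement; unmatched faces are
# blurred and the original footage is not retained.

def blur_face(frame: bytes) -> bytes:
    """Placeholder for irreversible pixelation of the detected face."""
    return b"<redacted>"

def handle_detection(frame: bytes, matched: bool,
                     law_enforcement_queue: list) -> bytes:
    if matched:
        # Only matched footage leaves the system, and only to the police.
        law_enforcement_queue.append(frame)
        return frame
    # No match: return a blurred copy; the unblurred original is discarded.
    return blur_face(frame)
```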

Despite the explanation and the repeated statements that the software abides by the UK’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and about future-proofing against the as yet unknown privacy issues they may cause.

Category: GDPR · General · UK

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that people understand.

With the rise of the internet, political campaigning has become increasingly sophisticated and innovative: campaigns now use new technologies and techniques to understand and target their voters, drawing on social media, the electoral register, or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO states that its current guidance is outdated, since it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. However, the framework does not establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from a wide range of people and organisations. The framework is intended to take into account the responses the ICO received in that process.

Intended to serve as the basis of a statutory code of practice should the relevant legislation be introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL and the British data protection authority ICO have both revised and published their guidelines on cookies.

The two sets of guidelines are similar in several respects, but also differ in others.

Both France and the UK consider the rules that apply to cookies to apply equally to any technology that stores or accesses information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed: continuing to scroll a website cannot be considered consent, and obtaining consent through the T&Cs is likewise not lawful. The latter violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which a request for consent shall be presented in a manner which is clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
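To make these shared criteria concrete, the following sketch shows how a site might validate a stored consent record before setting a cookie. The record structure and all field names are hypothetical; the checks mirror the points both authorities agree on.

```python
# Illustrative validation of a stored cookie-consent record against the
# criteria CNIL and the ICO share. All field names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    method: str                       # how consent was captured
    purposes_accepted: set = field(default_factory=set)
    vendors_disclosed: set = field(default_factory=set)  # parties named to the user
    bundled_with_terms: bool = False  # consent buried in the T&Cs?

# Signals both authorities reject as a basis for valid consent.
INVALID_METHODS = {"scrolling", "browser_settings"}

def may_set_cookie(record: ConsentRecord, purpose: str, vendor: str) -> bool:
    """A cookie may be set only with specific, informed, unambiguous consent."""
    if record.method in INVALID_METHODS:
        return False                  # scrolling / browser defaults do not count
    if record.bundled_with_terms:
        return False                  # consent via T&Cs violates Art. 7 (2) GDPR
    if vendor not in record.vendors_disclosed:
        return False                  # every party placing cookies must be named
    return purpose in record.purposes_accepted    # consent must be specific
```

For example, a record with method="scrolling" fails regardless of what the user accepted, while a record from an explicit banner click passes only for the purposes and vendors that were actually disclosed.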

With regard to territorial scope, CNIL clarifies that its cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO guideline does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, is of the opinion that consent forced by means of a cookie wall is probably not valid, but that the GDPR must nevertheless be balanced against other rights; to that extent, the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary, namely where the cookies satisfy a list of cumulative requirements drawn up by CNIL. The ICO, by contrast, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies will have six months to comply with the rules. However, this period will only start to run upon the publication of a statement by CNIL, which is still pending; CNIL expects the statement to be finalised during the first quarter of 2020. The ICO does not provide for such a transition period.

EDPB publishes information note on data transfer in the event of a no-deal Brexit

25. February 2019

The European Data Protection Board has published an information note to explain data transfer to organisations and facilitate preparation in the event that no agreement is reached between the EEA and the UK. In case of a no-deal Brexit, the UK becomes a third country for which – as things stand at present – no adequacy decision exists.

EDPB recommends that organisations transferring data to the UK carry out the following five preparation steps:

• Identify what processing activities will imply a personal data transfer to the UK
• Determine the appropriate data transfer instrument for your situation
• Implement the chosen data transfer instrument to be ready for 30 March 2019
• Indicate in your internal documentation that transfers will be made to the UK
• Update your privacy notice accordingly to inform individuals

In addition, EDPB explains which instruments can be used to transfer data to the UK:
• Standard or ad hoc Data Protection Clauses approved by the European Commission can be used.
• Binding Corporate Rules for data processing can be defined.
• A code of conduct or certification mechanism can be established.

Derogations are possible in the cases mentioned in Article 49 GDPR. However, they are interpreted very restrictively and mainly relate to processing activities that are occasional and non-repetitive. Further explanations of the available derogations and how to apply them can be found in the EDPB Guidelines on Article 49 of the GDPR.

The French data protection authority CNIL has published an FAQ based on the information note of the EDPB, explaining the consequences of a no-deal Brexit for the data transfer to the UK and which preparations should be made.

ICO fines companies for not paying the data protection fee

4. December 2018

The UK’s Information Commissioner’s Office (ICO) has fined the first companies for not paying the data protection fee. Unless they are exempt, all organisations, companies and sole traders who process personal data have to pay an annual data protection fee.

Depending on an organisation’s maximum turnover, its number of employees and whether it is a charity or public authority, the fee varies from £40 to £2,900, while the fine for not paying ranges from £400 to £4,000. The fines recovered go to the Treasury’s Consolidated Fund. The regulations came into force together with the new Data Protection Act on 25 May 2018.
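As a rough illustration of how the tiers work, the sketch below maps an organisation’s attributes to its fee. The £60 middle tier and the exact thresholds are paraphrased from the Data Protection (Charges and Information) Regulations 2018 and should be checked against the ICO’s current guidance.

```python
# Illustrative tiering of the annual data protection fee. Thresholds are
# paraphrased from the 2018 charges regulations; verify current figures
# on the ICO's website before relying on them.

def data_protection_fee(max_turnover_gbp: int, staff: int,
                        charity: bool = False) -> int:
    if charity:
        return 40                                    # charities pay the tier 1 fee
    if max_turnover_gbp <= 632_000 or staff <= 10:
        return 40                                    # tier 1: micro organisations
    if max_turnover_gbp <= 36_000_000 or staff <= 250:
        return 60                                    # tier 2: SMEs
    return 2_900                                     # tier 3: large organisations

# e.g. data_protection_fee(50_000_000, 400) -> 2900
```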

“Following numerous attempts to collect the fees via our robust collection process, we are now left with no option but to issue fines to these organisations. They must now pay these fines within 28 days or risk further legal action. (…) You are breaking the law if you process personal data or are responsible for processing it and do not pay the data protection fee to the ICO”, said Paul Arnold, Deputy Chief Executive Officer at the ICO.

More than 900 fine notices have been issued by the ICO since September, and more are set to follow. Companies can check on the ICO’s website whether their fee is due for renewal.

Category: General · UK

ICO fines bank and ad firm for illegal marketing

13. October 2017

The Information Commissioner’s Office (ICO) has fined Vanquis Bank and advertising firm Xerpla £125,000 in total.

Vanquis Bank had sent over a million spam text messages and spam emails promoting its credit card. As the recipients had not consented to such messages, Vanquis Bank’s marketing campaign was deemed illegal and a fine of £75,000 was imposed on the Bradford-based bank.

Ad firm Xerpla had sent over a million spam emails promoting various products. It was fined £50,000 because the consent it had obtained from the recipients was not clear and specific enough.

“People need to be properly informed about what they are consenting to. Telling them their details could be passed to ‘similar organisations’ or ‘selected third parties’ cannot be relied upon as specific consent,” ICO Head of Enforcement Steve Eckersley said, adding, “these firms should have taken responsibility for ensuring they had obtained clear and specific consent for the sending of the messages. They didn’t and that is unacceptable.”

TalkTalk fined by ICO

11. August 2017

According to a press release from the Information Commissioner’s Office (“ICO”), the TalkTalk Telecom Group (“TalkTalk”) was fined for violating the UK Data Protection Act. More than 21,000 customers could have fallen victim to scams and fraud.

Following an investigation that began in 2014, the ICO fined TalkTalk 100,000 GBP for failing to protect customer data. The breach was possible because of inadequate security on a portal holding a huge amount of customer data. One company with access to the portal was Wipro, an IT services company in India; 40 Wipro employees had access to the personal data of between 25,000 and 50,000 customers. During the investigation, three accounts were found to have had unauthorized access to the portal. The ICO determined that TalkTalk did not ensure the security of the customer data held in this portal, for several reasons:

  • The portal was accessible from any device; there was no restriction on which devices could access it.
  • The portal’s search engine allowed wildcard searches (with * as a placeholder, returning many results at once).
  • The search engine returned up to 500 results per search.

The access rights were too wide-ranging given the large amount of customer data held in the portal. The ICO fined TalkTalk for breaching one of the principles of the UK Data Protection Act by not implementing sufficient technical and organizational measures.
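To illustrate the kind of controls the ICO found missing, here is a minimal sketch of a hardened portal search. It is not TalkTalk’s actual code; the device allow-list, the character filter and the result cap are invented examples addressing the three failures listed above.

```python
# Illustrative mitigations for the three failures the ICO identified:
# unrestricted device access, wildcard searches, and oversized result sets.
# All names and values are invented for illustration.

MAX_RESULTS = 20                  # far below the portal's 500-result limit
ALLOWED_DEVICE_IDS = {"managed-laptop-001", "managed-laptop-002"}
FORBIDDEN_CHARS = set("*%?")      # block wildcard-driven bulk harvesting

def search_customers(query: str, device_id: str, records: list) -> list:
    if device_id not in ALLOWED_DEVICE_IDS:
        raise PermissionError("portal access is restricted to managed devices")
    if set(query) & FORBIDDEN_CHARS or len(query) < 3:
        raise ValueError("wildcard or overly broad searches are rejected")
    hits = [r for r in records if query.lower() in r["name"].lower()]
    return hits[:MAX_RESULTS]     # cap result-set size to limit bulk export
```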

Category: Personal Data · UK
Tags: , , ,

ICO fines charities a total of 43,000 GBP

13. December 2016

The ICO has just released a statement saying that its investigations have shown that the Royal Society for the Prevention of Cruelty to Animals (RSPCA) and the British Heart Foundation (BHF) did not act in accordance with the Data Protection Act.

The statement explains that these charities had screened donors for wealth in order to increase their donations.

“The charities also traced and targeted new or lapsed donors by piecing together personal information obtained from other sources” is stated in the report. Furthermore, “they traded personal details with other charities creating a massive pool of donor data for sale. Donors were not informed of these practices, and so were unable to consent or object.”

Information Commissioner Elizabeth Denham fined both charities: the RSPCA 25,000 GBP and the BHF 18,000 GBP. She explained the fines as follows: “This widespread disregard for people’s privacy will be a concern to donors, but so will the thought that the contributions people have made to good causes could now be used to pay a regulator’s fine for their charity’s misuse of personal information”.

Category: Data breach · UK

ICO announces that Facebook agrees to suspend disclosures of WhatsApp users’ personal data

8. November 2016

After WhatsApp announced changes to its privacy policy in August, several EU DPAs announced monitoring activities to ensure the proper use of WhatsApp users’ data. One of these changes involved the disclosure of WhatsApp users’ personal data to Facebook in order to fight spam and improve both WhatsApp’s and Facebook’s services.

The EU DPAs had requested WhatsApp not to carry out such disclosures until an adequate level of data protection could be ensured.

On Monday, the ICO announced that Facebook has agreed to suspend these disclosures. The ICO had already remarked that consumers were not adequately protected and that in most cases no valid consent was in place. Moreover, it has requested both companies to undertake in writing to inform users about the purposes for which their data will be used. So far, neither company has signed such a commitment.

If enforcement action is taken, substantial fines may be imposed. This will become especially relevant once the GDPR applies from May 2018.

Other EU DPAs, such as Spain’s, will also contact Facebook regarding WhatsApp’s privacy policy.

Facebook, for its part, stated that WhatsApp only collects the data necessary to offer its services and that only a part of this data is shared with Facebook. A Facebook spokeswoman confirmed that WhatsApp’s update complies with applicable law, including UK law, and that they will continue their conversations with the ICO regarding the questions raised about the privacy policy.
