
High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project, AFR Locate, was deployed at certain events and in public places where crimes were likely to be committed. The system can detect up to 50 faces per second; the detected faces are then compared, by analysis of their biometric data, with wanted persons registered in police databases. If no match is found, the images are deleted immediately and automatically.
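
The judgment thus describes a simple match-or-delete pipeline. The following minimal Python sketch illustrates that logic; the function names, the embedding size and the similarity threshold are assumptions made for illustration only, not details of the actual system, whose internals are not public.

```python
import numpy as np

# Illustrative match-or-delete pipeline as described in the judgment.
# EMBEDDING_SIZE and MATCH_THRESHOLD are assumed values for this sketch.
EMBEDDING_SIZE = 128   # assumed size of a biometric face template
MATCH_THRESHOLD = 0.9  # assumed cosine-similarity cut-off

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def process_frame(face_embeddings, watchlist):
    """Compare each detected face against the watchlist.

    Matches are reported; anything unmatched is simply discarded when
    the function returns, mirroring the immediate, automatic deletion
    of unmatched images described in the judgment.
    """
    alerts = []
    for face in face_embeddings:
        for person_id, template in watchlist.items():
            if cosine_similarity(face, template) >= MATCH_THRESHOLD:
                alerts.append(person_id)
                break
        # No match: the embedding goes out of scope and is never stored.
    return alerts

# Tiny demo with random stand-in embeddings.
rng = np.random.default_rng(0)
watchlist = {"suspect-1": rng.normal(size=EMBEDDING_SIZE)}
faces = [
    watchlist["suspect-1"] + rng.normal(scale=0.01, size=EMBEDDING_SIZE),
    rng.normal(size=EMBEDDING_SIZE),  # an uninvolved passer-by
]
print(process_frame(faces, watchlist))  # expected: ['suspect-1']
```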

An individual initiated judicial review proceedings after he was not identified as a wanted person but had most likely been captured by AFR Locate. He considered this unlawful, in particular as a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police had not respected the data protection principles. In particular, their approach violated Section 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also argued that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court held that the use of AFR affected a person’s rights under Article 8 of the ECHR and that this type of biometric data is private in itself. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since temporary storage of the data suffices.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system was proportionate and that the technology was used openly, transparently and with considerable public engagement, thus fulfilling all existing criteria. It was deployed only for a limited period and for a specific purpose, and its use was announced in advance (e.g. on Facebook and Twitter).

With regard to data protection law, the Court held that the captured images of individuals constitute personal data even if they do not match the watchlists of persons sought, because the technology has singled those individuals out and distinguished them from others. Nevertheless, the Court found no violation of the data protection principles, for the same reasons for which it denied a violation of Art. 8 ECHR. The processing fulfilled the conditions of lawfulness and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing be strictly necessary was fulfilled, as was the requirement that the processing be necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that an appropriate policy document be in place to govern the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate, leaving that assessment to the Information Commissioner’s Office (“ICO”), which is expected to publish more detailed guidance.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of the DPA 2018.

The ICO stated that it would take the High Court ruling into account when finalising its recommendations and guidance on the use of live facial recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

As first reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Site developer Argent confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office space with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While developer Argent defended the technology, it has not yet explained what exactly the system is, how it is used, or how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached London Mayor Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images to the system’s database for facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the repeated assurances that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and for public safety. In her statement, Information Commissioner Elizabeth Denham said she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is done without people’s knowledge.

The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and about future-proofing against the as-yet-unknown privacy issues they may cause.


ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) plans to consult on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that people understand.

With the rise of the internet, political campaigning has become increasingly sophisticated and innovative. Political parties now use new technologies and techniques to understand and target their voters, drawing on social media, the electoral register, or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with such innovation: intended or not, it can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO acknowledges that its current guidance is outdated, since it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework does not, however, establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from a wide range of people and organisations. The framework is intended to take into account the responses the ICO received in that process.

Intended to serve as the basis for a statutory code of practice if the relevant legislation is introduced, the draft framework code of practice is now out for public consultation, which will remain open until October 4th.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL and the British data protection authority ICO have each revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be equally applicable to any technology that stores or accesses information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll a website cannot be considered consent. Likewise, obtaining consent through terms and conditions is not lawful: this practice violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
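
Since both sets of guidelines converge on prior, explicit and granular consent, the gating logic can be illustrated with a short sketch. The following Python example is a minimal illustration of that principle; the cookie names, categories and data structures are assumptions made for this sketch, not part of either guideline.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Only categories the user explicitly opted into; absence means "no".
    # Scrolling or accepting T&Cs must never add anything to this set.
    granted: set = field(default_factory=set)

COOKIES = [
    # (cookie name, category, party placing it) - illustrative values
    ("session_id", "strictly_necessary", "example-site.test"),
    ("_analytics", "analytics", "analytics-vendor.test"),
    ("_ads", "advertising", "ad-network.test"),
]

def cookies_allowed(consent: ConsentRecord):
    """Return (name, placing party) pairs for cookies that may be set.

    Strictly necessary cookies are generally exempt from the consent
    requirement; everything else needs a prior, explicit opt-in.
    Naming the placing party per cookie supports informed consent.
    """
    allowed = []
    for name, category, party in COOKIES:
        if category == "strictly_necessary" or category in consent.granted:
            allowed.append((name, party))
        # Any other cookie is simply never set.
    return allowed

# A user who opted into analytics only:
print(cookies_allowed(ConsentRecord(granted={"analytics"})))
# -> [('session_id', 'example-site.test'), ('_analytics', 'analytics-vendor.test')]
```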

With regard to territorial scope, CNIL clarifies that its cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO guidance does not address this point.

The French data protection authority considers cookie walls non-compliant with the GDPR because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by means of a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; to that extent, the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary: it is not required where the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, by contrast, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only start to run upon the publication of a statement by CNIL, which is still pending; CNIL expects this statement to be finalised during the first quarter of 2020. The ICO does not provide for such a grace period.

Record fine by ICO for British Airways data breach

11. July 2019

After a data breach in 2018 that affected 500,000 customers, British Airways (BA) has now been fined a record £183m by the UK’s Information Commissioner’s Office (ICO). According to the BBC, Alex Cruz, chairman and CEO of British Airways, said he was “surprised and disappointed” by the ICO’s initial findings.

The breach resulted from a hacking attack that managed to place a malicious script on the BA website. Unsuspecting users trying to access the BA website were diverted to a fraudulent site, which collected their information, including e-mail addresses, names and credit card details. While BA stated that it would reimburse every customer affected, its owner IAG declared through its chief executive that it would take “all appropriate steps to defend the airline’s position”.

The ICO said it was the biggest penalty it had ever handed out and the first to be made public under the new rules of the GDPR. “When an organization fails to protect personal data from loss, damage or theft, it is more than an inconvenience,” Information Commissioner Elizabeth Denham told the press.

In fact, the GDPR allows companies to be fined up to 4% of their annual worldwide turnover for data protection infringements. The £183m fine imposed on British Airways equals roughly 1.5% of its worldwide turnover for 2017, well below that 4% maximum.
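
Those two percentages allow the underlying figures to be checked with simple arithmetic; here is a quick back-of-the-envelope calculation, using only the reported fine and the reported 1.5% ratio and ignoring currency and rounding effects:

```python
# Back-calculating BA's implied turnover and the theoretical GDPR cap
# from the two reported figures above.
fine = 183_000_000   # GBP, the reported ICO penalty
ratio = 0.015        # fine as a share of 2017 worldwide turnover

implied_turnover = fine / ratio            # ~ GBP 12.2 billion
theoretical_max = 0.04 * implied_turnover  # ~ GBP 488 million

print(f"Implied 2017 turnover:  £{implied_turnover / 1e9:.1f}bn")
print(f"Theoretical 4% maximum: £{theoretical_max / 1e6:.0f}m")
```

On those assumptions, the £183m fine sits well below a theoretical maximum of roughly £488m.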

BA can still make representations on the findings and the scale of the fine before the ICO takes its final decision.

Royal family uses GDPR to protect their privacy

22. May 2019

Last week Prince Harry and Meghan Markle claimed another victory in the royal family’s never-ending struggle with paparazzi photographers, securing “a substantial sum” in damages from an agency that had released to the media intimate photos of the Oxfordshire home rented by the Duke and Duchess of Sussex. In a statement, Splash News apologized and acknowledged that the situation represented “an error of judgement”.

The paparazzi agency Splash News took photos and footage of the couple’s former Cotswolds home (including their living room, dining area and bedroom) using a helicopter, and promptly sold the material to different news outlets. Prince Harry’s lawyers argued that this constituted a breach of his right to privacy under Art. 8 ECHR as well as a breach of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).

Considering the strategy of the Duke’s lawyers, it looks like the royal family has found a potentially attractive alternative to claims of defamation or invasion of privacy: in contrast to such claims, a claimant relying on data protection law needs to prove neither that a statement is defamatory and meets the threshold of serious harm to reputation, nor that the information in question is private.

However, the (new) European data protection legislation grants all data subjects, regardless of their position or fame, a right to respect for their private and family lives and to protection of their personal data. In particular, Article 5 of the GDPR requires organisations to handle personal data (such as names, pictures and stories relating to individuals) fairly and in a transparent manner, and to use it only for legitimate purposes.

Moreover, an organisation using pictures and footage of an individual’s private or even intimate sphere needs a specific legal basis, such as a contract, the individual’s consent, or an argument that the use was “in the public interest” or served a “legitimate interest”. As contract and consent can be ruled out here, the only bases that might be considered are a public interest or a legitimate interest of the organisation itself. Taking into account the means by which these photos and footage of the Duke and Duchess were obtained, neither of these interests can outweigh the interest in protecting the rights and freedoms of individuals’ private and intimate spheres.

Referring to this case, it seems quite likely that the European data protection regime has changed the way celebrities and the courts approach the heavily contested question of whether the public is entitled to see and be informed about certain parts and aspects of famous people’s lives.

Morrisons is Allowed to Appeal Data Protection Class Action

29. April 2019

The British supermarket chain Wm Morrison Supermarkets PLC (“Morrisons”) has been granted permission by the Supreme Court to appeal the data protection class action brought against it and to challenge the judgment on all grounds. The case is important as it is the first class action filed in the UK over a data breach, and its outcome may affect the number of class actions for data breaches.

An employee who worked as a senior IT auditor for Morrisons copied the payroll data of almost 100,000 employees onto a USB stick and published it on a file-sharing website. He then reported the breach anonymously to three newspapers. The employee himself was sentenced to eight years in prison for various offences.

5,518 employees filed a class action lawsuit against Morrisons over the breach, claiming that the company was both primarily and vicariously liable. The High Court dismissed all primary liability claims under the Data Protection Act (“DPA”), as it concluded that the employee had acted independently of Morrisons in violating the DPA.

However, the court found that Morrisons was vicariously liable for its employee’s actions, although the DPA does not explicitly provide for vicarious liability. The company appealed the decision.

The Court of Appeal dismissed the appeal and upheld the High Court’s ruling that the company was vicariously liable for its employee’s data breach, even though Morrisons itself was cleared of any misconduct.

In the forthcoming appeal, the Supreme Court will have to examine, among other things, whether vicarious liability exists under the DPA and whether the Court of Appeal’s conclusion that the employee disclosed the data in the course of his employment was incorrect.

Brexit: Deal or “No-deal”

12. March 2019

Yesterday evening, shortly before the UK parliament’s vote on the circumstances and, if necessary, a postponement of Brexit, Theresa May met Jean-Claude Juncker once again in Strasbourg. Both sides agreed on “clarifications and legal guarantees” regarding the fall-back solution for Northern Ireland.

These (slightly) expand the United Kingdom’s (UK) opportunity to appeal to an arbitration court in the event that the EU should “hold the UK hostage” in the customs union by means of the backstop clause beyond 2020. This “legally binding instrument”, as Juncker called it, is intended to clarify that the backstop clause on the Irish border is not to be regarded as a permanent solution. This is also to be confirmed in a joint political declaration on the future relations between the two sides. However, the wording of the supplementary provision remains legally vague.

May is nevertheless confident that the British Parliament will approve the “new” agreement in tonight’s vote. Meanwhile, Labour Party leader Jeremy Corbyn has urged MPs to vote against the agreement. In any case, Juncker has already ruled out further negotiations on the current version of the withdrawal agreement, emphasizing that there will be no “third chance”. By 23rd May, when the EU elections begin, the UK is expected to have left the EU.

The votes on the “how” and “when” of Brexit will take place over the next few days, starting tonight at 8 p.m. CET. If the withdrawal agreement is rejected again today, parliament will vote tomorrow on a no-deal Brexit (the UK would then become a third country within the meaning of the GDPR as of 30th March). If that is also rejected, parliament will vote on 14th March on a postponement of the Brexit date. A postponement could then lead to a new referendum and thus to a renewed decision on whether Brexit will actually take place at all.


EDPB publishes information note on data transfer in the event of a no-deal Brexit

25. February 2019

The European Data Protection Board (EDPB) has published an information note to explain data transfers to organisations and to facilitate their preparation in the event that no agreement is reached between the EEA and the UK. In the case of a no-deal Brexit, the UK will become a third country for which, as things stand at present, no adequacy decision exists.

The EDPB recommends that organisations transferring data to the UK carry out the following five preparation steps:

• Identify what processing activities will imply a personal data transfer to the UK
• Determine the appropriate data transfer instrument for your situation
• Implement the chosen data transfer instrument to be ready for 30 March 2019
• Indicate in your internal documentation that transfers will be made to the UK
• Update your privacy notice accordingly to inform individuals

In addition, the EDPB explains which instruments can be used to transfer data to the UK:
– Standard or ad hoc Data Protection Clauses approved by the European Commission can be used.
– Binding Corporate Rules for data processing can be defined.
– A code of conduct or certification mechanism can be established.

Derogations are possible in the cases mentioned in Article 49 GDPR. However, they are interpreted very restrictively and mainly relate to processing activities that are occasional and non-repetitive. Further explanations of the available derogations and how to apply them can be found in the EDPB Guidelines on Article 49 of the GDPR.

The French data protection authority CNIL has published an FAQ based on the EDPB’s information note, explaining the consequences of a no-deal Brexit for data transfers to the UK and the preparations that should be made.

The European Data Protection Board presents Work Program for 2019/2020

14. February 2019

On February 12, 2019, the European Data Protection Board (EDPB) released on its website a document containing its two-year work program.

The EDPB is an independent European body established by the General Data Protection Regulation (GDPR). The board is made up of representatives of the national data protection supervisory authorities of the EU and the EEA EFTA states, as well as of the European Data Protection Supervisor (EDPS).

The EDPB’s tasks are to issue guidelines on the interpretation of key concepts of the GDPR and to rule by binding decision on disputes regarding cross-border processing activities. Its objective is to ensure the consistent application of EU rules in order to avoid the same case being dealt with differently across jurisdictions. It also promotes cooperation between the EU and EEA EFTA data protection supervisory authorities.

The EDPB work program is based on the needs its members have identified as priorities for individuals and stakeholders, as well as on the activities planned by the EU legislator. It lists guidelines, consistency opinions, other types of activities, recurrent activities and possible topics.

Furthermore, the EDPB released an information note about data transfers in the event of a no-deal Brexit. As discussed earlier, the UK would in that case become a so-called “third country” for EU member states as of March 30. According to the UK Government, transfers of data from the UK to the EEA will remain unaffected, permitting personal data to continue to flow freely.
