Tag: ICO

ICO plans to update guidance on anonymisation and pseudonymisation

31. March 2021

The ICO is planning to update its anonymisation and pseudonymisation guidance, as announced in a blog post by Ali Shah, the ICO’s Head of Technology Policy, on March 19th, 2021. He emphasizes the important role of sharing personal data in a digital economy, citing the healthcare and financial sectors as examples: in healthcare, data sharing could improve patient care, and in the financial sector, it could help prevent money laundering and protect individuals from fraud.

Last year, the ICO published its Data Sharing Code of Practice. The intention of the Data Sharing Code, according to Elizabeth Denham CBE, Information Commissioner, is “to give individuals, businesses and organisations the confidence to share data in a fair, safe and transparent way (…)”. Shah calls the Data Sharing Code a milestone, not a conclusion, stating that the ICO’s ongoing work shall lead to more clarity and advice with regard to lawful data sharing.

He names several key topics that the ICO is going to explore as it updates the anonymisation and pseudonymisation guidance, among them the following:

  • “Anonymisation and the legal framework – legal, policy and governance issues around the application of anonymisation in the context of data protection law”
  • “Guidance on pseudonymisation techniques and best practices”
  • “Accountability and governance requirements in the context of anonymisation and pseudonymisation, including data protection by design and DPIAs”
  • “Guidance on privacy enhancing technologies (PETs) and their role in safe data sharing”
  • “Technological solutions – exploring possible options and best practices for implementation”

It is welcome that apparently not only the legal side will be explored, but that technical aspects will also play a role: designing and implementing systems with privacy enhancing technologies (PETs) and data protection by design in mind can contribute to compliance with data protection laws already at the technical level, and therefore at an early stage of processing.
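To make the technical angle concrete: one technique the guidance will cover is pseudonymisation, which Art. 4(5) GDPR defines as processing personal data so that it can no longer be attributed to a person without separately held additional information. A minimal, illustrative sketch of a keyed-hash approach follows; the function name and key handling are our own assumptions, not taken from the ICO’s guidance.

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The mapping can only be reproduced by whoever holds the key, which,
    in the spirit of Art. 4(5) GDPR, must be stored separately from the
    pseudonymised data.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key is the "additional information" that must be kept separately.
key = b"example-key-kept-in-a-separate-system"
pseudonym = pseudonymise("jane.doe@example.com", key)  # 64 hex characters
```

Unlike a plain hash, the keyed variant prevents anyone without the key from re-identifying individuals by hashing candidate identifiers themselves.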

The ICO plans to publish each chapter of the guidance and to ask industry, academia and other key stakeholders for their points of view, encouraging them to give insights and feedback so that the ICO gains a better understanding of where the guidance can be targeted most effectively.

EU-UK Trade Deal in light of Data Protection

4. January 2021

Almost fit to be called a Christmas miracle, the European Union (EU) and the United Kingdom (UK) came to an agreement on December 24th, 2020. The Trade Agreement, called in full “EU-UK Trade and Cooperation Agreement”, sets out the new rules applying from January 1st, 2021, the end of the Brexit transition period.

President of the European Commission, Ursula von der Leyen, claimed it was a deal worth fighting for, “because we now have a fair and balanced agreement with the UK, which will protect our European interests, ensure fair competition, and provide much needed predictability for our fishing communities. Finally, we can leave Brexit behind us and look to the future. Europe is now moving on.”

In light of data protection, however, the new Trade Deal has not given much certainty about what is to come next.

Both sides are aware that an adequacy decision by the EU Commission is very important with regard to data protection and cross-border data flows. Accordingly, the EU has agreed to allow a period of four months, extendable by a further two months, during which data can be transferred between EU Member States and the UK without additional safeguards. This period was granted to give the Commission enough time to make an adequacy decision. Accordingly, data transfers can continue as before until possibly mid-2021. However, this arrangement is only valid if the UK does not change its data protection laws in the meantime.

With regard to direct marketing, the situation has not changed either: individuals must give active consent unless there was a prior contractual relationship and the advertising relates to products similar to those of the prior contract. Furthermore, the advertising must be clearly recognisable as such, and the possibility of revoking consent must be offered in every marketing e-mail.

However, much else has yet to be clarified. Questions such as the competence of the UK Data Protection Authority, the Information Commissioner’s Office (ICO), as well as the fate of its ongoing investigations, have not yet been answered. As things stand, companies that had their EU headquarters in the UK will have to designate a new Lead Supervisory Authority (Art. 56 GDPR) for their business in the EU.

The coming months will show whether questions of high relevance to businesses’ day-to-day practice can be answered reassuringly.

ICO fines Marriott International

9. November 2020

The Information Commissioner’s Office (ICO) has fined Marriott International Inc. (Marriott) £18.400.000 (€20.397.504).

The fine refers to a data breach which occurred in 2018. Back then, the world’s largest hotel company, based in the USA, suffered a massive data breach affecting up to 383 million customers. Marriott is still unable to state the exact number of people affected.

The ICO considers it proven that Marriott failed to keep customers’ personal data secure. In the context of the breach, confidential data such as names, addresses and contact details, as well as unencrypted passport and credit card data, was accessed without authorization.

In a previous statement in 2019, the ICO announced that it intended to fine Marriott £99.200.396 (€109.969.591); this fine has now been reduced.

The reduction is based on the following reasons: the ICO considered the representations from Marriott, the steps Marriott has since taken, and the consequences of the COVID-19 pandemic.

In October, the fine previously issued by the ICO against British Airways was also reduced, again partly because of the consequences of the COVID-19 pandemic.

Since the data breach occurred before the UK left the EU, the ICO investigated on behalf of all European Data Protection Authorities as Lead Supervisory Authority, and the fine has been approved by all other authorities.

Experian to appeal ICO’s decision regarding handling of personal data

29. October 2020

On October 27th, 2020 the Information Commissioner’s Office (ICO) issued an enforcement notice against the credit reference agency Experian Limited, ordering it to make fundamental changes to how it handles personal data related to its direct marketing services in the United Kingdom.

An ICO investigation found that significant ‘invisible’ processing took place at the three largest credit reference agencies (CRAs) in the UK, likely affecting millions of adults. Experian, Equifax and TransUnion were ‘trading, enriching and enhancing’ people’s personal data without their knowledge in order to provide direct marketing services. The data was used by commercial organisations, by political parties for political campaigning, and by charities for their fundraising campaigns. Some of the CRAs were also using profiling to generate new or previously unknown information about people.

While Equifax and TransUnion made adequate improvements to their marketing practices, the ICO found Experian’s efforts to be insufficient and its processing of personal data to remain non-compliant with data protection law. As a result, Experian has been given an enforcement notice compelling it to make changes within nine months or face financial penalties under the GDPR.

Experian is going to appeal the ICO’s decision regarding the notice over data protection failures. In a statement, Chief Executive Officer Brian Cassin said:

We disagree with the ICO’s decision today and we intend to appeal. At heart this is about the interpretation of GDPR and we believe the ICO’s view goes beyond the legal requirements. This interpretation also risks damaging the services that help consumers, thousands of small businesses and charities, particularly as they try to recover from the COVID-19 crisis.

We share the ICO’s goals on the need to provide transparency, maintain privacy and ensure consumers are in control of their data. The Experian Consumer Information Portal makes it very easy for consumers to fully understand the ways we work with data and to opt out of having their data processed if they wish.


British Airways: Fine reduced

20. October 2020

In 2018, British Airways (BA) had to announce that it had suffered a massive data breach. The breach concerned the online booking tool: login data, credit card data, travel data and address data were accessed illegally. More than 400.000 customers were affected.

Back in 2019, the UK’s Information Commissioner’s Office (ICO) evaluated the breach and stated that weak security precautions had enabled the hackers to access the data. As a consequence of the breach, the ICO fined BA a record £183.000.000 (€205.000.000).

BA appealed against the fine and now – in 2020 – the ICO announced a reduced fine.

On October 16th, 2020, the ICO announced the final sanction for BA. The initial fine of £183.000.000 (€205.000.000) has been reduced to a total of £20.000.000 (€22.000.000). One reason for the reduction is, inter alia, the current COVID-19 situation and its consequences for the aviation industry.

The notification from the authority states in this context:

As part of the regulatory process the ICO considered both representations from BA and the economic impact of COVID-19 on their business before setting a final penalty.

ICO passed Children’s Code

8. September 2020

The UK Information Commissioner’s Office (ICO) passed the Age Appropriate Design Code, also called the Children’s Code, which applies in particular to social media and online services likely to be used by minors under the age of 18 in the UK.

The Children’s Code contains 15 standards for designers of online services and products. The aim is to ensure a minimum level of data protection. The Code therefore requires that apps, games, websites etc. are designed in a way that provides a baseline of data protection from the outset. The following default settings are worth mentioning here:

  • Geolocation disabled by default,
  • Profiling disabled by default,
  • Newly created profiles private and not public by default.
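The privacy-by-default idea behind these standards can be pictured as the settings a service ships with before the user changes anything. The following sketch is purely hypothetical and not taken from the Code itself; all names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class ProfileSettings:
    """Hypothetical defaults reflecting the Code's privacy-by-default standards."""
    geolocation_enabled: bool = False  # geolocation off by default
    profiling_enabled: bool = False    # profiling off by default
    profile_public: bool = False       # new profiles private, not public

# A newly created account starts with the most protective settings;
# anything less protective would require an active choice by the user.
new_user = ProfileSettings()
```

The point of the pattern is that the protective value is the one requiring no action, so a child who never opens the settings screen still gets the baseline protection.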

The basis for the Children’s Code is the UK Data Protection Act 2018, the local implementation law of the GDPR. Thus, the standards also include the GDPR data protection principles of transparency and data minimisation.

The requirements also and especially apply to the major social media and online services used by minors in the UK, e.g. TikTok, Instagram and Facebook.

The Code is designed to be risk-based, meaning that not all organizations have to fulfil the same obligations. The more a company uses, analyses and profiles minors’ data, the more it must do to comply with the Code.

easyJet Data Breach: 9 million customers affected

22. May 2020

The British airline ‘easyJet’ has been hacked. The hackers have been able to access personal data of approximately 9 million customers.

easyJet published a statement on the attack and announced that e-mail addresses and travel details were among the affected personal data of customers. Which data in detail belongs to ‘travel data’ was not disclosed. In some cases, the hackers could also access credit card data. easyJet stated that there is no proof that the accessed personal data has been abused. easyJet now warns about fake mails in its name as well as in the name of ‘easyJet Holidays’.

The hack was noticed by easyJet in January but was only made public this week. Upon becoming aware of the attack, easyJet took several measures and has since blocked the unauthorized access. easyJet is also in contact with the British Data Protection Authority, the ICO, and the National Cyber Security Centre.

At this time, easyJet has not yet been able to evaluate how the attack occurred, but it explained that this was not a ‘general’ hacker attack, since it was very sophisticated compared to other attacks. It is suspected that the attack originated from a group that has already hacked other airlines, such as British Airways in 2018.

easyJet announced that it will contact the affected data subjects by May 26th to inform them about the breach and to explain further measures that should be taken in order to reduce the risk. easyJet customers who have not received a notification by then are not affected by the breach.

In connection with hacks like these, the risk of phishing attacks is particularly high. In phishing attacks, criminals use fake e-mails, for example on behalf of well-known companies or authorities, to try to persuade users to disclose personal data or to click on prepared e-mail attachments containing malware.

UK: Betting companies had access to data of millions of children

28. January 2020

In the UK, betting companies gained access to data on 28 million children under 14 and adolescents. The data was stored in a government database and could be used for learning purposes. Access to the platform is granted by the government. A company that was given access is said to have illegally passed it on to another company, which in turn allowed the betting companies access. The betting providers used the access, among other things, to check age information online. The company accused of passing on the access denies the allegations but has not yet made any more specific statements.

The British Department for Education speaks of an unacceptable situation. All access points have been closed and the cooperation has been terminated.


High Court dismisses challenge regarding Automated Facial Recognition

12. September 2019

On 4 September, the High Court of England and Wales dismissed a challenge to the police’s use of Automated Facial Recognition Technology (“AFR”). The court ruled that the use of AFR was proportionate and necessary to meet the legal obligations of the police.

The pilot project AFR Locate was used for certain events and public places when the commission of crimes was likely. Up to 50 faces per second can be detected. The faces are then compared by biometric data analysis with wanted persons registered in police databases. If no match is found the images are deleted immediately and automatically.
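The flow described above, detecting faces, comparing them against a watchlist and immediately discarding non-matches, can be sketched in simplified form. All names here are illustrative assumptions; this is not the actual AFR Locate implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FaceTemplate:
    """Stand-in for a biometric template extracted from a camera frame."""
    features: tuple

def process_frame(detected, watchlist):
    """Compare detected faces against the watchlist; keep only matches.

    Non-matching templates are simply not retained past this function,
    mirroring the immediate and automatic deletion described by the court.
    """
    return [face for face in detected if face in watchlist]
```

In a real system the comparison would be a similarity score over biometric features rather than exact set membership, but the retention logic, keep matches for review and discard everything else, is the point at issue in the judgment.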

An individual initiated judicial review proceedings after he was likely captured by AFR Locate, although he had not been identified as a wanted person. He considered this to be unlawful, in particular due to a violation of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”) and of data protection law in the United Kingdom. In his view, the police did not respect the data protection principles. In particular, that approach would violate Sec. 35 of the Data Protection Act 2018 (“DPA 2018”), which requires the processing of personal data for law enforcement purposes to be lawful and fair. He also pointed out that the police had failed to carry out an adequate data protection impact assessment (“DPIA”).

The Court stated that the use of AFR affected a person’s rights under Article 8 of the ECHR and that this type of biometric data has a private character in itself. Even though the images were erased immediately, the procedure constituted an interference with Article 8 of the ECHR, since it suffices that the data is stored temporarily.

Nevertheless, the Court found that the police’s action was in accordance with the law, as it falls within the police’s public law powers to prevent and detect criminal offences. The Court also found that the use of the AFR system is proportionate and that the technology can be used openly, transparently and with considerable public commitment, thus fulfilling all existing criteria. It was only used for a limited period, for a specific purpose and published before it was used (e.g. on Facebook and Twitter).

With regard to data protection law, the Court considers that the images of individuals captured constitute personal data, even if they do not correspond to the lists of persons sought, because the technology has singled them out and distinguished them from others. Nevertheless, the Court held that there was no violation of data protection principles, for the same reasons on which it denied a violation of Art. 8 ECHR. The Court found that the processing fulfilled the conditions of legality and fairness and was necessary for the legitimate interest of the police in the prevention and detection of criminal offences, as required by their public service obligations. The requirement of Sec. 35 (5) DPA 2018 that the processing is absolutely necessary was fulfilled, as was the requirement that the processing is necessary for the exercise of the functions of the police.

The last requirement under Sec. 35 (5) of the DPA 2018 is that a suitable policy document is available to regulate the processing. The Court considered the relevant policy document in this case to be short and incomplete. Nevertheless, it declined to rule on whether the document was adequate and stated that it would leave that judgment to the Information Commissioner’s Office (“ICO”), as the ICO would publish more detailed guidelines.

Finally, the Court found that the impact assessment carried out by the police was sufficient to meet the requirements of Sec. 64 of DPA 2018.

The ICO stated that it would take into account the High Court ruling when finalising its recommendations and guidelines for the use of live face recognition systems.

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

Initially reported by the Financial Times, London’s King’s Cross station has come under fire for making use of a live face-scanning system across its 67-acre site. The site’s developer, Argent, confirmed that the system has been used to ensure public safety, as part of a number of detection and tracking methods used for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office spaces with tenants such as Google.

The controversy around the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While Argent defended the technology, it has not yet explained what exactly the system is, how it is used and how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached the Mayor of London, Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service stated that it supplied images to a database used for the system’s facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the distinct statements that the software abides by England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham further said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked a demand for a law about facial recognition, igniting a dialogue about new technologies and future-proofing against the yet unknown privacy issues they may cause.
