Tag: UK
4. January 2021
In what might almost be called a Christmas miracle, the European Union (EU) and the United Kingdom (UK) came to an agreement on December 24th, 2020. The Trade Agreement, formally titled “EU-UK Trade and Cooperation Agreement”, sets out new rules applying from January 1st, 2021, the end of the Brexit transition period.
President of the European Commission, Ursula von der Leyen, claimed it was a deal worth fighting for, “because we now have a fair and balanced agreement with the UK, which will protect our European interests, ensure fair competition, and provide much needed predictability for our fishing communities. Finally, we can leave Brexit behind us and look to the future. Europe is now moving on.”
With regard to data protection, however, the new Trade Deal has provided little certainty about what comes next.
Both sides are aware that an adequacy decision by the EU Commission is very important with regard to data protection and cross-border data flows. Accordingly, the EU has agreed to allow a period of four months, extendable by a further two months, during which data can be transferred between EU Member States and the UK without additional safeguards. This period was granted to give the Commission enough time to make an adequacy decision. As a result, data transfers can continue as before until possibly mid-2021. However, this arrangement is only valid if the UK does not change its data protection laws in the meantime.
With regard to direct marketing, the situation has not changed either: individuals must give active consent unless there was a prior contractual relationship and the advertising relates to products similar to those covered by that contract. Furthermore, the advertising must be clearly recognisable as such, and every advertising e-mail must offer the possibility of withdrawing consent.
However, much else has yet to be clarified. Questions about the competence of the UK data protection authority, the Information Commissioner’s Office (ICO), and the fate of its ongoing investigations have not yet been answered. As of now, companies whose EU headquarters were originally in the UK will have to designate a new Lead Supervisory Authority (Art. 56 GDPR) for their business in the EU.
The coming months will show whether these questions, which are highly relevant to businesses’ day-to-day practice, can be answered reassuringly.
23. June 2020
In another update on contact tracing apps, we look at the new approach to contact tracing in the United Kingdom (UK), as well as the European Data Protection Board’s (EDPB) statement regarding the cross-border interoperability of the contact tracing apps being deployed in the European Union.
UK Contact Tracing App Update
After starting field tests of the NHS COVID-19 App on the Isle of Wight, the UK government has decided to change its approach to contact tracing and abandon the centralized app model in favour of the decentralized Google/Apple alternative.
The change was prompted by technical issues and privacy challenges that surfaced during the trial period on the Isle of Wight; in the end, these were direct consequences of the centralized model and significant enough to motivate the change of approach.
The technical problems included issues with background Bluetooth access as well as problems with cross-border interoperability. In addition, the data protection risks of mission creep and a lack of transparency only added to the pressure to abandon the centralized app.
The new model is widely used throughout the European Union and provides more data protection as well as better technical support. The only drawback compared with the centralized model is that epidemiologists have less access to data, a trade-off the UK government seems willing to accept in exchange for the gains in data protection and technical compatibility.
EDPB statement on cross-border interoperability
On June 17th, 2020, the EDPB released a statement on the cross-border interoperability of contact tracing apps. The statement builds on the EDPB Guidelines 04/2020 on the data protection aspects of contact tracing apps, emphasising the importance of the issues presented there.
The statement follows an agreement reached in May 2020 between EU Member States and the European Commission on basic guidelines for the cross-border interoperability of contact tracing apps, as well as the newly agreed technical specifications for achieving such interoperability.
The EDPB identifies key aspects that must be kept in mind throughout the project, namely transparency, legal basis, controllership, data subjects’ rights, and rules on data retention and minimisation.
Further, the statement emphasises that the sharing of data about individuals who have been diagnosed or have tested positive should only be triggered by a voluntary action of the users themselves. Ultimately, the goal of interoperability should not be used as an argument to extend the collection of personal data beyond what is necessary.
Overall, this type of data sharing can pose an increased risk to users’ personal data, which is why it must be ensured that the principles laid down in the GDPR are upheld and that no less intrusive method is available for the purpose.
28. January 2020
In the UK, betting companies have gained access to data on 28 million children under 14 and adolescents. The data was stored in a government database and was meant to be used for learning purposes; access to the platform is granted by the government. A company that had been given access is said to have illegally passed it on to another company, which in turn allowed the betting companies in. The betting providers used the access, among other things, to check age information online. The company accused of passing on the access denies the allegations but has not yet made any more specific statements.
The British Department for Education has called the situation unacceptable. All access points have been closed and the cooperation has been terminated.
10. October 2019
The United States and the United Kingdom have entered into the first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries’ law enforcement authorities to demand authorized access to electronic data relating to serious crime. In both cases, the respective authorities are permitted to ask tech companies based in the other country for electronic data directly and without legal barriers.
This bilateral Agreement is based on the U.S. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018. It aims to improve the procedures by which U.S. and foreign investigators obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current access process may take up to two years. The Data Access Agreement can reduce that time considerably by allowing more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.
The CLOUD Act focuses on updating legal frameworks to keep pace with developments in electronic communications and service systems. It further enables the U.S. and other countries to enter into mutual executive Agreements under which each side uses its own legal authorities to access electronic evidence in the other country. An Agreement of this form can only be signed with rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws provide robust substantive and procedural protections for privacy and civil liberties.
The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with the data protection laws of both countries.
In addition to the Agreement with the United Kingdom, talks between the United States and Australia were reported on Monday, with the two countries negotiating a similar Agreement. Negotiations on a Data Access Agreement have also been held between the U.S. and the European Commission, representing the European Union.
11. September 2019
Initially reported by the Financial Times, London’s King’s Cross station has come under fire for making use of a live face-scanning system across its 67-acre site. The site’s developer, Argent, confirmed that the system has been used to ensure public safety, as part of a number of detection and tracking methods used for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes and restaurants, as well as office spaces with tenants such as Google.
The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analysing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While developer Argent defended the technology, it has not yet explained what the system is, how it is used and how long it has been in place.
A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached Mayor of London Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. In effect, if footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images for a database used by the system to carry out facial scans, though it claims not to have done so since March 2018.
Despite the explanation and the explicit statements that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and public safety. In her statement, Information Commissioner Elizabeth Denham further said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.
The controversy has sparked demands for legislation on facial recognition, igniting a dialogue about new technologies and the need to future-proof against the as yet unknown privacy issues they may cause.
14. August 2019
The United Kingdom’s Information Commissioner’s Office (ICO) plans to consult on a new framework code of practice regarding the use of personal data in political campaigns.
The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent, lawful way that is understood by the people.
With the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target voters, drawing on social media, the electoral register, or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.
In this light, the ICO considers its current guidance outdated: it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework, however, does not establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.
Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from a wide range of people and organisations. The framework is intended to take into account the responses the ICO received in that process.
Intended to form the basis of a statutory code of practice if the relevant legislation is introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.
11. July 2019
After a data breach in 2018 that affected 500,000 customers, British Airways (BA) has now been fined a record £183m by the UK’s Information Commissioner’s Office (ICO). According to the BBC, Alex Cruz, chairman and CEO of British Airways, said he was “surprised and disappointed” by the ICO’s initial findings.
The breach resulted from a hacking attack that managed to place a script on the BA website. Unsuspecting users trying to access the BA website were diverted to a fraudulent site, which collected their information, including e-mail addresses, names and credit card details. While BA stated that it would reimburse every customer affected, its owner IAG declared through its chief executive that it would take “all appropriate steps to defend the airline’s position”.
The ICO said that it was the biggest penalty it had ever handed out and made public under the new rules of the GDPR. “When an organization fails to protect personal data from loss, damage or theft, it is more than an inconvenience,” Information Commissioner Elizabeth Denham told the press.
In fact, the GDPR allows companies to be fined up to 4% of their annual turnover for data protection infringements. By comparison, the £183m fine imposed on British Airways equals 1.5% of its worldwide turnover for 2017, well below the possible maximum of 4%.
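For illustration only, a rough back-of-the-envelope calculation based on the figures above (treating the £183m fine as exactly 1.5% of BA’s 2017 worldwide turnover, as reported) shows what the theoretical 4% maximum would have been:

```python
# Illustrative sketch, not an official calculation: derives BA's implied
# 2017 turnover from the figures cited above and the theoretical GDPR cap.

fine_gbp_m = 183.0   # fine in £ million, as reported
fine_share = 0.015   # the fine reportedly equals 1.5% of worldwide turnover

implied_turnover_m = fine_gbp_m / fine_share    # ~£12,200m (about £12.2bn)
theoretical_max_m = implied_turnover_m * 0.04   # ~£488m at the 4% GDPR cap

print(f"Implied 2017 worldwide turnover: ~£{implied_turnover_m:,.0f}m")
print(f"Theoretical 4% maximum fine:     ~£{theoretical_max_m:,.0f}m")
```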
BA can still appeal against the findings and the scale of the fine before the ICO’s final decision is made.
29. April 2019
The British supermarket chain Wm Morrison Supermarkets PLC (“Morrisons”) has been granted permission by the Supreme Court to appeal the data protection class action brought against it and to challenge the judgment on all grounds. The case is significant as it is the first of its kind to be filed in the UK over a data breach, and its outcome may affect the number of class actions for data breaches.
An employee who worked as a senior IT auditor for Morrisons copied the payroll data of almost 100,000 employees onto a USB stick and published it on a file-sharing website. He then reported the breach anonymously to three newspapers. The employee himself was sentenced to eight years in prison for various crimes.
5,518 employees filed a class action lawsuit against Morrisons over the breach, claiming both primary and vicarious liability on the part of the company. The High Court dismissed all primary liability claims under the Data Protection Act (“DPA”), as it concluded that the employee had acted independently of Morrisons when violating the DPA.
However, the court found that Morrisons is vicariously liable for its employee’s actions, although the DPA does not explicitly provide for vicarious liability. The company appealed the decision.
The Court of Appeal dismissed the appeal and upheld the High Court’s ruling that the company is vicariously liable for its employee’s data breach, even though Morrisons itself was cleared of any misconduct.
In the forthcoming appeal, the Supreme Court will have to examine, among other things, whether vicarious liability exists under the DPA and whether the Court of Appeal’s conclusion that the employee disclosed the data in the course of his employment was incorrect.
24. August 2017
According to BBC.com, the fraud prevention group Cifas warns that cases of identity theft in the UK are increasing year by year. In the first six months of the year, Cifas recorded 89,000 cases, a 5% increase on the same period last year and a new record.
BBC.com further reports that Simon Dukes, chief executive of Cifas, said: “We have seen identity fraud attempts increase year on year, now reaching epidemic levels, with identities being stolen at a rate of almost 500 a day.” It is further explained that “these frauds are taking place almost exclusively online. The vast amounts of personal data that is available either online or through data breaches is only making it easier for the fraudster.”
Fraudsters target data such as names, addresses, dates of birth and bank account details. They gather these data by hacking computers, stealing mail or buying data on the “dark web”; victims are also tricked into giving away their personal data. Most of the thefts, about 80%, are committed online, usually without the victims noticing. The crimes often only come to light when, for example, the first unexpected bill arrives.
The victims of impersonation were broken down by age group, showing that people in their 30s and 40s are the most likely victims of identity theft, since a large amount of information about this group has often been gathered online. It is further reported that, according to Cifas, the number of cases fell for the over-60s, while the 21-to-30 age group showed the biggest increase.