Category: GDPR

London’s King’s Cross station facial recognition technology under investigation by the ICO

11. September 2019

Initially reported by the Financial Times, London’s King’s Cross station has come under fire for using a live face-scanning system across its 67-acre site. Argent, the site’s developer, confirmed that the system has been used to ensure public safety, as one of a number of detection and tracking methods employed for surveillance at the famous train station. While the site is privately owned, it is widely used by the public and houses various shops, cafes, restaurants and office spaces, with tenants such as Google.

The controversy over the technology and its legality stems from the fact that it records everyone within its range without their consent, analyzing their faces and comparing them to a database of wanted criminals, suspects and persons of interest. While the developer Argent defended the technology, it has not yet explained what the system is, how it is used or how long it has been in place.

A day before the ICO launched its investigation, a letter from King’s Cross Chief Executive Robert Evans reached London Mayor Sadiq Khan, explaining that the technology matches faces against a watchlist of flagged individuals. If footage is unmatched, it is blurred out and deleted; in case of a match, it is shared only with law enforcement. The Metropolitan Police Service has stated that it supplied images to the system’s database for carrying out facial scans, though it claims not to have done so since March 2018.

Despite the explanation and the repeated assurances that the software complies with England’s data protection laws, the Information Commissioner’s Office (ICO) has launched an investigation into the technology and its use in the private sector. Businesses would need to demonstrate explicitly that the use of such surveillance technology is strictly necessary and proportionate for their legitimate interests and for public safety. In her statement, Information Commissioner Elizabeth Denham said that she is deeply concerned, since “scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” especially if it is being done without their knowledge.

The controversy has sparked demand for legislation on facial recognition, igniting a dialogue about new technologies and future-proofing against the as-yet-unknown privacy issues they may cause.

Category: GDPR · General · UK

Greek Parliament passes bill to adopt GDPR into National Law

29. August 2019

On Monday, August 26th, the Greek Parliament passed a bill that incorporates the European Union’s General Data Protection Regulation (GDPR) into national law. The transposition was originally supposed to be completed by May 6, 2018, a deadline Greece failed to meet.

The now fast-paced implementation of the regulation may have come as a result of the European Commission (EC) referring Greece and Spain to the European Court of Justice on July 25th. Having failed to adopt the GDPR into national law by then, Greece could have faced a fine of €5,287.50 for every day passed since May 6, in addition to a stiff fine of €1.3 million. In its statement, the EC declared that “the lack of transposition by Spain and Greece creates a different level of protection of peoples’ rights and freedoms, and hampers data exchanges between Greece and Spain on one side and other Member States, who transposed the Directive, on the other side”.

EU countries are allowed to adopt certain derogations, exceptions and specifications under the GDPR. Greece has done so in the approved bill, with adjusted provisions regarding the age of consent, the process of appointing a Data Protection Officer, sensitive data processing, data repurposing, data deletion, certifications and criminal sanctions.

The legislation was approved by an overwhelming majority, with the votes of New Democracy, the main opposition SYRIZA, the center-left Movement for Change and the leftist MeRA25. The GDPR itself has been in effect since May 25th, 2018, its main aim being to give individuals more control over the personal data they provide to companies and services.

Category: EU · EU Commission · GDPR · General

Swedish DPA imposes its first GDPR fine

23. August 2019

The Swedish Data Protection Authority (“Datainspektionen”) has imposed its first fine since the General Data Protection Regulation (GDPR) entered into force.

Affected is a high school in Skellefteå in northern Sweden, where 22 pupils were part of a pilot programme to monitor attendance times using facial recognition.

In January 2019, the IT company Tieto announced that it was testing the registration of students’ attendance at the school with tags, smartphone apps and facial recognition software. In Sweden, teachers are required to report the presence of all students in each lesson to their supervisors. According to Tieto, teachers at the school in Skellefteå spend around 18,000 hours a year on this registration. A class was therefore selected for the pilot project, testing registration via facial recognition for eight weeks. Parents and students were asked to give their consent.

However, the Swedish data protection authority has now found that the way in which consent was obtained violates the GDPR because of the clear imbalance between controller and data subject. Additionally, the school failed to conduct an impact assessment, including seeking prior consultation with Datainspektionen.

Therefore, the DPA imposed a fine of SEK 200,000 (approximately EUR 20,000). In Sweden, public authorities can be fined up to SEK 10,000,000 (approximately EUR 1,000,000).

Irish DPC releases guide on Data Breach Notifications

15. August 2019

On Monday, the Irish Data Protection Commission (IDPC) released a quick guide on data breach notifications. It is meant to help controllers understand their notification and communication obligations, both towards the DPC and towards the data subjects.

The guide, intended as a quick overview of the requirements and obligations that fall on data controllers, refers to the much more in-depth and detailed guidance in the Article 29 Working Party’s (now the European Data Protection Board, EDPB) guidelines on data breach notifications.

In summary, the IDPC categorizes a data breach as a “security incident that negatively impacts the confidentiality, integrity or availability of personal data; meaning that the controller is unable to ensure compliance with the principles relating to the processing of personal data as outlined in Art. 5 GDPR”. When a breach occurs, the controller has two primary obligations: (1) to notify the DPC of the breach, unless it is unlikely to result in a risk to the data subjects, and (2) to communicate the breach to the affected data subjects when it is likely to result in a high risk.

The IDPC seeks to help controllers by providing a list of requirements for notifications to the DPC and to data subjects, especially given the tight timeframe: notifications must be filed within 72 hours of becoming aware of the breach. The guide is meant to eliminate confusion arising in the process, as well as the problems companies have had when filing data breach notifications in the past.
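Taken together, the two obligations and the 72-hour deadline amount to a simple decision rule. The sketch below shows, in Python, how a controller’s incident process might encode it; the risk levels, function name and return structure are illustrative assumptions, not part of the IDPC guide.

    from datetime import datetime, timedelta
    from enum import Enum

    class Risk(Enum):
        UNLIKELY = 1   # unlikely to result in a risk to data subjects
        RISK = 2       # likely to result in a risk
        HIGH_RISK = 3  # likely to result in a high risk

    def breach_obligations(awareness_time: datetime, risk: Risk) -> dict:
        """Map an assessed risk level to the two notification duties
        summarized in the IDPC guide (Art. 33 and 34 GDPR)."""
        return {
            # (1) Notify the DPC unless the breach is unlikely
            # to result in a risk for the data subjects.
            "notify_dpc": risk is not Risk.UNLIKELY,
            # The notification must be filed within 72 hours of awareness.
            "dpc_deadline": awareness_time + timedelta(hours=72),
            # (2) Communicate to the affected data subjects only when
            # the breach is likely to result in a high risk.
            "notify_data_subjects": risk is Risk.HIGH_RISK,
        }

    # Example: a breach discovered now and assessed as high risk
    print(breach_obligations(datetime.utcnow(), Risk.HIGH_RISK))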

Study shows behavior patterns of internet users regarding cookies

Research has been carried out to see how European consumers interact with cookie consent mechanisms online.

The study focused in particular on how consumers react to different designs of cookie pop-ups and how different design choices can influence users’ data protection choices. The researchers collected around 5,000 cookie notices from leading websites to get an idea of how different cookie consent mechanisms are currently being implemented. They also worked with a German e-commerce site over a period of four months to investigate how more than 82,000 users of the site interacted with the different cookie consent designs. The designs were adapted by the researchers to analyse how different preferences and designs affect the individual’s decision.

Their research showed that the majority of cookie consent notices are placed at the bottom of the screen (58%), do not block interaction with the site (93%) and offer no option other than a confirmation button (86%), leaving the user no real choice.

The majority (57%) also tries to steer users toward giving consent through design, for example by highlighting the “Agreed” button in a color while making the link to “more options” less visible.

The research showed that interaction with consent notices varied widely, between 5% and 55%, depending on where they were positioned and which preferences were set. More users clicked the “Accept” button when it was highlighted in color (50.8% on mobile, 26.9% on desktop). In contrast, only 39.2% on mobile and 21.1% on desktop clicked the “Accept” button if it was displayed as a text link. As for third parties, around 30% of mobile users and 10% of desktop users accepted all third parties when the checkboxes were preselected. On the other hand, only a small fraction (< 0.1%) allowed all third parties when given an opt-in choice.

They also found that the more choices are given in a cookie notice, the more likely it is that the visitor will refuse the use of cookies. In fact, they concluded that with GDPR-compliant cookie notices, where the website user is free to choose which cookie categories to activate, only 0.1% of users would allow all cookie categories to be set.

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that people understand.

With the rise of the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target their voters, through social media, the electoral register or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO acknowledges that its current guidance is outdated, as it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework does not, however, establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from various people and organisations. The framework is intended to take into account the responses the ICO received in the process.

Intended as the basis of a statutory code of practice if the relevant legislation is introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.

EDPB adopts Guidelines on processing of personal data through video devices

13. August 2019

Recently, the EDPB adopted its Guidelines on processing of personal data through video devices (“the guidelines”). The guidelines provide assistance on how to apply the GDPR to processing through video devices, with several examples that are not exhaustive but are applicable to all areas in which video devices are used.

In a first step, the guidelines set the scope of application. The GDPR is only applicable to the use of video devices if

  • personal data is collected through the video device (e.g. a person is identifiable on the basis of their looks or other specific elements),
  • the processing is not carried out by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and
  • the so-called “household exemption” does not apply (processing by a natural person in the course of a purely personal or household activity).

Before processing personal data through video devices, controllers must specify the legal basis for it. According to the guidelines, every legal ground under Article 6 (1) GDPR can provide a legal basis. The purposes of using video devices for processing personal data should be documented in writing and specified for every camera in use.

Another subject of the guidelines is the transparency of the processing: controllers have to inform data subjects about the video surveillance. The EDPB recommends a layered approach, combining several methods to ensure transparency. The most important information should appear on the warning sign itself (first layer), while the other mandatory details may be provided by other means (second layer). The second layer must also be easily accessible to data subjects.

The guidelines also deal with storage periods and technical and organizational measures (TOMs). Some member states may have specific provisions for storing video surveillance footage, but the recommendation is to delete the personal data after a few days, ideally automatically. As with any kind of data processing, the controller must adequately secure the data and must therefore have implemented technical and organizational measures. Examples provided include masking or scrambling areas that are not relevant to the surveillance, or editing out images of third persons when providing video footage to data subjects.
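As an illustration of what such automatic deletion might look like in practice, here is a minimal sketch in Python that removes stored footage older than a configurable retention period; the directory path, file extension and three-day retention value are assumptions made for this example, not figures taken from the guidelines.

    import time
    from pathlib import Path

    RETENTION_DAYS = 3  # "a few days" -- align with the documented purpose
    FOOTAGE_DIR = Path("/var/surveillance/footage")  # hypothetical storage location

    def purge_old_footage(directory, retention_days):
        """Delete video files whose modification time exceeds the retention period."""
        cutoff = time.time() - retention_days * 24 * 3600
        deleted = []
        if directory.exists():
            for clip in directory.glob("*.mp4"):
                if clip.stat().st_mtime < cutoff:
                    clip.unlink()  # irreversible deletion of the recording
                    deleted.append(clip)
        return deleted

    if __name__ == "__main__":
        # Run daily from a scheduler (e.g. cron) so deletion happens automatically.
        for clip in purge_old_footage(FOOTAGE_DIR, RETENTION_DAYS):
            print("deleted", clip)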

The guidelines are open for public consultation until September 9th, 2019, and a final, revised version is planned for the end of 2019.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL and the British data protection authority ICO have both revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be equally applicable to any technology that stores or accesses information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll through a website cannot be considered consent, and obtaining consent through the T&Cs is likewise not lawful. Such a procedure violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
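To make the shared requirements concrete, the following minimal Python sketch shows how a site might record and validate a consent signal against these criteria; the field names and checks are illustrative assumptions, not prescribed by either authority.

    from dataclasses import dataclass, field

    # Signals that, according to both authorities, cannot constitute valid consent
    INVALID_METHODS = {"scrolling", "browser_settings", "terms_and_conditions"}

    @dataclass
    class CookieConsent:
        method: str                   # how the consent signal was captured
        vendors_shown: list = field(default_factory=list)    # parties named to the user
        accepted_vendors: list = field(default_factory=list) # parties the user accepted

    def is_valid_consent(consent):
        """Check a stored consent record against the shared CNIL/ICO criteria."""
        if consent.method in INVALID_METHODS:
            return False  # consent must be a specific, unambiguous act
        if not consent.vendors_shown:
            return False  # every party placing cookies must be named
        # Consent can only cover vendors the user was actually informed about.
        return set(consent.accepted_vendors) <= set(consent.vendors_shown)

    # Example: an explicit click accepting one of two named vendors
    record = CookieConsent("explicit_click", ["AdVendorA", "AnalyticsB"], ["AnalyticsB"])
    print(is_valid_consent(record))  # True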

With regard to territorial scope, CNIL clarifies that its cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The ICO guidance does not comment on this point.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by means of a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; on this point, the ICO has not yet delivered a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary, namely when the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, on the other hand, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only start to run once CNIL publishes a statement, which is still pending; CNIL expects the statement to be finalised during the first quarter of 2020. The ICO does not provide for such a grace period.

Hackers steal millions of Bulgarians’ financial data

18. July 2019

After a cyberattack on Bulgaria’s tax agency, the National Revenue Agency (NRA), the financial data of millions of taxpayers has been stolen. It is estimated that most working adults in the country of 7 million are affected, with at least some of their data compromised. The stolen data included names, addresses, income and social security information.

The attack happened in June, but an e-mail from the self-proclaimed perpetrator was sent to Bulgarian media on Monday. It stated that more than 110 of the agency’s databases had been compromised, with the hacker calling the NRA’s cybersecurity a parody. The Bulgarian media were also offered access to the stolen data. One stolen file, e-mailed to the newspaper 24 Chasa, contained up to 1.1 million personal identification numbers together with income, social security and healthcare figures.

The country’s finance minister, Vladislav Goranov, has apologized in parliament and to the Bulgarian citizens, adding that about 3% of the tax agency’s database had been affected. He made clear that whoever attempted to exploit the stolen data would face the full force of Bulgarian law.

As a result of the attack, the Bulgarian tax agency now faces a fine of up to 20 million euros from the Commission for Personal Data Protection (CPDP). In addition, the incident has reignited an old debate about lax cybersecurity standards in Bulgaria and the country’s adjustment to modern times.

Google data breach notification sent to IDPC

Google may face further investigations under the General Data Protection Regulation (GDPR) after unauthorized audio recordings were forwarded to subcontractors. The Irish Data Protection Commission (IDPC) has confirmed through a spokesperson that it received a data breach notification concerning the issue last week.

The recordings were exposed by the Belgian broadcaster VRT and are said to comprise around 1,000 clips of conversations from Belgium and the Netherlands. Logged by Google Assistant, the recordings were sent to Google’s subcontractors for review. At least 153 of those recordings were not triggered by Google’s wake phrase “Ok/Hey, Google,” and were never meant to be recorded in the first place. They contained personal data ranging from family conversations and bedroom chatter to business calls with confidential information.

Google has addressed this violation of its data security policies in a blog post. It said that the audio recordings were sent to language experts, who understand nuances and accents, in order to refine Home’s linguistic abilities, a critical part of the process of building speech technology. Google stresses that the storing of recorded data on its services is turned off by default, and that audio data is only sent to Google once the wake phrase is said. The recordings in question were most likely triggered by users saying a phrase that sounded similar to “Ok/Hey, Google,” thereby confusing Google Assistant and turning it on.

According to Google’s statement, its security and privacy teams are working on the issue and will fully review its safeguards to prevent this sort of incident from happening again. If, however, the IDPC’s investigation discovers a GDPR violation, it could result in a significant financial penalty for the tech giant.
