Swedish DPA imposes its first GDPR fine

23. August 2019

The Swedish Data Protection Authority, “Datainspektionen”, has imposed its first fine since the General Data Protection Regulation (GDPR) entered into force.

The fine concerns a high school in Skellefteå in the north of Sweden, where 22 pupils took part in a pilot programme that monitored attendance using facial recognition.

In January 2019, the IT company Tieto announced that it was testing the automatic registration of student attendance at the school using tags, smartphone apps and facial recognition software. In Sweden, teachers are required to report the attendance of all students in each lesson to the supervisors. According to Tieto, teachers at the school in Skellefteå spend around 18,000 hours a year on this registration. A class was therefore selected for an eight-week pilot project to test registration by facial recognition. Parents and students were asked to give their consent.

However, the Swedish data protection authority has now found that the way in which consent was obtained violates the GDPR because of the clear imbalance between controller and data subject. Additionally, the school failed to conduct an impact assessment and to seek prior consultation with Datainspektionen.

Therefore, the DPA imposed a fine of SEK 200,000 (approximately EUR 20,000). In Sweden, public authorities can be fined up to SEK 20,000,000 (approximately EUR 1,000,000).

Millions of unencrypted biometric records discovered on the internet

19. August 2019

The Israeli security researchers Noam Rotem and Ran Locar discovered the unprotected and mostly unencrypted database of Biostar 2 during an Internet search.

Biostar 2 is a web-based biometric locking system that provides centralized control of access to secure facilities such as warehouses and office buildings. The researchers gained access to over 27.8 million records and 23 gigabytes of data, including fingerprint data, facial recognition data, facial photos of users, user names and passwords, and logs of facility access. The system is used by, among others, the British Metropolitan Police, insurance companies and banks.

Rotem told the Guardian: “The access allows first of all seeing millions of users are using this system to access different locations and see in real time which user enters which facility or which room in each facility, even.”
He also stated that they were able to change data and add new users, meaning they could have added their own photo and fingerprint to an existing user account to gain access to whatever buildings that user could enter, or simply created a new user with their own photo and fingerprints.

The severity of this data breach is particularly great because Biostar 2 is used in 1.5 million locations around the world and fingerprints, unlike passwords, cannot be changed.
Before Rotem and Locar turned to the Guardian, they made several attempts to contact Suprema, the security company responsible for Biostar 2. The vulnerability has since been closed.

Suprema’s marketing director told the Guardian that the company had carried out an “in-depth evaluation” of the information provided: “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets.”

Rotem said that such problems are not unique to Suprema: he contacts three or four companies a week about similar issues.

Irish DPC releases guide on Data Breach Notifications

15. August 2019

On Monday, the Irish Data Protection Commission (IDPC) released a quick guide on Data Breach Notifications. It is intended to help controllers understand their notification and communication obligations, both towards the responsible DPC and towards the data subjects.

The guide, intended as a quick overview of the requirements and obligations that fall on data controllers, refers to the much more detailed and in-depth guidelines on Data Breach Notifications issued by the Article 29 Working Party (now the European Data Protection Board, or EDPB).

In summary, the IDPC categorizes a Data Breach as a “security incident that negatively impacts the confidentiality, integrity or availability of personal data; meaning that the controller is unable to ensure compliance with the principles relating to the processing of personal data as outlined in Art. 5 GDPR”. In such a case, the controller has two primary obligations: (1) to notify the responsible DPC of the data breach, unless it is unlikely to result in a risk to the data subjects, and (2) to communicate the data breach to the affected data subjects when it is likely to result in a high risk.

The IDPC seeks to help controllers by providing a list of requirements for notifications to the DPC and to data subjects, especially given the tight timeframe: notifications must be filed within 72 hours of becoming aware of the breach. The guide aims to eliminate confusion in the process, as well as the problems companies have had in the past when filing a Data Breach Notification.
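The two obligations can be pictured as a simple decision rule. The following Python sketch only illustrates the thresholds described above; the three risk levels are our own simplification, not the IDPC’s wording, and a real risk assessment is a case-by-case legal judgment.

```python
# Illustrative sketch of the two notification thresholds (Art. 33 and 34 GDPR
# as summarized in the IDPC guide). The risk levels are simplifications for
# illustration only; actual risk assessment is a legal, case-by-case exercise.
from enum import Enum, auto

class Risk(Enum):
    UNLIKELY = auto()  # unlikely to result in a risk to data subjects
    LIKELY = auto()    # likely to result in a risk
    HIGH = auto()      # likely to result in a high risk

def breach_obligations(risk: Risk) -> list[str]:
    """Return the controller's duties for an assessed risk level."""
    duties = []
    if risk is not Risk.UNLIKELY:
        # Obligation (1): notify the supervisory authority without undue
        # delay, and where feasible within 72 hours of becoming aware.
        duties.append("notify the responsible DPC within 72 hours")
    if risk is Risk.HIGH:
        # Obligation (2): additionally inform the affected individuals.
        duties.append("communicate the breach to affected data subjects")
    return duties

print(breach_obligations(Risk.HIGH))  # both duties apply at high risk
```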

Study shows behavior patterns of internet users regarding cookies

A study has been carried out to see how European consumers interact with cookie consent mechanisms online.

The study focused in particular on how consumers react to different designs of cookie pop-ups and how different design choices can influence users’ data protection decisions. The researchers collected around 5,000 cookie notices from leading websites to get an idea of how different cookie consent mechanisms are currently implemented. They also worked with a German e-commerce site over a period of four months to investigate how more than 82,000 users of the site interacted with different cookie consent designs, which the researchers varied to analyse how different options and layouts affect the individual’s decision.

Their research showed that the majority of cookie consent notices are placed at the bottom of the screen (58%), do not block interaction with the site (93%) and offer no other option than the confirmation button (86%), leaving the user no choice.

The majority (57%) also tries to obtain users’ consent through design, for example by highlighting the “Agreed” button with a color while making the link to “more options” less visible.

The research showed that interaction with consent notices varied widely, between 5-55%, depending on where they were positioned or what preferences were set. More users clicked the “Accept” button when it was highlighted by color (50.8% on mobile, 26.9% on desktop). In contrast, only 39.2% on mobile and 21.1% on desktop clicked the “Accept” button if it was displayed as a text link. As for third parties, around 30% of mobile users and 10% of desktop users accepted all third parties if the checkboxes were preselected. On the other hand, only a small fraction (< 0.1%) allowed all third parties when given the opt-in choice.

They also found that the more choices are given in a cookie notice, the more likely it is that the visitor will refuse the use of cookies. In fact, they concluded that with GDPR-compliant cookie notices, where the website user is free to choose which cookie categories to activate, only 0.1% of users would allow all cookie categories to be set.

ICO releases draft Code of Practice on the Use of Personal Data in Political Campaigning for consultation

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent, lawful way that is understood by the people.

With the internet, political campaigning has become increasingly sophisticated and innovative. Campaigns now use new technologies and techniques to understand and target voters, drawing on social media, the electoral register, or the screening of names for ethnicity and age. In a statement from June, ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data in ways people do not understand.

In this light, ICO acknowledges that its current guidance is outdated: it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. However, the framework does not establish new requirements for campaigners; instead, it aims at explaining and clarifying data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from various people and organisations. The framework is intended to take into account the responses the ICO received in the process.

The draft framework code of practice, which could form the basis of a statutory code of practice if the relevant legislation is introduced, is now out for public consultation and will remain open until October 4th.

EDPB adopts Guidelines on processing of personal data through video devices

13. August 2019

The EDPB has recently adopted its Guidelines on processing of personal data through video devices (“the guidelines”). The guidelines provide assistance on how to apply the GDPR to processing through video devices, with several examples that are not exhaustive but are applicable to all areas in which video devices are used.

In a first step, the guidelines set out the scope of application. The GDPR only applies to the use of video devices if

  • personal data is collected through the video device (e.g. a person is identifiable on the basis of their looks or other specific elements),
  • the processing is not carried out by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and
  • the so-called “household exemption” does not apply (processing by a natural person in the course of a purely personal or household activity).

Before processing personal data through video devices, controllers must specify their legal basis for it. According to the guidelines, every legal ground under Article 6 (1) can provide a legal basis. The purposes for using video devices for processing personal data should be documented in writing and specified for every camera in use.

Another subject of the guidelines is the transparency of the processing. The controllers have to inform data subjects about the video surveillance. The EDPB recommends a layered approach and combining several methods to ensure transparency. The most important information should be written on the warning sign itself (first layer) and the other mandatory details may be provided by other means (second layer). The second layer must also be easily accessible for data subjects.

The guidelines also deal with storage periods and technical and organizational measures (TOMs). Some member states may have specific provisions on storing video surveillance footage, but it is recommended to delete the personal data after a few days, ideally automatically. As with any kind of data processing, the controller must adequately secure the data and must therefore implement technical and organizational measures. Examples provided include masking or scrambling areas that are not relevant to surveillance, or editing out images of third persons when providing video footage to data subjects.

The guidelines are open for public consultation until September 9th, 2019, and a final, revised version is planned for the end of 2019.

Privacy issues on Twitter and Instagram

12. August 2019

Both Twitter and Instagram admitted last week that they had privacy issues involving users’ personal data in connection with external advertising companies.

Twitter published a statement explaining that the choices users made in their ad settings, especially regarding data sharing, were not always respected; the settings may not have worked as intended. As a consequence, data may have been shared with advertising companies when a user clicked on or viewed an advertisement, and personalized ads may have been shown to users based on inferences. Both things could have happened even if no permission was given.

The statement also says that the problems were fixed on August 5, 2019, and that no personal data such as passwords or email accounts was affected. Twitter is still investigating how many users were affected and which ones.

According to a report by Business Insider, Instagram had to admit that its trusted partner Hyp3r tracked millions of users’ location data, secretly saved their stories and flouted its rules. Hyp3r, a San Francisco startup specializing in location-based advertising, evaluated millions of users’ public stories. The CEO of Hyp3r published a note on the company’s website rejecting the comparisons with Cambridge Analytica, saying that no prohibited practices were used and that privacy is a major and important concern for the company. Whether that is the case must be left open at this point. Be that as it may, for European users of the platform there is no known legal basis for such an approach.

Nonetheless, Instagram’s careless privacy and data security mechanisms enabled this approach. Even though Instagram ended the cooperation with Hyp3r and stated that it changed the platform to protect users, the Facebook-owned app’s problems with protecting users’ personal data remain.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL and the British data protection authority ICO have both published revised guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both authorities consider the rules that apply to cookies to be equally applicable to any technology that stores or accesses information on a user’s device. In addition, both stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll the website cannot be considered consent. Likewise, obtaining consent through T&Cs is not lawful. This practice violates Art. 7 (2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.

With regard to the territorial scope, CNIL clarifies that the cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing takes place in France. The English guideline does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority due to the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by a cookie wall is probably not valid, but that the GDPR must be balanced against other rights. To that extent, the ICO has not yet delivered a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary: it can be dispensed with if the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, on the other hand, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies will have six months to comply with the rules. However, this period will only start to run upon the publication of a CNIL statement, which is still pending; CNIL expects the statement to be finalised during the first quarter of 2020. The ICO does not provide for such a transition period.

Amazon has Alexa recordings evaluated by temporary workers working from home

5. August 2019

According to a report by the German newspaper “Welt am Sonntag”, Amazon has Alexa’s voice recordings listened to not only by its own employees, but also by Polish temporary workers.

For some time now, Amazon has been criticized because recordings made by the Alexa voice assistant are listened to and transcribed by employees in order to improve speech recognition. For a long time, however, users were unaware of this practice.

It has now become known that temporary workers listen to and evaluate the recordings from home using remote work software. Until recently, a Polish recruitment agency advertised “teleworking all over the country”, although Amazon had previously given assurances that the voice recordings would only be evaluated in specially protected offices. One of the Polish temporary workers stated that many of them work from home and that the recordings contained personal data such as names or places that allowed conclusions to be drawn about the person.

Upon request, Amazon confirmed the findings. A spokesman said that some employees were allowed to work from locations other than the Amazon offices, but that particularly strict rules had to be observed; in particular, working in public places is not allowed.

On the same day, the online job advertisements were deleted and Amazon introduced a new data protection option. Users can now explicitly object to and block the use of their recordings for post-processing by Amazon employees.

Other providers of voice assistants have also suspended, or plan to suspend, human evaluation of recordings, at least for European users. According to Google, around 0.2% of recordings are subsequently listened to, while Apple and Amazon say it is less than 1%. Google already deactivated the function three months ago, and Apple also plans to suspend the evaluation and later explicitly ask its users whether it may be resumed.

Chinese police use gait recognition for identification

30. July 2019

The police in Beijing and Shanghai have begun to use a new form of surveillance: gait recognition technology, which analyzes people’s body shapes and the way they walk in order to identify them, even when their faces are hidden from the camera.

The gait recognition software is part of China’s push towards artificial intelligence and data-driven surveillance.

On its website, the Chinese technology startup Watrix explains that, compared with other biometrics such as face, iris, palm print and fingerprint, gait features can be captured and recognized remotely, even from low-resolution video. Being contactless, long-range and difficult to disguise, gait recognition closes a gap in the market for remote identification in the public security industry. “You don’t need people’s cooperation for us to be able to recognize their identity,” Huang Yongzhen, the CEO of Watrix, said in an interview. “Gait analysis can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body.”

Watrix’s software extracts a person’s silhouette from video and analyzes its movements to build a model of the person’s gait. It is not yet able to identify people in real time; users must upload videos to the program. No special cameras are needed, however: the software can analyze gait in footage from regular surveillance cameras.
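This silhouette-based pipeline is a well-established idea in the academic gait recognition literature. The following Python sketch illustrates the general technique using OpenCV background subtraction and a so-called Gait Energy Image (an average of aligned silhouettes) as the gait signature; it is a minimal illustration under our own assumptions (resolution, thresholds, single walking person), not Watrix’s proprietary method.

```python
# Minimal sketch of silhouette-based gait description: extract the moving
# person's silhouette per frame and average the aligned silhouettes into a
# Gait Energy Image (GEI). Illustrative only; not Watrix's actual method.
import cv2
import numpy as np

def gait_energy_image(video_path: str, size=(64, 128)) -> np.ndarray:
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    silhouettes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Foreground mask: the walking person against a static background.
        mask = subtractor.apply(frame)
        _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
        # Crop to the largest contour, assumed here to be the person.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        silhouettes.append(cv2.resize(mask[y:y + h, x:x + w], size))
    cap.release()
    # Per-pixel mean of the aligned silhouettes; two GEIs can then be
    # compared (e.g. by Euclidean distance) to match gaits.
    return np.mean(silhouettes, axis=0) if silhouettes else np.zeros(size[::-1])
```

In a real system, such a descriptor would be fed to a trained classifier rather than compared directly, but the sketch shows why no special camera is needed: only a coarse silhouette has to be recoverable from the footage.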

The technology itself is not new. Scientists in Japan, the UK and at the U.S. Defense Information Systems Agency have been researching gait recognition for over a decade, and professors from Osaka University have been working with the Japanese National Police Agency since 2013 to pilot gait recognition software.
