Study shows behavior patterns of internet users regarding cookies

15. August 2019

Research has been carried out into how European consumers interact with cookie consent mechanisms online.

The study focused in particular on how consumers react to different designs of cookie pop-ups and how different design choices can influence users’ data protection choices. The researchers collected around 5,000 cookie notices from leading websites to get an idea of how different cookie consent mechanisms are currently being implemented. They also worked with a German e-commerce site over a period of four months to investigate how more than 82,000 users of the site interacted with different cookie consent designs. The researchers varied the designs to analyse how different defaults and design choices affect the individual’s decision.

Their research showed that the majority of cookie consent notices are placed at the bottom of the screen (58%), do not block interaction with the site (93%) and offer no option other than a confirmation button (86%), leaving the user no real choice.

The majority (57%) also try to steer users towards consent through design, for example by highlighting the “Agreed” button with a color while making the link to “more options” less visible.

The research showed that interaction with consent notices varied widely, between 5% and 55%, depending on where they were positioned and which defaults were set. More users clicked the “Accept” button when it was highlighted in color (50.8% on mobile, 26.9% on desktop); in contrast, only 39.2% on mobile and 21.1% on desktop clicked “Accept” when it was displayed as a plain text link. As for third parties, around 30% of mobile users and 10% of desktop users accepted all third parties when the checkboxes were preselected, while only a small fraction (< 0.1%) allowed all third parties when given an opt-in choice.
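To make such figures concrete, acceptance rates of this kind can be tabulated from an interaction log per design variant and device. A toy sketch; the event format and variant names are assumptions for illustration, not the study’s actual data:

```python
from collections import Counter

# Toy interaction log in the spirit of the study: (design_variant, device, action).
events = [
    ("highlighted_button", "mobile", "accept"),
    ("highlighted_button", "mobile", "ignore"),
    ("text_link", "desktop", "accept"),
    ("text_link", "desktop", "ignore"),
    ("text_link", "desktop", "ignore"),
]

shown = Counter((variant, device) for variant, device, _ in events)
accepted = Counter((variant, device) for variant, device, action in events
                   if action == "accept")

for key in sorted(shown):
    rate = 100 * accepted[key] / shown[key]
    print(f"{key[0]} on {key[1]}: {rate:.1f}% accepted")
```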

They also found that the more choices are given in a cookie notice, the more likely it is that the visitor will refuse the use of cookies. In fact, they concluded that with GDPR-compliant cookie notices, where the website user is free to choose which cookie categories to activate, only 0.1% of users would allow all cookie categories to be set.

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom’s Information Commissioner’s Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that people understand.

With the internet, political campaigning has become increasingly sophisticated and innovative: campaigns now use new technologies and techniques to understand and target their voters, drawing on social media, the electoral register or the screening of names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO considers its current guidance outdated: it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. However, the framework does not establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018 to gather input from a wide range of people and organisations. The draft framework is intended to take into account the responses the ICO received in that process.

The draft framework code of practice, which could form the basis of a statutory code of practice if the relevant legislation is introduced, is now out for public consultation and will remain open until October 4th.

EDPB adopts Guidelines on processing of personal data through video devices

13. August 2019

Recently, the European Data Protection Board (EDPB) adopted its Guidelines on processing of personal data through video devices (“the guidelines”). The guidelines provide assistance, illustrated by several examples, on how to apply the GDPR to processing through video devices; the examples are not exhaustive but are applicable to all areas in which video devices are used.

In a first step, the guidelines set the scope of application. The GDPR only applies to the use of video devices if all of the following conditions are met (a minimal check along these lines is sketched below):

  • personal data is collected through the video device (e.g. a person is identifiable on the basis of their appearance or other specific elements),
  • the processing is not carried out by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and
  • the so-called “household exemption” does not apply (processing by a natural person in the course of a purely personal or household activity).
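Purely as an illustration, the three scoping conditions can be read as a conjunctive test. A minimal sketch with hypothetical flag names; the legal assessment is of course more nuanced than any such check:

```python
from dataclasses import dataclass

@dataclass
class VideoProcessing:
    # Hypothetical flags summarising the EDPB's three scoping conditions.
    identifies_persons: bool        # personal data is collected (people identifiable)
    law_enforcement_purpose: bool   # processing by competent authorities falls outside the GDPR
    purely_household: bool          # "household exemption" applies

def gdpr_applies(p: VideoProcessing) -> bool:
    """All three conditions from the guidelines must hold together."""
    return (
        p.identifies_persons
        and not p.law_enforcement_purpose
        and not p.purely_household
    )

# Example: a shop camera filming identifiable customers.
print(gdpr_applies(VideoProcessing(True, False, False)))  # True
```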

Before processing personal data through video devices, controllers must specify their legal basis for it. According to the guidelines, every legal ground under Article 6 (1) can provide a legal basis. The purposes for using video devices for processing personal data should be documented in writing and specified for every camera in use.

Another subject of the guidelines is the transparency of the processing: controllers have to inform data subjects about the video surveillance. The EDPB recommends a layered approach, combining several methods to ensure transparency. The most important information should appear on the warning sign itself (first layer), while the other mandatory details may be provided by other means (second layer). The second layer must also be easily accessible to data subjects.

The guidelines also deal with storage periods and technical and organizational measures (TOMs). Some member states may have specific provisions for storing video surveillance footage, but the EDPB recommends deleting the personal data after a few days, ideally automatically. As with any kind of data processing, the controller must adequately secure the data and must therefore have implemented technical and organizational measures. Examples provided include masking or scrambling areas that are not relevant to surveillance, or editing out images of third persons when providing video footage to data subjects.
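A minimal sketch of the automatic deletion the guidelines recommend, assuming recordings are stored as files whose modification time reflects the recording date; the path and retention period are illustrative:

```python
import time
from pathlib import Path

RETENTION_DAYS = 3  # "a few days", per the guidelines; adjust to local law
FOOTAGE_DIR = Path("/var/surveillance/footage")  # hypothetical storage location

def purge_old_footage() -> None:
    """Delete recordings older than the retention period."""
    cutoff = time.time() - RETENTION_DAYS * 24 * 3600
    for clip in FOOTAGE_DIR.glob("*.mp4"):
        if clip.stat().st_mtime < cutoff:
            clip.unlink()  # irreversible deletion of the recording

if __name__ == "__main__":
    purge_old_footage()  # run e.g. daily from a scheduler such as cron
```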

The guidelines are open for public consultation until September 9th, 2019; a final, revised version is planned for the end of 2019.

Privacy issues on Twitter and Instagram

12. August 2019

Both Twitter and Instagram admitted in the last week that they had privacy issues involving users’ personal data in connection with external advertising companies.

Twitter published a statement explaining that the settings users chose for ads on Twitter, especially regarding data sharing, were not always honoured. Twitter admitted that the settings may not have worked as intended. As a consequence, data may have been shared with advertising companies when a user clicked on or viewed an advertisement, and personalized ads may have been shown to users based on inferences. Both could have happened even if no permission was given.

The statement also says that the problems were fixed on August 5, 2019, and that no personal data such as passwords or email accounts was affected. Twitter is still investigating how many and which users were affected.

According to a report on Business Insider, Instagram had to admit that its trusted partner Hyp3r tracked millions of users’ location data, secretly saved their stories and flouted its rules. Hyp3r, a San Francisco startup, specializes in location-related advertising and evaluated millions of users’ public stories. The CEO of Hyp3r published a note on the company’s website rejecting comparisons with Cambridge Analytica, saying that no prohibited practices were used and that privacy is a major concern for the company. Whether this is true must remain open at this point. Be that as it may, for European users of the platform there is no known legal basis for such an approach.

Nonetheless, Instagram’s careless privacy and data security mechanisms enabled this approach. Even though Instagram ended the cooperation with Hyp3r and stated that it changed the platform to protect users, the Facebook-owned app’s problems with protecting users’ personal data remain.

CNIL and ICO publish revised cookie guidelines

6. August 2019

The French data protection authority CNIL as well as the British data protection authority ICO have revised and published their guidelines on cookies.

The guidelines contain several similarities, but also differ in some respects.

Both France and the UK consider the rules that apply to cookies to be equally applicable to any technology that stores or accesses information on a user’s device. In addition, both authorities stress that users must give specific, free and unambiguous consent before cookies are placed. Continuing to scroll the website cannot be considered consent. Likewise, obtaining consent through the T&Cs is not lawful: this practice violates Art. 7(2) of the General Data Protection Regulation (GDPR), according to which the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. In addition, all parties who place cookies must be named so that informed consent can be obtained. Finally, both authorities point out that browser settings alone are not a sufficient basis for valid consent.
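As a rough illustration of the opt-in principle both authorities describe, a server can refuse to set any non-essential cookie until the user has taken an explicit affirmative action. A minimal sketch using Flask; the endpoints, cookie name and lifetime are assumptions, not part of either guideline:

```python
# Minimal sketch: non-essential cookies are only set after an explicit opt-in.
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<a href='/consent?analytics=yes'>Accept analytics cookies</a>")
    # No analytics cookie here: absence of consent means no cookie is placed.
    return resp

@app.route("/consent")
def consent():
    resp = make_response("Preferences saved.")
    if request.args.get("analytics") == "yes":  # explicit, affirmative action
        resp.set_cookie("analytics_allowed", "1", max_age=60 * 60 * 24 * 180)
    return resp

if __name__ == "__main__":
    app.run()
```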

With regard to territorial scope, CNIL clarifies that its cookie rules apply only to the processing of cookies within the activities of an establishment of a controller or processor in France, regardless of whether the processing itself takes place in France. The UK guidance does not comment on this.

Cookie walls are considered non-compliant with the GDPR by the French data protection authority because of the negative consequences for users who refuse. The ICO, on the other hand, takes the view that consent forced by a cookie wall is probably not valid, but that the GDPR must be balanced against other rights; in this respect the ICO has not yet taken a clear position.

Regarding analytics cookies, CNIL explains that consent is not always necessary: it can be dispensed with if the cookies meet a list of cumulative requirements drawn up by CNIL. The ICO, on the other hand, does not exempt analytics cookies from the consent requirement.

Finally, CNIL notes that companies have six months to comply with the rules. However, this period will only start to run upon publication of a CNIL statement, which is still pending; CNIL expects the statement to be finalised during the first quarter of 2020. The ICO does not foresee such a grace period.

Amazon has Alexa recordings evaluated by temporary workers working from home

5. August 2019

According to a report by German newspaper “Welt am Sonntag”, Amazon has Alexa’s voice recordings listened to not only by its own employees, but also by Polish temporary workers.

For some time now, Amazon has been criticised because recordings made by the Alexa voice assistant are listened to and transcribed by employees in order to improve speech recognition. For a long time, however, users were unaware of this practice.

It has now become known that temporary workers listen to and evaluate the recordings from home using a remote-work program. Until recently, a Polish recruitment agency advertised “teleworking all over the country”, although Amazon had previously assured that the voice recordings would only be evaluated in specially protected offices. One of the Polish temporary workers stated that many of them work from home and that the recordings contained personal data such as names or places that allowed conclusions to be drawn about the person concerned.

Upon request, Amazon confirmed the newspaper’s findings. A spokesman said that some employees were allowed to work from locations other than Amazon offices, but that they had to observe particularly strict rules; in particular, working in public places is not allowed.

On the same day, the online job advertisements were deleted and Amazon introduced a new data protection option: users can now explicitly object to the post-processing of their recordings by Amazon employees and block it.

Other voice assistants have also suspended, or are about to suspend, human evaluation of recordings, at least for European users. According to Google, around 0.2% of recordings were subsequently listened to, while Apple and Amazon say it is less than 1%. Google already deactivated the function three months ago, and Apple also intends to suspend the evaluation and later ask its users explicitly whether it may be resumed.

Chinese police use gait recognition for identification

30. July 2019

The police in Beijing and Shanghai have begun to use a new form of surveillance: gait recognition technology, which analyzes people’s body shapes and the way they walk in order to identify them, even when their faces are hidden from the camera.

The gait recognition software is part of an advance in China towards the development of artificial intelligence and data-driven surveillance.

On its website, the Chinese technology startup Watrix explains that, compared with other biometrics such as the face, iris, palm print and fingerprint, gait features can be captured and recognized remotely, even from low-resolution video. Because gait recognition works contactlessly, at long range and is difficult to disguise, the company says it closes a gap in the market for remote identification in the public security industry. “You don’t need people’s cooperation for us to be able to recognize their identity,” Huang Yongzhen, the CEO of Watrix, said in an interview. “Gait analysis can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body.”

Watrix’s software extracts a person’s silhouette from video and analyzes its movements to create a model of the person’s gait. It cannot yet identify people in real time, however: users must upload videos to the program. No special cameras are needed, though; the software can analyze gait in footage from regular surveillance cameras.
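Watrix has not published its method, but a classic baseline for silhouette-based gait matching is the gait energy image (GEI): binarized silhouettes are averaged over a walking sequence and compared against enrolled templates. A rough sketch under that assumption, using OpenCV; this illustrates the general technique, not Watrix’s actual system:

```python
import cv2
import numpy as np

def gait_energy_image(video_path: str, size=(64, 44)) -> np.ndarray:
    """Average binarized foreground silhouettes over the whole clip."""
    subtractor = cv2.createBackgroundSubtractorMOG2()
    cap = cv2.VideoCapture(video_path)
    acc, frames = np.zeros(size, dtype=np.float64), 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                # foreground silhouette
        mask = cv2.resize(mask, (size[1], size[0]))   # normalize scale
        acc += (mask > 127).astype(np.float64)
        frames += 1
    cap.release()
    return acc / max(frames, 1)

def match(probe: np.ndarray, gallery: dict) -> str:
    """Nearest enrolled identity by Euclidean distance between GEIs."""
    return min(gallery, key=lambda name: np.linalg.norm(probe - gallery[name]))
```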

The technology is not new. Scientists in Japan, the UK and the U.S. Defense Information Systems Agency have been researching gait detection for over a decade. Professors from the University of Osaka have been working with the Japanese National Police Agency since 2013 to pilot the gait recognition software.


Settlement of $13 Million for Google in Street View Privacy Case

In an attempt to settle long-running class-action litigation started in 2010, Google has agreed to pay $13 million over claims that it violated U.S. wiretapping laws. The issue arose from the vehicles used for its Street View mapping project, which captured and collected personal data from private wifi networks along the way.

Street View is a feature that lets users interact with panoramic and detailed images of locations all around the world. The legal action began when several people whose data was collected sued Google after it admitted the cars photographing neighborhoods for Street View had also gathered emails, passwords and other private information from wifi networks in more than 30 countries.

While the company was quick to call this data collection a mistake, investigators found that Google engineers had deliberately built the capture of personal data into the vehicles’ software in order to collect personal data from accessible networks.

Under the new agreement, Google would be required to destroy any data collected via Street View, to agree not to use Street View to collect personal data from wifi networks without consent, and to create web pages and instructions explaining to people how to secure their wireless content.

Google had been asked to refrain from using and collecting personal data from wifi networks in an earlier settlement in 2013, which raises questions as to why it was necessary to include it in the current settlement as well.


CNIL fines French insurance company

26. July 2019

The French Data Protection Authority (CNIL) imposed a fine of €180,000 on a French insurance company for failing to secure customer data on its website.

Active Assurance is an insurance intermediary and distributor of motor insurance to customers. On its website, customers can request offers, subscribe to contracts and access their personal space.

In 2018, CNIL received a complaint from an Active Assurance customer saying that he had been able to access other users’ data. The other accounts were accessible via hypertext links indexed by a search engine, and customers’ documents could also be retrieved by slightly changing the URL. Among those records were drivers’ licences, bank statements and documents revealing whether someone had been subject to a licence withdrawal or a hit-and-run.

CNIL informed the company about the violations, and a few days later the company stated that measures had been taken to rectify the infringements. After an on-site audit at the company’s premises, however, CNIL found that the measures taken were not sufficient and that Active Assurance had violated Art. 32 GDPR. Active Assurance should have ensured that only authorized persons had access to the documents; it should also have instructed customers to use strong passwords and should not have sent them their passwords in plain text by e-mail.
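The URL-manipulation flaw described above is a classic insecure direct object reference: documents were apparently served based on the identifier in the URL alone. A minimal sketch of the missing ownership check, in a hypothetical Flask app whose names are illustrative and not Active Assurance’s actual code:

```python
# Sketch: serve a document only after verifying the requester owns it.
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "change-me"  # required for sessions; use a real secret in production

DOCUMENT_OWNERS = {"doc-1001": "alice", "doc-1002": "bob"}  # toy ownership registry

@app.route("/documents/<doc_id>")
def get_document(doc_id: str):
    user = session.get("user")
    if user is None:
        abort(401)  # not logged in
    if DOCUMENT_OWNERS.get(doc_id) != user:
        abort(403)  # guessing another customer's URL must fail, not leak the file
    return send_file(f"/srv/docs/{doc_id}.pdf")
```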

Based on the seriousness of the breach and the number of people affected, CNIL imposed a fine of €180,000.

FaceApp reacts to privacy concerns

22. July 2019

The picture editing app FaceApp, which has become increasingly popular on social media, has been confronted with various concerns about its privacy practices.

Created in Russia by a four-person start-up company, the app applies a newly developed technology that uses neural networks to modify a face in any photo while remaining photorealistic. In this process, no filters are placed on the photo, but the image itself is modified with the help of deep learning technology.

However, the app has been accused of not explaining that the images are uploaded to a cloud for editing. In addition, it has been accused of uploading not only the image selected by the user, but also the entire camera roll in the background. The latter in particular raises serious security concerns, given the large number of screenshots people nowadays take of sensitive information such as login credentials or bank details.

While there is no evidence for the latter accusation, and FaceApp emphasizes in its statement that no image other than the one chosen by the user is uploaded, the company does confirm the upload to a cloud.

FaceApp justifies the cloud upload with performance and traffic considerations: the developers want to ensure that the user does not have to upload the photo again for each editing step.
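That rationale amounts to server-side caching keyed on the photo itself. A minimal sketch of the idea, assuming a content-hash key; this illustrates the stated design choice, not FaceApp’s actual implementation:

```python
# Sketch: upload a photo once, then reference it by content hash for further edits.
import hashlib

UPLOADED = {}  # content hash -> stored image bytes (stand-in for cloud storage)

def upload_once(image_bytes: bytes) -> str:
    """Return a handle for the image, uploading only if it is not cached yet."""
    key = hashlib.sha256(image_bytes).hexdigest()
    if key not in UPLOADED:
        UPLOADED[key] = image_bytes  # single upload, reused by every later edit
    return key

def apply_edit(key: str, effect: str) -> bytes:
    """Each edit references the cached upload instead of re-sending the photo."""
    image = UPLOADED[key]
    return image  # the actual neural editing would happen server-side here
```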

Finally, FaceApp declares that no user data is sold or passed on to third parties. Also, in 99% of cases the company is unable to identify a person, because the app can be used without registration, and a large number of users actually do so.
