
WP29 Guidelines on the notion of consent according to the GDPR – Part 1

26 January 2018

According to the GDPR, consent is one of the six lawful bases mentioned in Art. 6. In order for consent to be valid and compliant with the GDPR, it needs to reflect the data subject's real choice and control.

The Working Party 29 (WP29) clarifies and specifies the “requirements for obtaining and demonstrating” such valid consent in its guidelines released in December 2017.

The guidelines start off with an analysis of Article 4(11) of the GDPR and then discuss the elements of valid consent. Referring to Opinion 15/2011 on the definition of consent, they recall that “obtaining consent also does not negate or in any way diminish the controller’s obligations to observe the principles of processing enshrined in the GDPR, especially Article 5 of the GDPR with regard to fairness, necessity and proportionality, as well as data quality.”

The WP29 illustrates the elements of valid consent: consent must be freely given, specific, informed and unambiguous. For example, consent is not considered freely given if a photo-editing mobile app requires its users to activate GPS location simply in order to collect behavioural data unrelated to the photo editing. The WP29 emphasizes that consent to the processing of unnecessary personal data “cannot be seen as a mandatory consideration in exchange for performance.”

Another important aspect taken into consideration is an imbalance of power, e.g. vis-à-vis public authorities or in the context of employment. “Consent can only be valid if the data subject is able to exercise a real choice, and there is no risk of deception, intimidation, coercion or significant negative consequences (e.g. substantial extra costs) if he/she does not consent. Consent will not be free in cases where there is any element of compulsion, pressure or inability to exercise free will.”

Art. 7(4) GDPR emphasizes that the performance of a contract may not be made conditional on consent to the processing of personal data that is not necessary for the performance of that contract. The WP29 states that “compulsion to agree with the use of personal data additional to what is strictly necessary limits data subject’s choices and stands in the way of free consent.” Depending on the scope of the contract or service, the term “necessary for the performance of a contract… …needs to be interpreted strictly”. The WP29 lays down examples of cases where such bundling is nevertheless acceptable.

If a service involves multiple processing operations or multiple purposes, the data subject should have the freedom to choose which purposes they accept. This concept of granularity requires the purposes to be separated and consent to be obtained for each purpose.
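To make the idea concrete, here is a minimal sketch of how a controller might record separate, per-purpose consent, with timestamps as evidence. This data model is an illustrative assumption, not something the WP29 or the GDPR prescribes:

```python
# Hypothetical per-purpose consent record (illustrative data model only).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PurposeConsent:
    purpose: str        # e.g. "photo_editing", "behavioural_advertising"
    granted: bool
    timestamp: datetime # when the choice was recorded, as evidence

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: dict = field(default_factory=dict)

    def set(self, purpose: str, granted: bool) -> None:
        """Record a separate, freely given choice for one purpose."""
        self.purposes[purpose] = PurposeConsent(
            purpose, granted, datetime.now(timezone.utc))

    def allowed(self, purpose: str) -> bool:
        """Processing is only allowed for explicitly consented purposes."""
        pc = self.purposes.get(purpose)
        return bool(pc and pc.granted)

record = ConsentRecord("user-123")
record.set("photo_editing", True)
record.set("behavioural_advertising", False)  # refusing one purpose
assert record.allowed("photo_editing")        # must not block the others
assert not record.allowed("behavioural_advertising")
```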

Withdrawal of consent has to be possible without detriment, e.g. in terms of additional costs or a downgrade of services. Any other negative consequence, such as deception, intimidation or coercion, is also considered to invalidate consent. The WP29 therefore advises controllers to make sure they can prove that consent has been given accordingly.

(to be continued in Part 2)

Will Visa Applicants for the USA have to reveal their Social Media Identities in the future?

11 January 2018

The U.S. Department of State aims to have visa applicants answer supplemental questions, including information about social media. A 30-day notice was published in November in order to gather opinions from all interested individuals and organizations. The goal is to establish a legal basis for the “proper collection of all information necessary to rigorously evaluate all grounds of inadmissibility or deportability, or grounds for the denial of other immigration benefits”.

In concrete terms, applicants would have to reveal the social media identifiers they have used during the last five years. The State Department stresses that “the collection of social media platforms and identifiers will not be used to deny visas based on applicants’ race, religion, ethnicity, national origin, political views, gender, or sexual orientation.”

Meanwhile, the Electronic Privacy Information Center (EPIC) has submitted its comments, asking for the withdrawal of the proposal to collect social media identifiers and for a review of the appropriateness of using social media to make visa determinations.

EPIC criticizes the lack of transparency, as it is “not clear how the State Department intends to use the social media identifiers”, and adds that “the benefits for national security” remain vague. The organization also expresses concerns because the collection of these data enables enhanced profiling and tracking of individuals as well as large-scale surveillance of innocent people, perhaps even leading to secret profiles.

It remains to be seen how the situation develops and how public opinion influences the outcome.

Indian government urges people to sign up to Aadhaar – the world’s largest biometric ID system – while the Supreme Court still needs to determine its legality

28 December 2017

As reported in August of this year, the Indian Supreme Court (SC) acknowledged that the right to privacy is “intrinsic to life and liberty” and is “inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.”

In the same context, the SC had announced that it would hear petitions on Aadhaar-related matters in November (the term – meaning “foundation” – stands for a 12-digit unique identity number supposedly issued to all Indian residents based on their biometric and demographic data).

According to a Bloomberg report, India’s Prime Minister Narendra Modi is calling for an expansion of Aadhaar, even though its constitutionality is still being debated. The SC has set January 10th as the beginning of the final hearings.

While officials say Aadhaar saves the government billions of dollars by better targeting beneficiaries of subsidized food and cash transfers, critics point to unfair exclusions and data leaks. On the one hand, they fear that the database might turn India into a surveillance state. On the other hand, they are concerned about the high risk of major leaks, such as the ones reported by the Indian news agency PTI (Press Trust of India): “Personal details of several Aadhaar users were made public on over 200 central and state government websites.”

Meanwhile, Medianama, a source of information and analysis on digital and telecom businesses in India, has compiled a list of leaks reported so far and encourages people to point out similar incidents.


WP 29 adopts guidelines on transparency under the GDPR

21 December 2017

The Article 29 Working Party (WP29) has adopted guidelines on transparency under the General Data Protection Regulation (GDPR). The guidelines intend to bring clarity to the transparency requirement regarding the processing of personal data and give practical advice.

Transparency as such is not defined in the GDPR. However, Recital 39 describes what the transparency obligation requires when personal data is processed. Providing information to a data subject about the processing of personal data is one major aspect of transparency.

In order to explain transparency and its requirements, the WP29 points out “elements of transparency under the GDPR” and explains its understanding of them. The following elements are named and described:

– “Concise, transparent, intelligible and easily accessible”
– “Clear and plain language”
– “Providing information to children”
– “In writing or by other means”
– “…the information may be provided orally”
– “Free of charge”

In a schedule, the WP29 lists which information shall be provided to a data subject under Art. 13 and Art. 14 GDPR and which information is not required.

New and surprising password guidelines released by NIST

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management, has admitted to regretting the recommendations he published back in 2003, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements don’t necessarily improve the level of security; they might even make it easier for hackers to crack the code, as people tend to make only minor changes when forced to update an already complex password – usually pressed for time.

This is why the NIST now recommends letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. The NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

As long as a password is not commonly used, it takes computers years to crack passwords of 20 or more characters.
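A rough back-of-the-envelope calculation illustrates why length dominates. The alphabet size and guess rate below are assumptions chosen for illustration, not NIST figures:

```python
# Worst-case brute-force time by password length (illustrative assumptions:
# lowercase-only alphabet, 10^12 guesses per second; not NIST figures).
ALPHABET_SIZE = 26
GUESSES_PER_SECOND = 1e12

def years_to_exhaust(length: int) -> float:
    """Time to try every lowercase password of the given length."""
    keyspace = ALPHABET_SIZE ** length
    return keyspace / GUESSES_PER_SECOND / (60 * 60 * 24 * 365)

for n in (8, 12, 20):
    print(f"{n} characters: ~{years_to_exhaust(n):.1e} years")
# 8 characters:  ~6.6e-09 years (a fraction of a second)
# 12 characters: ~3.0e-03 years (about a day)
# 20 characters: ~6.3e+08 years (hundreds of millions of years)
```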

The NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to, passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
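A minimal sketch of such a screening step might look as follows; the list contents and helper names are illustrative assumptions, not part of the NIST guidelines:

```python
# Hypothetical password screening along the lines NIST describes.
# The lists below are illustrative stand-ins for real breach corpuses
# and context-specific words.
BREACHED = {"password", "123456", "qwerty"}   # from previous breach corpuses
CONTEXT_WORDS = {"exampleservice", "alice"}   # service name, username

def has_repeated_run(pw: str, run: int = 4) -> bool:
    """Detects repetitive characters such as 'aaaaaa'."""
    return any(pw[i:i + run] == pw[i] * run for i in range(len(pw) - run + 1))

def has_sequential_run(pw: str, run: int = 4) -> bool:
    """Detects sequential characters such as '1234' or 'abcd'."""
    for i in range(len(pw) - run + 1):
        chunk = pw[i:i + run]
        if all(ord(b) - ord(a) == 1 for a, b in zip(chunk, chunk[1:])):
            return True
    return False

def screen(candidate: str) -> bool:
    """Return True if the candidate password passes all checks."""
    low = candidate.lower()
    if low in BREACHED:
        return False
    if any(word in low for word in CONTEXT_WORDS):  # incl. simple derivatives
        return False
    return not (has_repeated_run(low) or has_sequential_run(low))

assert not screen("1234abcd")           # rejected: sequential characters
assert screen("correct horse battery")  # a long passphrase passes
```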

With this, the NIST completely abandons its own former suggestions, to the great relief of industries all over:

“Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.”

French Data Protection Commission threatens WhatsApp with sanctions

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp’s data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp had announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as “WP29”) asked the company to stop the data transfer for targeted advertising, as French law doesn’t provide an adequate legal basis for it.

“While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the ‘business intelligence’ purpose which aims at improving performances and optimizing the use of the application through the analysis of its users’ behavior.”

In the wake of the request, WhatsApp had assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now found that the users’ consent was not validly collected, as it lacked two essential aspects of data protection law: specificity and free choice. It also rejects legitimate interest as a legal basis, since users’ fundamental rights are not preserved: the application cannot be used if the data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users’ data transferred to Facebook, but refused to do so because, being located in the United States, “it considers that it is only subject to the legislation of this country.”

The CNIL has therefore issued a formal notice to WhatsApp, again requesting that the company comply with the requirements within one month, and states:

“Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL’s restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company.”


WP29: Guidelines on profiling and automated decision-making

19 October 2017

The Article 29 Data Protection Working Party (WP29) has adopted guidelines on automated individual decision-making and profiling, both of which are addressed by the General Data Protection Regulation (GDPR). The GDPR will apply from 25 May 2018. The WP29 acknowledges that “profiling and automated decision-making can be useful for individuals and organisations as well as for the economy and society as a whole”; “increased efficiencies” and “resource savings” are two of the examples named.

However, it was also stated that “profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards”. One risk could be that profiling may “perpetuate existing stereotypes and social segregation”.

The guidelines cover, inter alia, definitions of profiling and automated decision-making as well as the GDPR’s general approach to them. They address the fact that the GDPR introduces provisions to ensure that the use of profiling and automated decision-making does not have an “unjustified impact on individuals’ rights” and name examples such as “specific transparency and fairness requirements” and “greater accountability obligations”.

Moscow adds facial recognition to its network of surveillance cameras

2 October 2017

Moscow is adding facial recognition to its network of 170,000 surveillance cameras across the city in order to identify criminals and boost security, Bloomberg reports. The camera surveillance started in 2012. The recordings are held for five days after they are captured, with some 20 million hours of video material stored at any one time. “We soon found it impossible to process such volumes of data by police officers alone,” said Artem Ermolaev, Head of the Department of Information Technology in Moscow, according to Bloomberg. “We needed an artificial intelligence to help find what we are looking for,” he added.

A Russian start-up named N-Tech.Lab Ltd designed the facial recognition technology. The start-up is known for its mobile app FindFace, released last year, which makes it possible to search for users of the Russian social network VKontakte by taking a picture of a person’s face and matching it against VKontakte user profiles.

However, due to high costs, the facial recognition technology will not be deployed to every camera, but only installed selectively in the districts where it is needed most. The Moscow government reportedly already spends about $86 million a year to maintain the camera surveillance, and this amount would triple if every camera used the new facial recognition technology.

The new technology is used to cross-reference images captured by the cameras with those from the Interior Ministry’s database.

New Zealand: Police uses backdoor in law to gather private data

5 September 2017

According to the New Zealand Council of Civil Liberties, banks have in several cases handed private data over to the police at the police’s request. It is further explained that the police used official-looking forms instead of applying to a judge for a search warrant or examination order. The police reportedly have neither oversight of these requests nor a register tracking how many have been filed.

The police and the banks rely on a legal loophole in the Privacy Act that allows organisations to reveal information about persons in order “to avoid prejudice to the maintenance of the law”. Privacy Commissioner John Edwards wants to end the further use of this backdoor. Referring to the case of the handover of private information about activist and journalist Martyn Bradbury, he said:

“…we concluded that Police had collected this information in an unlawful way by asking for such sensitive information without first putting the matter before a judicial officer. Our view is that this was a breach of Principle 4 of the Privacy Act, which forbids agencies from collecting information in an unfair, unreasonable or unlawful way.”

TalkTalk fined by ICO

11 August 2017

According to a press release from the Information Commissioner’s Office (“ICO”), the TalkTalk Telecom Group (“TalkTalk”) has been fined for violating the UK Data Protection Act. More than 21,000 customers could become victims of scams and fraud.

Following an investigation into a 2014 breach, the ICO fined TalkTalk GBP 100,000 for failing to protect customer data. The breach was possible because of a lack of security in a portal holding a huge amount of customer data. One company with access to the portal was Wipro, an IT services company in India; 40 Wipro employees had access to the personal data of between 25,000 and 50,000 customers. During the investigation, three accounts were found to have had unauthorized access to this portal. The ICO determined that TalkTalk did not ensure the security of the customer data held in this portal, for several reasons:

  • The portal was accessible from any device; there was no restriction on the devices from which it could be accessed.
  • The portal’s search engine allowed wildcard searches (with * as a placeholder matching anything) – a sketch of the risk follows after this list.
  • The search engine returned up to 500 results per search.
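Why this combination is dangerous can be shown with a minimal sketch, assuming a SQL-backed portal; the schema, data and code below are invented for illustration and are not TalkTalk’s actual system:

```python
# Illustrative only: a '*' from the user becomes an SQL '%' wildcard,
# so a single query can match every customer record in the portal.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, account TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("Alice", "A-1001"), ("Bob", "B-2002"), ("Carol", "C-3003")])

def search(pattern: str, limit: int = 500) -> list:
    """Portal-style search: '*' acts as a wildcard, up to `limit` hits."""
    like = pattern.replace("*", "%")  # user wildcard -> SQL wildcard
    return conn.execute(
        "SELECT name, account FROM customers WHERE name LIKE ? LIMIT ?",
        (like, limit),
    ).fetchall()

print(search("*"))  # matches every row: bulk extraction, 500 records at a time
```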

The access rights were too wide-ranging given the large amount of customer data held in the portal. The ICO fined TalkTalk for breaching one of the principles of the UK Data Protection Act by not implementing sufficient technical and organizational measures.
