Category: Personal Data

Protection against automated decision making with personal data becomes a human right

30. May 2018

Alongside the new data protection legislation in the EU, the worldwide standard of data protection is rising as well. Through the “Amendment of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data” (such processing is commonly known as “profiling”), the European Court of Human Rights (ECtHR) will in future apply this expansion of the European Convention on Human Rights (ECHR).

For the last four decades, the Convention has been the only international legally binding instrument for the protection of privacy and personal data open to any country in the world. The aim of the amendment is to modernise and improve the Convention, taking into account the new challenges to the protection of individuals with regard to the processing of personal data that have arisen since the Convention was adopted in 1980. In particular, this concerns new information and communication technologies, which call for a different type of privacy protection mechanism.

As with any other human right listed in the ECHR, any person can submit an individual application if his or her rights have been violated by one of the contracting parties to the ECHR. This seems particularly interesting with regard to investigations involving profiling by national security authorities across the European continent.

However, the adoption of the amendments also raises some questions, particularly with regard to the relationship between European Union law and the Convention, which contains no explicit provisions in this respect, as well as to deviations in the scope of application. It is therefore to be hoped that the ECtHR will comment on these issues before the first lawsuits are filed.

Category: Personal Data

China's National Standard on Personal Information Security (GB/T 35273-2017) Went into Effect

14. May 2018

On May 1, 2018, the Information Security Technology – Personal Information Security Specification (the “Specification”) went into effect in China. The Specification is not mandatory and cannot be enforced directly. Nonetheless, it could become important as a guideline or reference for Chinese administrative and enforcement agencies.
The “Specification” embodies a framework concerning the collection, retention, use, sharing and transfer of personal information.

The Information Security Technology – Personal Information Security Specification establishes primary rules for personal information security, notice and consent requirements, security measures, rights of data subjects and requirements related to internal administration and management.
It distinguishes between personal information and sensitive personal information; for the latter, specific obligations apply to its collection and use.
Under the “Specification”, sensitive personal information means information such as personal identity information (ID card or passport number), financial information (bank account number or credit information) and biometric identifying information (fingerprint or iris data).

Even though the “Specification” is not binding, it may become significant within China because it establishes benchmarks for the processing of personal information by a wide variety of entities and organizations. Companies that collect or process personal information should make sure that their practices in China comply with the “Specification”.

Category: General · Personal Data

WP29 Guidelines on the notion of consent according to the GDPR – Part 2

3. April 2018

Continuing from the article about the Working Party 29 (WP29) guidelines on consent, additional elements of the term should be considered, as consent plays a key role in the processing of personal data.

The GDPR further requires consent to be specific, i.e. the data subject must be informed about the purpose of the processing and be safeguarded against function creep. The data controller has to, again, be granular when it comes to multiple consent requests and must clearly separate information regarding consent from other matters.

If the data controller wishes to process the data for a new purpose, he will have to seek new consent from the data subject and cannot use the original consent as a legitimisation for the processing of further or new purposes.

Consent will also be invalid if the data controller does not comply with the requirements for informed consent. The WP29 lists six key points for consent to be informed, focusing on the aspect that the data subject genuinely needs to understand the processing operations at hand. Information has to be provided in clear and plain language and should not be hidden in general terms and conditions.

Furthermore, consent has to be an unambiguous indication of wishes, i.e. it must always be given through an active motion or declaration. For example, the use of pre-ticked opt-in boxes is invalid.

However, explicit consent is required in situations where serious data protection risks emerge, such as the processing of special categories of data pursuant to Art. 9 GDPR.

In general, the burden of proof lies with the data controller according to Art. 7 GDPR, which does not prescribe any specific methods. The WP29 recommends that consent should be refreshed at appropriate intervals.

Withdrawing consent has to be as easy as giving it and should be possible without detriment.

The WP29 also recommends that data controllers assess whether processing of data is appropriate irrespective of data subjects’ requests.

WP29 Guidelines on the notion of consent according to the GDPR – Part 1

26. January 2018

Under the GDPR, consent is one of the six lawful bases mentioned in Art. 6. In order for consent to be valid and compliant with the GDPR, it needs to reflect the data subject's real choice and control.

The Working Party 29 (WP 29) clarifies and specifies the “requirements for obtaining and demonstrating” such a valid consent in its Guidelines released in December 2017.

The guidelines start off with an analysis of Article 4(11) of the GDPR and then discuss the elements of valid consent. Referring to Opinion 15/2011 on the definition of consent, “obtaining consent also does not negate or in any way diminish the controller’s obligations to observe the principles of processing enshrined in the GDPR, especially Article 5 of the GDPR with regard to fairness, necessity and proportionality, as well as data quality.”

The WP29 illustrates the elements of valid consent: consent must be freely given, specific, informed and unambiguous. For example, consent is not considered freely given if a mobile app for photo editing requires users to activate their GPS location simply in order to collect behavioural data unrelated to the photo editing. The WP29 emphasizes that consent to the processing of unnecessary personal data “cannot be seen as a mandatory consideration in exchange for performance.”

Another important aspect taken into consideration is the imbalance of power, e.g. vis-à-vis public authorities or in the context of employment. “Consent can only be valid if the data subject is able to exercise a real choice, and there is no risk of deception, intimidation, coercion or significant negative consequences (e.g. substantial extra costs) if he/she does not consent. Consent will not be free in cases where there is any element of compulsion, pressure or inability to exercise free will.”

Art. 7(4) GDPR emphasizes that the performance of a contract must not be made conditional on consent to the processing of personal data that is not necessary for the performance of that contract. The WP29 states that “compulsion to agree with the use of personal data additional to what is strictly necessary limits data subject’s choices and stands in the way of free consent.” Depending on the scope of the contract or service, the term “necessary for the performance of a contract … needs to be interpreted strictly”. The WP29 lays down examples of cases where such bundling is acceptable.

If a service involves multiple processing operations or multiple purposes, the data subject should have the freedom to choose which purposes they accept. This concept of granularity requires the purposes to be separated and consent to be obtained for each purpose.

Withdrawal of consent has to be possible without any detriment, e.g. in terms of additional costs or a downgrade of services. Any other negative consequence, such as deception, intimidation or coercion, is also considered to invalidate consent. The WP29 therefore advises controllers to ensure that they can prove consent has been given accordingly.

(to be continued soon in Part 2)

Will Visa Applicants for the USA Have to Reveal their Social Media Identities in Future?

11. January 2018

The U.S. Department of State is aiming to require visa applicants to answer supplemental questions, including information about social media. A 30-day notice was published in November in order to gather opinions from all interested individuals and organizations. The goal is to establish a legal basis for the “proper collection of all information necessary to rigorously evaluate all grounds of inadmissibility or deportability, or grounds for the denial of other immigration benefits”.

In concrete terms, applicants are supposed to reveal their social media identifiers used during the last five years. The State Department stresses the fact that “the collection of social media platforms and identifiers will not be used to deny visas based on applicants’ race, religion, ethnicity, national origin, political views, gender, or sexual orientation.”

Meanwhile, the Electronic Privacy Information Center (EPIC) has submitted its comments asking for withdrawal of the proposal to collect social media identifiers and for review of the appropriateness of using social media to make visa determinations.

EPIC not only criticizes the lack of transparency, as it is “not clear how the State Department intends to use the social media identifiers”, and further notes that “the benefits for national security” do not seem precise. The organization also expresses concerns because the collection of these data enables enhanced profiling and tracking of individuals as well as large-scale surveillance of innocent people, perhaps even leading to secret profiles.

It remains to be seen how the situation develops and how the public opinion influences the outcome.

Indian government urges people to sign up to Aadhaar – the world’s largest biometric ID system – while the Supreme Court still needs to determine its legality

28. December 2017

As reported in August of this year, the Indian Supreme Court (SC) acknowledged that the right to privacy is “intrinsic to life and liberty” and is “inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.”

In the same context, the SC had announced that it would be hearing petitions on Aadhaar-related matters (the term – meaning “foundation” – stands for a 12-digit unique identity number supposedly issued to all Indian residents based on their biometric and demographic data) in November.

According to a Bloomberg report, India's Prime Minister Narendra Modi is calling for an expansion of Aadhaar, even though its constitutionality is still to be debated. The SC has set January 10th as the beginning of the final hearings.

While officials say Aadhaar is saving the government billions of dollars by better targeting beneficiaries of subsidized food and cash transfers, critics point to unfair exclusions and data leaks. On the one hand, critics fear that the database might turn India into a surveillance state. On the other hand, they are concerned about the high risk of major leaks, such as the ones reported by the Press Trust of India (PTI): “Personal details of several Aadhaar users were made public on over 200 central and state government websites.”

Meanwhile, Medianama, a source of information and analysis on digital and telecom businesses in India, has published a list of leaks that have already occurred and encourages people to point out any similar incidents.

Category: Data Breach · General · India · Personal Data

WP 29 adopts guidelines on transparency under the GDPR

21. December 2017

The Article 29 Working Party (WP 29) has adopted guidelines on transparency under the General Data Protection Regulation (GDPR). The guidelines are intended to clarify the transparency requirements regarding the processing of personal data and give practical advice.

Transparency as such is not defined in the GDPR. However, Recital 39 describes what the transparency obligation requires when personal data is processed. Providing information to a data subject about the processing of personal data is one major aspect of transparency.

In order to explain transparency and its requirements, the WP 29 points out “elements of transparency under the GDPR” and explains its understanding of them. The following elements are named and described:

– “Concise, transparent, intelligible and easily accessible”
– “Clear and plain language”
– “Providing information to children”
– “In writing or by other means”
– “…the information may be provided orally”
– “Free of charge”

In a schedule, the WP 29 lists which information under Art. 13 and Art. 14 GDPR shall be provided to a data subject and which information is not required.

New and surprising password guidelines released by NIST

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management, has admitted to regretting the recommendations he published back in 2003, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements do not necessarily improve the level of security, but might instead even make it easier for hackers to crack the code, as people tend to make only minor changes when they have to change an already complex password – usually pressed for time.

This is why the NIST is now recommending letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. The NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

It takes years for computers to figure out passwords with 20 or more characters as long as the password is not commonly used.

The NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
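The NIST guidelines themselves contain no code; purely as an illustration, the following minimal Python sketch shows how a sign-up form might apply these recommendations – a minimum length check plus screening against a blocklist of compromised or easily guessed passwords, with no composition rules and no forced periodic changes. The file name blocklist.txt, the 8-character minimum and the service name are assumptions made for this example, not part of the NIST text.

MIN_LENGTH = 8  # assumed minimum length for memorized secrets

def load_blocklist(path="blocklist.txt"):
    # Assumed format: one known-bad password per line (breach corpuses, dictionary words, ...)
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def is_acceptable(password, blocklist, service_name="example-service"):
    candidate = password.lower()
    if len(password) < MIN_LENGTH:
        return False  # too short
    if candidate in blocklist:
        return False  # appears in the list of compromised or commonly chosen passwords
    if service_name.lower() in candidate:
        return False  # context-specific word (name of the service)
    if len(set(candidate)) == 1:
        return False  # purely repetitive characters, e.g. 'aaaaaa'
    # No uppercase/digit/symbol requirements and no expiry check, in line with the advice above.
    # Detecting sequential patterns such as '1234abcd' is omitted for brevity.
    return True

if __name__ == "__main__":
    blocklist = load_blocklist()
    print(is_acceptable("1234abcd", blocklist))      # False if that string is in blocklist.txt
    print(is_acceptable("correct horse battery staple", blocklist))  # long passphrase

In practice, the blocklist lookup would typically be backed by a breach-corpus service or database rather than a local file, but the principle – length plus screening instead of complexity rules – stays the same.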

Subsequently, the NIST completely abandons its own earlier suggestions, to the great relief of industries everywhere:

“Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.”

French Data Protection Commission threatens WhatsApp with sanctions

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp's data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp had announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as “WP29”) asked the company to stop the data transfer for targeted advertising, as French law does not provide an adequate legal basis for it.

“While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the ‘business intelligence’ purpose which aims at improving performances and optimizing the use of the application through the analysis of its users’ behavior.”

In the wake of the request, WhatsApp had assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now not only come to the conclusion that the users’ consent was not validly collected, as it lacked two essential aspects of data protection law: being specific and being freely given. It also denies a legitimate interest when it comes to preserving the fundamental rights of users, based on the fact that the application cannot be used if the data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users’ data transferred to Facebook, but refused to do so because, being located in the United States, “it considers that it is only subject to the legislation of this country.”

The CNIL has therefore issued a formal notice to WhatsApp, again requesting that it comply with the requirements within one month, and states:

“Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL’s restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company.”

 

WP29: Guideline for profiling and automated decision-making

19. October 2017

The Article 29 Data Protection Working Party (WP29) has adopted guidelines on automated individual decision-making and profiling, both of which are addressed by the General Data Protection Regulation (GDPR). The GDPR will be applicable from 25 May 2018. The WP29 acknowledges that “profiling and automated decision-making can be useful for individuals and organisations as well as for the economy and society as a whole”. “Increased efficiencies” and “resource savings” are two of the benefits named.

However, it was also stated that “profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards”. One risk could be that profiling may “perpetuate existing stereotypes and social segregation”.

The guidelines cover, inter alia, definitions of profiling and automated decision-making as well as the GDPR’s general approach to them. They address the fact that the GDPR introduces provisions to ensure that the use of profiling and automated decision-making does not have an “unjustified impact on individuals’ rights” and name examples, such as “specific transparency and fairness requirements” and “greater accountability obligations”.
