
New and surprising password guidelines released by NIST

21. December 2017

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management, has admitted regretting the recommendations he made in a publication back in 2003, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and "obscure" characters, along with frequent changes.

Research has now shown that these requirements do not necessarily improve the level of security; they might even make it easier for hackers to crack the code, since people forced to change an already complex password, and usually pressed for time, tend to make only minor changes.

This is why the NIST is now recommending letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. The NIST states that "password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords."

As long as a password is not a commonly used one, it takes computers years to crack it by brute force once it reaches 20 or more characters.
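To see why length dominates, consider a rough back-of-the-envelope estimate. The Python sketch below assumes an attacker who can test ten billion guesses per second against a fast hash; that guess rate is an illustrative assumption, not a figure from the guidelines.

```python
# Rough brute-force cost estimate: how long to exhaust the keyspace
# of an n-character password. The guess rate is an assumed figure for
# a well-equipped offline attacker, not a number from the NIST guidelines.

GUESSES_PER_SECOND = 1e10
SECONDS_PER_YEAR = 60 * 60 * 24 * 365


def brute_force_years(length: int, alphabet_size: int = 26) -> float:
    """Years needed to try every password of `length` characters
    drawn from an alphabet of `alphabet_size` symbols."""
    keyspace = alphabet_size ** length
    return keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR


for length in (8, 12, 20):
    print(f"{length} lowercase characters: {brute_force_years(length):.2e} years")

# 8 characters fall in seconds, 12 characters in a few months,
# while 20 characters already take on the order of 10^10 years:
# length alone dwarfs any gain from added character classes.
```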

The NIST advises screening new passwords against specific lists: "For example, the list may include, but is not limited to, passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. 'aaaaaa', '1234abcd'), context-specific words, such as the name of the service, the username, and derivatives thereof."
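A minimal sketch of such a screening check, in Python, could look as follows. The corpus file name, the service name and the simplified sequential-run test are hypothetical illustrations, not part of the guidelines.

```python
# Minimal password screening along the lines the NIST describes.
# "breached_passwords.txt" is a hypothetical breach-corpus file;
# a real deployment would use an actual, regularly updated corpus.

def is_repetitive_or_sequential(pw: str) -> bool:
    """Simplified check: one repeated character (e.g. 'aaaaaa') or a
    strictly ascending run (e.g. '12345', 'abcde')."""
    if len(set(pw)) == 1:
        return True
    return all(ord(b) - ord(a) == 1 for a, b in zip(pw, pw[1:]))


def screen_password(pw: str, username: str, service: str,
                    breach_corpus: set[str]) -> bool:
    """Return True if the candidate password passes all checks."""
    lowered = pw.lower()
    if lowered in breach_corpus:
        return False                      # known breached password
    if is_repetitive_or_sequential(lowered):
        return False                      # trivially guessable pattern
    if username.lower() in lowered or service.lower() in lowered:
        return False                      # context-specific words
    return True


with open("breached_passwords.txt") as f:  # hypothetical corpus file
    corpus = {line.strip().lower() for line in f}

print(screen_password("aaaaaa", "alice", "example.com", corpus))  # False
print(screen_password("correct horse battery staple",
                      "alice", "example.com", corpus))  # True, unless in corpus
```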

With this, the NIST completely abandons its own earlier suggestions, to the great relief of industries all over:

"Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed."

French Data Protection Commission threatens WhatsApp with sanctions

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp's data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp had announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as "WP29") asked the company to stop the data transfer for targeted advertising, as French law does not provide an adequate legal basis:

"While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the 'business intelligence' purpose which aims at improving performances and optimizing the use of the application through the analysis of its users' behavior."

In the wake of the request, WhatsApp assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now found not only that the users' consent was not validly collected, as it lacked two essential aspects of data protection law, specific function and free choice, but also that there is no legitimate interest that would preserve the fundamental rights of users, given that the application cannot be used if the data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users' data transferred to Facebook, but refused to do so because, being located in the United States, "it considers that it is only subject to the legislation of this country."

The CNIL has therefore issued a formal notice to WhatsApp, again requesting compliance with the requirements within one month, and states:

"Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL's restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company."


WP29: Guideline for profiling and automated decision-making

19. October 2017

The Article 29 Data Protection Working Party (WP29) adopted guidelines on automated individual decision-making and profiling, both of which are addressed by the General Data Protection Regulation (GDPR), applicable from the 25th of May 2018. WP29 acknowledges that "profiling and automated decision-making can be useful for individuals and organisations as well as for the economy and society as a whole", naming "increased efficiencies" and "resource savings" as two examples.

However, it was also stated that “profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards”. One risk could be that profiling may “perpetuate existing stereotypes and social segregation”.

The guidelines cover, inter alia, definitions of profiling and automated decision-making as well as the GDPR's general approach to these. They note that the GDPR introduces provisions to ensure that the use of profiling and automated decision-making does not have an "unjustified impact on individuals' rights", naming examples such as "specific transparency and fairness requirements" and "greater accountability obligations".

Moscow adds facial recognition to its network of surveillance cameras

2. October 2017

Moscow is adding facial recognition to its network of 170,000 surveillance cameras across the city in order to identify criminals and boost security, Bloomberg reports. The camera surveillance started in 2012. Recordings are held for five days after capture, with some 20 million hours of video material stored at any one time. "We soon found it impossible to process such volumes of data by police officers alone," said Artem Ermolaev, Head of the Department of Information Technology in Moscow, according to Bloomberg. "We needed an artificial intelligence to help find what we are looking for."

A Russian start-up named N-Tech.Lab Ltd designed the facial recognition technology. The start-up is known for its mobile app FindFace, released last year, which makes it possible to find users of the Russian social network VKontakte by taking a picture of a person's face and matching it against VKontakte user profiles.

However, due to high costs, the facial recognition technology will not be deployed to every camera, but only installed selectively in the districts where it is needed most. The Moscow government reportedly already spends about $86 million a year to maintain the camera surveillance, and this amount would triple if every camera used the new facial recognition technology.

The new technology is used to cross-reference images captured by the cameras with those from the Interior Ministry’s database.

New Zealand: Police use backdoor in law to gather private data

5. September 2017

According to the New Zealand Council of Civil Liberties, banks have handed private data over to the police in several cases after the police requested it. The police reportedly used official-looking forms instead of applying to a judge for a search warrant or examination order, and there is said to be neither oversight nor a register tracking the number of requests filed.

The police and banks rely on a legal loophole in the Privacy Act that allows organisations to reveal information about persons in order "to avoid prejudice to the maintenance of the law". The Privacy Commissioner, John Edwards, wants to end the further use of this backdoor. Referring to the handover of private information about activist and journalist Martyn Bradbury, he said:

“…we concluded that Police had collected this information in an unlawful way by asking for such sensitive information without first putting the matter before a judicial officer. Our view is that this was a breach of Principle 4 of the Privacy Act, which forbids agencies from collecting information in an unfair, unreasonable or unlawful way.”

TalkTalk fined by ICO

11. August 2017

According to a press release from the Information Commissioner's Office ("ICO"), the TalkTalk Telecom Group ("TalkTalk") was fined for violating the UK Data Protection Act. More than 21,000 customers could become victims of scams and fraud.

As a result of an investigation begun in 2014, the ICO fined TalkTalk GBP 100,000 for failing to protect customer data. The breach was possible because of a lack of security in a portal holding a huge amount of customer data. One company with access to the portal was Wipro, an IT services company in India; 40 of its employees had access to the personal data of between 25,000 and 50,000 customers. During the investigation, three accounts were found to have had unauthorized access to the portal. The ICO determined that TalkTalk did not ensure the security of the customer data held in this portal, for several reasons:

  • The portal was accessible from any device; there was no restriction on which devices could access it.
  • The portal's search engine allowed wildcard searches (with * as a placeholder, returning many results at once).
  • The search engine returned up to 500 results per search.

The access rights were too wide-ranging given the large amount of customer data held in the portal. The ICO fined TalkTalk for breaching one of the principles of the UK Data Protection Act by not implementing sufficient technical and organizational measures.


Nationwide: multistate data breach investigation settled by paying $5.5 million

According to Hunton & Williams, which also published the settlement, Nationwide Mutual Insurance Company ("Nationwide") agreed on the 9th of August to pay $5.5 million to settle a data breach investigation by attorneys general from 32 states, concerning a breach that exposed the personal data of about 1.2 million individuals.

In October 2012, Nationwide and its wholly-owned subsidiary Allied Property & Casualty Insurance Company ("Allied") experienced a data breach that led to unauthorized access to and exfiltration of certain personal data of their customers as well as of other consumers. Since Nationwide and Allied provide customers with insurance quotes, personal data such as full name, Social Security number, date of birth and credit-related score are collected.

The attorneys general alleged that the data breach occurred when hackers exploited a vulnerability in the companies' web application hosting software, and that Nationwide and Allied applied a previously available software patch to address the vulnerability only after the data had been exfiltrated.

In addition to paying the $5.5 million, Nationwide and Allied agreed to implement a series of steps to update their security practices. Among other measures listed in the settlement, a technology officer is to be appointed to manage and monitor security and software updates, ensuring that future patches and other security updates are applied.

India: Is the “right to privacy” a fundamental human right?

4. August 2017

The Indian Supreme Court has to decide if the “right to privacy” should be considered a fundamental human right.

According to the Wire, a bench of nine justices was set up after several petitions challenged the constitutional validity of India's Aadhaar scheme, with some petitioners claiming that the biometric authentication system violates the privacy of Indians. Over the last two weeks, the bench examined the nature of privacy as a right in the context of two earlier judgements, from 1954 and 1962, which had concluded that the right to privacy was not a fundamental right. Legal experts expect the judgement in the last week of August.

The Times of India reports that the Supreme Court outlined a three-tier graded approach to the question of whether privacy can be considered a fundamental right, dividing privacy into three zones. As stated by a justice of the bench, the first zone would be the most intimate zone, concerning for example marriage or sexuality. The state should intrude on this zone only under "extraordinary circumstances provided it met stringent norms".

The second zone would be the private zone, which could involve personal data such as credit card use or income tax declarations. In this zone, it is further said, "sharing of personal data by an individual will be used only for the purpose for which it is shared by an individual".

The third zone would be the public zone, requiring only minimal regulation. However, this would not mean that the individual loses the right to privacy; he would "retain his privacy to body and mind".


Facial recognition on the rise

At Australian airports, new technology will be rolled out to help process passengers by means of facial recognition. Peter Dutton, Minister for Immigration and Border Protection, said that 105 smart gates will be provided for this purpose as part of an AU$22.5 million contract with Vision-Box Australia. Vision-Box has already implemented a facial recognition system at New York's JFK airport.

The Australian government's goal is to automate 90% of air traveller processing by 2020. After the implementation, passengers will no longer have to show their passports, but will be processed by biometric recognition of their faces, irises and/or fingerprints.

Meanwhile, testing of a facial recognition system has begun at Berlin's Südkreuz station. The software can recognise known suspects and alert the police. Currently, it is scanning only the faces of 250 volunteers. Thomas de Maizière, the German interior minister, aims to improve security in Germany after several terrorist attacks.

However, privacy activists as well as well-respected lawyers have raised concerns over this technology. They fear that Germany could be heading towards a surveillance state, and argue that there is no constitutional basis for the use of these methods.

Article 29 WP releases opinion on data processing at work

11. July 2017

The Article 29 Working Party (WP) released its opinion on data processing at work on the 8th of June 2017. The opinion is meant as an amendment to the previously released documents on the surveillance of electronic communications (WP 55) and on processing personal data in the employment context (WP 48). The update addresses fast-changing technologies, new forms of processing and the fading boundaries between home and work. It covers not only the Data Protection Directive but also the new rules of the General Data Protection Regulation, which takes effect on the 25th of May 2018.

To this end, it lists nine different scenarios in the employment context where data processing can lead to a lack of data protection: data processing in the recruitment process and in-employment screening (especially via social media platforms), the use of monitoring tools for information and communication technologies (ICT), their usage at home or remotely, monitoring of time and attendance, video monitoring, the use of vehicles by employees, the disclosure of data to third parties and the international transfer of employee data.

The Article 29 WP also points out the main risk to the fundamental rights of employees: new technologies allow the employer to track employees over long periods and nearly everywhere, in a less visible way. This can have a chilling effect on employees' rights, as they come to assume they are under constant supervision.

As a highlight, the Article 29 WP gives the following recommendations for dealing with data processing in the employment context:

  • collect only the data legitimate for the purpose, and only with processing taking place under appropriate conditions,
  • consent is highly unlikely to be a legal basis for data processing, because of the imbalance of power between the employer and the employee,
  • track the location of employees only where it is strictly necessary,
  • communicate any monitoring to employees effectively,
  • perform a proportionality check prior to the deployment of any monitoring tool,
  • be more concerned with prevention than with detection,
  • keep data minimization in mind; only process the data you really need,
  • create privacy spaces for users,
  • for cloud use: ensure an adequate level of protection for every international transfer of employee data.