Indian government urges people to sign up to Aadhaar – the world’s largest biometric ID system – while the Supreme Court still needs to determine its legality

28. December 2017

As reported in August of this year, the Indian Supreme Court (SC) acknowledged that the right to privacy is “intrinsic to life and liberty” and is “inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.”

In the same context, the SC had announced that it would hear petitions on Aadhaar-related matters in November (the term – meaning “foundation” – refers to a 12-digit unique identity number intended to be issued to all Indian residents on the basis of their biometric and demographic data).

According to a Bloomberg report, India’s Prime Minister Narendra Modi is calling for an expansion of Aadhaar, even though its constitutionality is still to be decided. The SC has set January 10th as the beginning of the final hearings.

While officials say Aadhaar is saving the government billions of dollars by better targeting beneficiaries of subsidized food and cash transfers, critics point to unfair exclusions and data leaks. On the one hand, critics fear that the database might turn India into a surveillance state. On the other hand, they are concerned about the high risk of major leaks, such as the one reported by the Indian news agency PTI (Press Trust of India): “Personal details of several Aadhaar users were made public on over 200 central and state government websites.”

Meanwhile, Medianama, a source of information and analysis on Digital and Telecom businesses in India, has launched a list of Aadhaar-related leaks reported so far and encourages people to point out any similar incidents.

Category: Data Breach · General · India · Personal Data

Cancer Care Organization settles for $2.3 million after Data Breach

22. December 2017

In 2015, a data breach occurred at 21st Century Oncology (21stCO), one of the leading providers of cancer care services in the USA, potentially affecting the names, social security numbers, medical diagnoses and health insurance information of at least 2.2 million patients.

In 2016, the provider announced on its website that one of its databases had been inappropriately accessed by an unauthorized third party, although an FBI investigation had already detected an attack as early as October 2015. The FBI, however, had requested 21stCO to delay the notification because of ongoing federal investigations.

21stCO then stated that “we continue to work closely with the FBI on its investigation of the intrusion into our system” and that “in addition to security measures already in place, we have also taken additional steps to enhance internal security protocols to help prevent a similar incident in the future.” To make amends for the security gap, patients were offered one year of free credit monitoring services.

Nevertheless, the provider now has to pay $2.3 million under a settlement with the Office for Civil Rights (OCR; part of the U.S. Department of Health and Human Services).

It has been accused of not implementing appropriate security measures and procedures to regularly review information system activity such as access or security incident reports, despite the disclosure by the FBI.

The OCR further stated that “the organization also disclosed protected health information to its business associates without having a proper business associate agreement in place”.

The settlement additionally requires 21stCO to set up a corrective action plan, including the appointment of a compliance representative, completion of risk analysis and management, revision of cybersecurity policies, an internal breach reporting plan and overall in-depth IT security. The organization will, in addition, need to maintain all relevant documents and records for six years, so the OCR can inspect and copy the documents if necessary.

Following the settlement, District Attorney Stephen Muldrow stated “we appreciate that 21st Century Oncology self-reported a major fraud affecting Medicare, and we are also pleased that the company has agreed to accept financial responsibility for past compliance failures.”

WP 29 adopts guidelines on transparency under the GDPR

21. December 2017

The Article 29 Working Party (WP 29) has adopted guidelines on transparency under the General Data Protection Regulation (GDPR). The guidelines aim to clarify the transparency requirements regarding the processing of personal data and give practical advice.

Transparency as such is not defined in the GDPR. However, Recital 39 describes what the transparency obligation requires when personal data is processed. Providing information to a data subject about the processing of personal data is one major aspect of transparency.

In order to explain transparency and its requirements, the WP 29 points out “elements of transparency under the GDPR” and explains its understanding of them. The following elements are named and described:

– “Concise, transparent, intelligible and easily accessible”
– “Clear and plain language”
– “Providing information to children”
– “In writing or by other means”
– “…the information may be provided orally”
– “Free of charge”

In a schedule, the WP 29 lists which information under Art. 13 and Art. 14 GDPR shall be provided to a data subject and which information is not required.

New and surprising password guidelines released by NIST

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management guidance, has admitted regretting the recommendations he published back in 2003, NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements do not necessarily improve the level of security; instead, they might even make it easier for hackers to crack the code, as people tend to make only minor changes when forced to change an already complex password – usually while pressed for time.

This is why NIST now recommends letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

It takes computers years to figure out passwords of 20 or more characters, as long as the password is not commonly used.

NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to, passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
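
To illustrate what such screening could look like in practice, here is a minimal sketch in Python. The eight-character minimum reflects the guideline’s baseline for memorized secrets; the blocklist file name, the service name and the helper functions are purely illustrative assumptions, not part of the NIST document.

def load_blocklist(path="breached_passwords.txt"):
    # Hypothetical file of known-bad passwords (e.g. from previous breach corpuses), one per line.
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def is_acceptable(password, blocklist, service_name="example-service", username=""):
    # Returns (ok, reason), following the spirit of the recommendations quoted above.
    pw = password.lower()
    if len(password) < 8:
        return False, "too short"  # length, not complexity, is treated as the primary factor
    if pw in blocklist:
        return False, "found in breach/dictionary list"
    if any(token and token in pw for token in (service_name.lower(), username.lower())):
        return False, "contains context-specific words"  # service name, username, derivatives
    if len(set(pw)) == 1:
        return False, "repetitive characters"  # e.g. 'aaaaaa'
    return True, "ok"

# Example usage with an in-memory blocklist instead of a file:
# is_acceptable("1234abcd", {"1234abcd", "password"})          -> (False, "found in breach/dictionary list")
# is_acceptable("correct horse battery staple", {"password"})  -> (True, "ok")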

With this, NIST completely abandons its own earlier suggestions, to the great relief of industries all over:

“Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.”

French Data Protection Commission threatens WhatsApp with sanctions

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp’s data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as “WP29”) asked the company to stop the data transfer for targeted advertising, as French law does not provide an adequate legal basis.

“While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the ‘business intelligence’ purpose which aims at improving performances and optimizing the use of the application through the analysis of its users’ behavior.”

In the wake of the request, WhatsApp had assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now not only concluded that the users’ consent was not validly collected, since it lacked two essential elements of data protection law (specificity and free choice); it also denies that a legitimate interest can be relied upon as regards preserving the fundamental rights of users, given that the application cannot be used if the data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users’ data transferred to Facebook, but refused to do so because, being located in the United States, “it considers that it is only subject to the legislation of this country.”

The CNIL has therefore issued a formal notice to WhatsApp, again requesting compliance with the requirements within one month, and states:

“Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL’s restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company.”

 

WP29 releases opinion on joint review of Privacy Shield

11. December 2017

The Working Party 29 (WP29), an independent European advisory body on data protection and privacy, has evaluated the Privacy Shield agreement (the framework for transatlantic exchanges of personal data for commercial purposes between the European Union and the United States; see also our report on One year of Privacy Shield).

In its joint review, the WP29 focuses on the assessment of commercial aspects and of governmental access to personal data for national security purposes.

Though acknowledging progress, the WP29 still finds unresolved issues on both sides.

It criticizes the lack of guidance and clear information on the principles of the Privacy Shield, especially with regards to onward transfers, the rights of the data subject and remedies.

The US authorities are further requested to clearly distinguish the status of data processors from that of data controllers.

Another important issue to be tackled is the handling of Human Resources (HR) data and the rules governing automated decision-making and profiling.

Also, the process of self-certification for companies requires improvement.

In terms of access by public authorities, the WP 29 concludes that the US government has made efforts to become more transparent.

However, some of the main concerns still remain to be resolved by May 25th, 2018.

The WP 29 calls for further evidence or legally binding commitments to confirm non-discrimination and that authorities do not get access on a generalized basis to data transferred from the EU to the USA.

Aside from these matters, an Ombudsperson still needs to be appointed and her/his exact powers need to be specified. According to the WP 29, the existing powers to remedy non-compliance are not sufficient.

If these concerns are not remedied within the given time frames, the members of the WP29 will take appropriate action, including bringing the Privacy Shield adequacy decision before national courts so that they can refer it to the Court of Justice of the European Union (CJEU) for a preliminary ruling.

Facial recognition data may become purchasable for private companies in Australia

5. December 2017

The Australian government is considering making facial recognition data available for private companies.

By paying a fee, they are supposed to get access to data originally collected for the sake of national security.

However, the companies are to be restricted to cases where the person has given her/his consent.

In an interview with The Guardian, Monique Mann, a director of the Australian Privacy Foundation and a lecturer at the faculty of law at the Queensland University of Technology, says that requiring companies to ask for consent may not be enough to protect consumers’ rights or mitigate the risks involved with biometric data, and would encourage firms to store more data.

As also reported by The Guardian, the government struck a deal with the states and territories over the controversial national facial recognition database last month. The documents, which predate the agreement, reportedly show that 50% of the population was already included in the database at that time.

With the help of state and territory governments, the federal Attorney General’s Department planned to expand that number to cover 85% of Australians.

Google gathers location data even if location services are disabled

23. November 2017

As Quartz reports, since the beginning of 2017 Google has been gathering location data from Android phones even when location services are disabled. To do so, Google has been collecting the addresses of nearby cellular towers. With the gathered information, Google has access to the location of Android phone users and data about their movements.

Quartz further reports that, according to a Google spokesman, the tower addresses were sent to a Google system that manages push notifications and messages on Android phones, and that the collected data was never stored. Google is said to be ending this practice by the end of November.

Category: General

Uber hid massive data breach

22. November 2017

Uber has just admitted that hackers stole the personal data of 50 million Uber customers and 7 million drivers. The data breach happened in October 2016, over a year ago, but was only made public this week.

The data includes names, e-mail addresses, phone numbers and the license numbers of 600,000 drivers. According to Uber, neither social security numbers, nor credit card information, nor trip location details were taken.

Uber did not disclose the data breach to the public, as required by data protection law, but instead paid the hackers $100,000 to delete the information. Uber assumes that the data has not been used.

According to Uber, the hackers gained access to the data through a poorly protected database in a cloud service. Uber’s security chief Joe Sullivan and another manager lost their jobs.

This data breach wasn’t the first incident that happened to Uber. Uber has a well-documented history of abusing consumer privacy.

Uber said it has hired Matt Olsen, former general counsel at the National Security Agency and director of the National Counterterrorism Center, as an adviser. He will help the company restructure its security teams.

Category: Cyber Security · Data Breach · USA

Vast majority of European businesses unprepared for GDPR

20. November 2017

According to a study, only 8% of businesses are ready for the EU General Data Protection Regulation (GDPR), and nearly one third of companies are not even aware of the GDPR, which comes into effect on 25. May 2018.

Although the new Regulation is considered too complex, especially for small and medium-sized businesses, the majority of businesses agree that new rules in the field of personal data protection are necessary.

Infringements of GDPR provisions could lead to fines of up to €20 million or 4% of the total worldwide annual turnover for the preceding financial year, whichever is higher.
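
Expressed as a simple calculation, the “whichever is higher” rule means the cap scales with company size. The following sketch and the turnover figures in it are purely illustrative, not part of the Regulation’s text.

def gdpr_max_fine(worldwide_annual_turnover_eur):
    # Upper limit for the most serious infringements (Art. 83(5) GDPR):
    # EUR 20 million or 4% of total worldwide annual turnover, whichever is higher.
    return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

# A company with EUR 1 billion in turnover faces a cap of EUR 40 million,
# while one with EUR 100 million in turnover is still capped at EUR 20 million:
# gdpr_max_fine(1_000_000_000)  -> 40,000,000
# gdpr_max_fine(100_000_000)    -> 20,000,000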

Category: GDPR