Category: General

How can a company transferring data to a non-European company ensure the data protection standard required by the General Data Protection Regulation (GDPR)?

21. March 2018

A trade deal between two companies often involves a large amount of incidentally transferred personal data. From 25 May 2018, the new GDPR regulates the flow of data within the European Economic Area (EEA), which consists of all members of the European Union plus Iceland, Liechtenstein and Norway. Great Britain's future status will in principle be that of a third country.

By contrast, business relationships with companies from states outside the EU or EEA (such as the USA, China, …) cannot automatically guarantee the data protection standard of the GDPR. Especially since the European Court of Justice (ECJ) invalidated the “Safe Harbour” agreement between the EU and the USA, every company that transfers data across the Atlantic is obliged to ensure data protection itself. In its communication of 10 January 2017, the European Commission (EC) recommends the use of so-called standard contractual clauses (SCC) or binding corporate rules (BCR) when an EU-based company transfers personal data to a non-EU-based company or a non-EU-based entity of its corporate group.

This has a wide impact on the daily trade deals made all over Europe with third-country companies. The EU recommends that data protection go hand in hand with such deals in order to ensure the comparatively high level of data protection based on Article 8 of the Charter of Fundamental Rights of the European Union. Especially as long as the EU’s ePrivacy Regulation is not yet in force, every company has to ensure the GDPR standard by implementing a privacy policy in which transfers of data to a third country are disclosed.

In conclusion, a company that trades with third-country companies needs to enter into a specific data protection agreement with its trading partner and to inform its clients through its privacy policy.

WP29 Guidelines on the notion of consent according to the GDPR – Part 1

26. January 2018

According to the GDPR, consent is one of the six lawful bases mentioned in Art. 6. In order for consent to be valid and compliant with the GDPR, it needs to reflect the data subject’s real choice and control.

The Article 29 Working Party (WP29) clarifies and specifies the “requirements for obtaining and demonstrating” such valid consent in its guidelines released in December 2017.

The guidelines start off with an analysis of Article 4(11) of the GDPR and then discuss the elements of valid consent. Referring to Opinion 15/2011 on the definition of consent, they note that “obtaining consent also does not negate or in any way diminish the controller’s obligations to observe the principles of processing enshrined in the GDPR, especially Article 5 of the GDPR with regard to fairness, necessity and proportionality, as well as data quality.”

The WP29 illustrates the elements of valid consent: it must be freely given, specific, informed and unambiguous. For example, consent is not considered freely given if a mobile app for photo editing requires users to activate their GPS location simply in order to collect behavioural data unrelated to the photo editing. The WP29 emphasizes that consent to the processing of unnecessary personal data “cannot be seen as a mandatory consideration in exchange for performance.”

Another important aspect taken into consideration is an imbalance of power, e.g. vis-à-vis public authorities or in the context of employment. “Consent can only be valid if the data subject is able to exercise a real choice, and there is no risk of deception, intimidation, coercion or significant negative consequences (e.g. substantial extra costs) if he/she does not consent. Consent will not be free in cases where there is any element of compulsion, pressure or inability to exercise free will.”

Art. 7(4) GDPR emphasizes that the performance of a contract must not be made conditional on consent to the processing of personal data that is not necessary for the performance of that contract. The WP29 states that “compulsion to agree with the use of personal data additional to what is strictly necessary limits data subject’s choices and stands in the way of free consent.” Depending on the scope of the contract or service, the term “necessary for the performance of a contract … needs to be interpreted strictly”. The WP29 lays down examples of cases in which such bundling is acceptable.

If a service involves multiple processing operations or multiple purposes, the data subject should have the freedom to choose which purposes they accept. This concept of granularity requires the purposes to be separated and consent to be obtained for each purpose.

Withdrawal of consent has to be possible without any detriment, e.g. in terms of additional costs or a downgrade of services. Any other negative consequence, such as deception, intimidation or coercion, is also considered to invalidate consent. The WP29 therefore suggests that controllers ensure they can prove consent has been given accordingly.

(to be continued in Part 2)

Happy New Year!

1. January 2018

Dear readers,

the team of the blog privacy-ticker.com wishes you a happy new year and all the best for 2018.

Once again this year we will keep you up to date on the subject of data protection.

Best regards,

privacy-ticker.com

Category: General

Indian government urges people to sign up to Aadhaar – the world’s largest biometric ID system – while the Supreme Court still needs to determine its legality

28. December 2017

As reported in August of this year, the Indian Supreme Court (SC) acknowledged that the right to privacy is “intrinsic to life and liberty” and is “inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.”

In the same context, the SC had announced that it would hear petitions on Aadhaar-related matters in November (the term, meaning “foundation”, refers to a 12-digit unique identity number issued to Indian residents based on their biometric and demographic data).

According to a Bloomberg report, India’s Prime Minister Narendra Modi is calling for an expansion of Aadhaar, even though its constitutionality is still to be debated. The SC has set January 10th as the beginning of the final hearings.

While officials say Aadhaar is saving the government billions of dollars by better targeting beneficiaries of subsidized food and cash transfers, critics point to unfair exclusions and data leaks. On the one hand, critics fear that the database might turn India into a surveillance state. On the other hand, they are concerned about the high risk of major leaks, such as those reported by the Indian news agency PTI (Press Trust of India): “Personal details of several Aadhaar users were made public on over 200 central and state government websites.”

Meanwhile, Medianama, a source of information and analysis on digital and telecom businesses in India, has published a list of leaks that have already occurred and encourages people to point out any similar incidents.

Category: Data Breach · General · India · Personal Data

Cancer Care Organization settles for $2.3 million after Data Breach

22. December 2017

In 2015, a data breach occurred at 21st Century Oncology  (21stCO), one of the leading providers of cancer care services in the USA, potentially affecting names, social security numbers, medical diagnoses and health insurance information of at least 2.2 million patients.

On its website, the provider had announced in 2016 that one of its databases was inappropriately accessed by an unauthorized third party, though an FBI investigation had already detected an attack as early as October 2015. The FBI, however, requested 21stCO to delay the notification because of ongoing federal investigations.

21stCO had then stated that “we continue to work closely with the FBI on its investigation of the intrusion into our system” and “in addition to security measures already in place, we have also taken additional steps to enhance internal security protocols to help prevent a similar incident in the future.” To make amends for the security gap, patients were offered one year of free credit monitoring services.

Nevertheless, the provider now has to pay 2.3 million dollars under a settlement with the Office for Civil Rights (OCR), part of the U.S. Department of Health and Human Services.

It has been accused of not implementing appropriate security measures and procedures to regularly review information system activity such as access or security incident reports, despite the disclosure by the FBI.

The OCR further stated that “the organization also disclosed protected health information to its business associates without having a proper business associate agreement in place”.

The settlement additionally requires 21stCO to set up a corrective action plan including the appointment of a compliance representative, completion of risk analysis and management, revision of cybersecurity policies, an internal breach reporting plan and overall in-depth IT-security. The organization will, in addition, need to maintain all relevant documents and records for six years, so the OCR can inspect and copy the documents if necessary.

Following the settlement, District Attorney Stephen Muldrow stated “we appreciate that 21st Century Oncology self-reported a major fraud affecting Medicare, and we are also pleased that the company has agreed to accept financial responsibility for past compliance failures.”

New and surprising password guidelines released by NIST

21. December 2017

The National Institute of Standards and Technology (NIST), a non-regulatory federal agency within the U.S. Department of Commerce that promotes innovation and industrial competitiveness, often by recommending best practices in matters of security, has released its Digital Identity Guidelines, offering advice on user password management.

Considering that Bill Burr, the pioneer of password management, has admitted to regretting the recommendations he made in a 2003 publication, the NIST is taking appropriate action by revising widespread practices.

For over a decade, people were encouraged to create complex passwords with capital letters, numbers and “obscure” characters – along with frequent changes.

Research has now shown that these requirements don’t necessarily improve the level of security, but might even make it easier for hackers to crack passwords, as people tend to make only minor changes when they have to update an already complex password – usually pressed for time.

This is why the NIST is now recommending letting go of periodic password change requirements as well as algorithmic complexity rules.

Rather than holding on to these practices, the experts emphasize the importance of password length. The NIST states that “password length has been found to be a primary factor in characterizing password strength. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using words and commonly chosen passwords.”

It takes computers years to figure out passwords of 20 or more characters, as long as the password is not commonly used.

The NIST advises screening new passwords against specific lists: “For example, the list may include, but is not limited to passwords obtained from previous breach corpuses, dictionary words, repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’), context-specific words, such as the name of the service, the username, and derivatives thereof.”
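As a rough illustration of this approach, the following Python sketch combines a minimum-length check with screening against such a list. The file name (breached_passwords.txt), the eight-character minimum and the function names are illustrative assumptions, not taken from the NIST guidelines themselves.

# Minimal sketch of a NIST-style password check (illustrative only).
# Assumes a plain-text blocklist file "breached_passwords.txt" with one
# known-bad password per line (breach corpuses, dictionary words, etc.).

def load_blocklist(path="breached_passwords.txt"):
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def is_acceptable(password, blocklist, min_length=8):
    # Length and blocklist screening only -- no composition rules
    # (capitals, digits, symbols) and no periodic expiry.
    if len(password) < min_length:
        return False, "too short"
    if password.lower() in blocklist:
        return False, "appears in blocklist of common or breached passwords"
    if len(set(password)) == 1:  # repetitive characters, e.g. 'aaaaaa'
        return False, "repetitive characters"
    return True, "ok"

if __name__ == "__main__":
    blocklist = load_blocklist()
    print(is_acceptable("correct horse battery staple", blocklist))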

With this, the NIST abandons its own earlier suggestions, to the great relief of industries all over:

“Length and complexity requirements beyond those recommended here significantly increase the difficulty of memorized secrets and increase user frustration. As a result, users often work around these restrictions in a way that is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate limiting are more effective at preventing modern brute-force attacks. Therefore, no additional complexity requirements are imposed.”

French Data Protection Commission threatens WhatsApp with sanctions

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp’s data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp had announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as “WP29”) asked the company to stop the data transfer for targeted advertising, as French law doesn’t provide an adequate legal basis.

“While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the ‘business intelligence’ purpose which aims at improving performances and optimizing the use of the application through the analysis of its users’ behavior.”

In the wake of the request, WhatsApp had assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now not only come to the conclusion that the users’ consent was not validly obtained, as it lacked two essential elements of data protection law: specificity and free choice. It also denies a legitimate interest with regard to preserving users’ fundamental rights, since the application cannot be used if the data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users’ data transferred to Facebook, but refused to do so because, being located in the United States, “it considers that it is only subject to the legislation of this country.”

The CNIL has therefore issued a formal notice to WhatsApp, again requesting compliance with the requirements within one month, and states:

“Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL’s restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company.”


Facial recognition data may become purchasable for private companies in Australia

5. December 2017

The Australian government is considering making facial recognition data available for private companies.

By paying a fee, companies would gain access to data originally collected for national security purposes.

However, companies would be restricted to cases where the person concerned has given his or her consent.

In an interview with The Guardian, Monique Mann, a director of the Australian Privacy Foundation and a lecturer at the faculty of law at the Queensland University of Technology, says that requiring companies to ask for consent may not be enough to protect consumers’ rights or mitigate the risks involved with biometric data, and would encourage firms to store more data.

As also reported by The Guardian, the government struck a deal with states and territories over the controversial national facial recognition database last month. According to documents that predate the agreement, 50% of the population was already included in the database at that time.

With the help of state and territory governments, the federal Attorney General’s Department planned to expand that number to cover 85% of Australians.

Google gathers location data even if location services are disabled

23. November 2017

As Quartz reports, Google has been gathering location data from Android phones since the beginning of 2017, even if location services are disabled. To do so, Google has been collecting the addresses of nearby cellular towers. With this information, Google has access to the location of Android phone users and data about their movements.

Quartz further reports that, according to a Google spokesman, the tower addresses were sent to a Google system that manages push notifications and messages on Android phones and that the collected data was never stored. Google is reportedly ending this practice by the end of November.

Category: General

Moscow adds facial recognition to its network of surveillance cameras

2. October 2017

Moscow is adding facial recognition to its network of 170,000 surveillance cameras across the city to be able to identify criminals and boost security, Bloomberg reports. The camera surveillance started in 2012. Recordings from the surveillance system are retained for five days after capture, with around 20 million hours of video material stored at any one time. “We soon found it impossible to process such volumes of data by police officers alone,” said Artem Ermolaev, Head of the Department of Information Technology in Moscow, according to Bloomberg. “We needed an artificial intelligence to help find what we are looking for,” he added.

A Russian start-up named N-Tech.Lab Ltd designed the facial recognition technology. The start-up is known for its mobile app FindFace, which was released last year. FindFace makes it possible to search for users of the Russian social network VKontakte by taking a picture of a person’s face and matching it against VKontakte user profiles.

However, due to high costs, the facial recognition technology will not be deployed to every camera but installed selectively in specific districts where it is needed most. The Moscow government reportedly already spends about $86 million a year to maintain the camera surveillance, and this amount would triple if every camera used the new facial recognition technology.

The new technology is used to cross-reference images captured by the cameras with those from the Interior Ministry’s database.
