Tag: data protection

Facebook releases new Privacy Tool for global use

31. January 2020

On Data Privacy Day, Facebook launched its new privacy tool, which gives its users control over how they are tracked across the net.

In a blog post, Facebook CEO Mark Zuckerberg introduced the company’s “Off-Facebook Activity” tool, which had been promised since May 2018, to the social network’s worldwide audience. It had been rolled out gradually in several countries since August 2019, but is now officially available globally.

Facebook is known for its far-reaching tracking of internet activity, ranging from doorbell apps and sellers’ websites to health apps. It has been criticized by lawmakers for its tracking practices, especially since the social network keeps tracking your data even after you deactivate your account.

Now, wanting to start the new decade with a stronger focus on privacy, Mark Zuckerberg is prompting Facebook users to review their privacy settings. On top of deleting your tracking history, it is now possible to turn off future tracking altogether. It is important to keep in mind, though, that Facebook does not stop advertisers and businesses from targeting ads based on other factors.

Overall, the tool is supposed to complement Facebook’s Privacy Checkup feature, allowing users to manage their privacy more thoroughly and, more importantly, on their own terms.

Germany: Large data leak reveals personal data of more than 3 million customers

27. January 2020

The German car rental company Buchbinder leaked the personal data of more than 3 million customers from all over Europe. The leak exposed more than 10 terabytes of sensitive customer data over several weeks without the company noticing.

A German cybersecurity firm was executing routine network scans when it found the data leak. The firm reported it twice to Buchbinder via e-mail, but did not receive a reply. After that, the cybersecurity firm reported the leak to the Bavarian Data Protection Authority (DPA) and informed the German computer magazine c’t and newspaper DIE ZEIT.

According to c’t, a misconfigured backup server caused the leak. The exposed personal data included customers’ names, private addresses, birth dates, telephone numbers, rental data, bank details, accident reports and legal documents, as well as Buchbinder employees’ e-mails and access credentials for internal networks.

The data leak is particularly serious because of the vast amount of leaked personal data, which could easily be abused for spam e-mails, fraud, phishing, or identity theft. It is therefore likely that the German DPA will impose a GDPR fine on the company in the future.

Buchbinder released a press statement apologising for the data leak and promising to strengthen its cybersecurity defences.

CNIL publishes recommendations on how to get users’ cookie consent

21. January 2020

On 14 January 2020, the French data protection authority (“CNIL”) published recommendations on practical modalities for obtaining the consent of users to store or read non-essential cookies and similar technologies on their devices. In addition, the CNIL also published a series of questions and answers on the recommendations.

The purpose of the recommendations is to help private and public organisations implement the CNIL guidelines on cookies and similar technologies dated 4 July 2019. To this end, the CNIL describes the practical arrangements for obtaining users’ consent, gives concrete examples of user interfaces for obtaining consent and presents “best practices” that go beyond the mandatory rules.

In order to find pragmatic and privacy-friendly solutions, the CNIL consulted organisations representing industries in the ad tech ecosystem as well as civil society organisations in advance and discussed the issue with them. The recommendations are neither binding nor prescriptive, nor are they exhaustive. Organisations may use other methods to obtain user consent, as long as these methods are in accordance with the guidelines.

Among the most important recommendations are:

Information about the purpose of cookies
First, the purposes of the cookies should be briefly described in a list. The recommendations contain example descriptions for the following purposes or types of cookies:
(1) targeted or personalised advertising;
(2) non-personalized advertising;
(3) personalised advertising based on precise geolocation;
(4) customization of content or products and services provided by the Web Publisher;
(5) social media sharing;
(6) audience measurement/analysis.
In addition, the list of purposes should be complemented by a more detailed description of these purposes, which should be directly accessible, e.g. via a drop-down button or hyperlink.

Information on the data controllers
An exhaustive list of data controllers should be directly accessible, e.g. via a drop-down button or hyperlink. When users click on this hyperlink or button, they should receive specific information on data controllers (name and link to their privacy policy). However, web publishers do not have to list all third parties that use cookies on their website or application, but only those who are also data controllers. Therefore, the role of the parties (data controller, joint data controller, or data processor) has to be assessed individually for each cookie. This list should be regularly updated and should be permanently accessible (e.g. through the cookie consent mechanism, which would be available via a static icon or hyperlink at the bottom of each web page). Should a “substantial” addition be made to the list of data controllers, users’ consent should be sought again.

Real choice between accepting or rejecting cookies
Users must be offered a real choice between accepting or rejecting cookies. This can be done by means of two (not pre-ticked) checkboxes or buttons (“accept” / “reject”, “allow” / “deny”, etc.) or equivalent elements such as “on”/”off” sliders, which should be disabled by default. These checkboxes, buttons or sliders should have the same format and be presented at the same level. Users should have such a choice for each type or category of cookie.

The ability for users to delay this selection
A “cross” button should be included so that users can close the consent interface without having to make a choice. If the user closes the interface, no consent cookies should be set. However, the consent request may be shown again until the user makes a choice and accepts or rejects cookies.
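Taken together, the two recommendations above suggest a simple state model: nothing is enabled by default, acceptance and rejection are symmetric, and dismissing the interface records no choice at all. The following TypeScript sketch illustrates that model; all names (Purpose, bannerChoices, setChoice, applyChoices, closeWithoutChoice) are invented for illustration and are not taken from the CNIL texts.

```typescript
// Purposes mirror the CNIL's example categories (simplified here).
type Purpose = "ads" | "personalization" | "social_media" | "analytics";

// No choice is pre-made: every purpose starts undecided, which must be
// treated exactly like a rejection until the user acts.
const bannerChoices: Record<Purpose, boolean | undefined> = {
  ads: undefined,
  personalization: undefined,
  social_media: undefined,
  analytics: undefined,
};

// "Accept" and "reject" are symmetric actions of equal prominence.
function setChoice(purpose: Purpose, accepted: boolean): void {
  bannerChoices[purpose] = accepted;
}

// Non-essential cookies for a purpose are only written after an explicit
// acceptance; undecided and rejected purposes both stay off.
function applyChoices(): void {
  for (const [purpose, accepted] of Object.entries(bannerChoices)) {
    if (accepted === true) {
      console.log(`setting cookies for purpose: ${purpose}`);
    } else {
      console.log(`no cookies for purpose: ${purpose}`);
    }
  }
}

// Closing the interface via the "cross" button records nothing: no consent
// cookie is set, and the banner may be shown again on a later visit.
function closeWithoutChoice(): void {
  console.log("interface dismissed; all non-essential cookies remain off");
}

// Example: the user accepts only audience measurement.
setChoice("analytics", true);
applyChoices(); // only analytics cookies are set
```

The key property is the default: until the user explicitly accepts a purpose, no cookie for that purpose is written, and dismissing the banner leaves everything off.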

Overall consent for multiple sites
It is acceptable to obtain user consent for a group of sites rather than individually for each site. However, this requires that users are informed of the exact scope of their consent (i.e., by providing them with a list of sites to which their consent applies) and that they have the ability to refuse all cookies on those sites altogether (e.g., if there is a “refuse all” button along with an “accept all” button). To this end, the examples given in the recommendations include three buttons: “Personalize My Choice” (where users can make a more precise selection based on the purpose or type of cookies), “Reject All” and “Accept All”.

Duration of validity of the consent
It is recommended that users be asked to renew their consent at regular intervals. The CNIL considers a period of six months to be appropriate.

Proof of consent
Data controllers should be able to provide individual proof of users’ consent and to demonstrate that their consent mechanism allows a valid consent to be obtained.
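To show how the six-month validity period and the proof requirement could interact, here is a minimal sketch of a stored consent record, assuming a generic persistence layer; ConsentRecord, CONSENT_TTL_MS and isConsentStillValid are hypothetical names, and 182 days is just one way to approximate the CNIL’s six months.

```typescript
// A stored record of one consent event; keeping the interface version the
// user saw helps demonstrate later that the mechanism produced valid consent.
interface ConsentRecord {
  userId: string;                   // or a pseudonymous identifier
  choices: Record<string, boolean>; // purpose -> accepted (true) / rejected (false)
  collectedAt: string;              // ISO 8601 timestamp of the consent event
  uiVersion: string;                // which banner wording/layout was shown
}

// Roughly six months, per the CNIL's suggested validity period.
const CONSENT_TTL_MS = 182 * 24 * 60 * 60 * 1000;

// Once a record expires, the consent interface should be presented again
// and a fresh record stored; old records are kept as proof of past consent.
function isConsentStillValid(record: ConsentRecord, now: Date = new Date()): boolean {
  const ageMs = now.getTime() - new Date(record.collectedAt).getTime();
  return ageMs >= 0 && ageMs < CONSENT_TTL_MS;
}
```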

The recommendations are open for public consultation until 25 February 2020. A new version of the recommendations will then be submitted to the members of CNIL for adoption during a plenary session. CNIL will carry out enforcement inspections six months after the adoption of the recommendations. The final recommendations may also be updated and completed over time to take account of new technological developments and the responses to the questions raised by professionals and individuals on this subject.

Fine imposed on the City of Oslo

2. January 2020

The Norwegian data protection authority (Datatilsynet) recently imposed a fine of €49,300 on the City of Oslo. The reason for the fine was that the city had kept patient data outside the electronic health record system at the city’s nursing homes/health centres from 2007 to November 2018.

The case became known because the City of Oslo reported a data breach to the Data Protection Authority in November 2018. This report included information that various governmental and private nursing homes/health centres were using work sheets. These contained information about the residents, such as their daily needs and care routines, but also full names and room numbers. The work sheets were stored on the respective intranet of the institution and all employees, including for example cleaning staff, had access to this data.

After the practice came to light, the Nursing Home Agency instructed all nursing homes/health centres to delete the work sheets immediately. Due to the way the data was stored, it is not possible to determine who exactly accessed the data and when, or whether unauthorised persons were among them.

In calculating the amount of the fine, the Data Protection Authority took into account that the City of Oslo reported the incident itself and took quick steps to delete the data. It was also taken into account that the incident occurred for the most part before the new Data Protection Act came into force in July 2018, and that under the old Data Protection Act the maximum fine was €100,000.

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday 19 December, the Court of Justice of the European Union’s (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) for transfers of personal data to processors situated in third countries.

The case underlying the opinion originates in proceedings initiated by Mr. Maximillian Schrems, who challenged Facebook’s business practice of transferring the personal data of its European subscribers to servers located in the United States. That case (Schrems I) led the CJEU, on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge, on similar grounds to those raised in Schrems I, the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook had chosen to rely on to legitimize its EU-U.S. data flows. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling (the Schrems II case).

In the newly published opinion, the Advocate General considers the SCCs valid for commercial transfers, despite the possibility that public authorities in the third country may process the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of the high level of protection is not guaranteed solely by an adequacy decision of the European Commission; it can equally be ensured by the contractual safeguards the exporter has put in place, which must match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers regardless of the third country and its level of protection. In addition, in light of the Charter, both the controller and the supervisory authority are obliged to suspend a third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

In the end, the Advocate General also clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, since they only cover the SCCs under Decision 2010/87, taking the question of the validity of the Privacy Shield off the table.

While the Advocate General’s opinion is not binding, it represents a proposed legal solution for the case before the CJEU. The CJEU’s decision on the matter is not expected until early 2020, and the outcome is eagerly awaited.

Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for the storage of tenants’ personal data that did not provide any possibility of removing data that was no longer required. Tenants’ personal data was stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore be viewed even though it was years old and no longer served the purpose for which it was originally collected. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, as well as tax, social security and health insurance data and bank statements.

After the Commissioner had urgently recommended changing the archive system at the first inspection in 2017, the company was unable to demonstrate either a clean-up of its database or legal grounds for the continued storage in March 2019, more than one and a half years after the first inspection and nine months after the GDPR became applicable. Although the company had made preparations to remedy the deficiencies found, these measures did not result in a lawful state of affairs regarding the storage of personal data. The imposition of a fine for a violation of Article 25(1) GDPR as well as Article 5 GDPR for the period between May 2018 and March 2019 was therefore compelling.

The starting point for the calculation of such fines is, among other things, the company’s worldwide turnover in the preceding year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the statutory framework for the fine for the established data protection violation amounted to approximately 28 million euros.

For the concrete determination of the amount of the fine, the Commissioner applied the statutory criteria, taking into account all aggravating and mitigating aspects. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an inadmissible manner over a long period of time weighed particularly heavily. The fact that the company had taken initial measures to remedy the unlawful situation and had cooperated well with the supervisory authority in formal terms was, however, taken into account as mitigating. Given also that no abusive access to the stored data could be proven, a fine in the middle range of the prescribed framework was appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.

Apple wants to evaluate “Siri” recordings again

14. October 2019

Apple wants to evaluate Siri recordings again in the future. After it became public that Apple automatically saved the audio recordings of Siri requests and had some of them evaluated by employees of external companies, the company stopped this practice. Although Apple stated that less than 0.2 % of the queries were actually evaluated, the system received around 10 billion queries per month (as of 2018).

In the future, audio recordings from the Siri voice assistant will again be stored and evaluated, but only after the user has consented. This procedure will be tested with the latest beta versions of Apple’s iOS software for iPhone and iPad.

Apple itself hopes that many users will agree and thus contribute to the improvement of Siri. A later opt-out is possible at any time, but only for each device individually. In addition, only Apple’s own employees, who are, according to Apple, subject to strict confidentiality obligations, will evaluate the recordings. Recordings generated by an unintentional activation of Siri will be completely deleted.

In addition, a delete function for Siri-recordings is to be introduced. Users can then choose in their settings to delete all data recorded by Siri. If this deletion is requested within 24 hours of a Siri request, the respective recordings and transcripts will not be released for evaluation.

However, even if the user does not opt in to the evaluation of his Siri recordings, a computer-generated transcript will continue to be created and kept by Apple for a certain period of time. Although these transcripts are to be anonymized and linked to a random ID, they could still be evaluated, according to Apple.


China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. China thus enacted the first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions provide a higher standard of consent than the Cybersecurity Law of China. To obtain the consent of a guardian, a network operator has to provide the possibility of refusal and expressly inform the guardian of the following (see the sketch after this list):

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.
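To make the required notice contents more tangible, one could model them as a structure with one field per disclosure item; the following TypeScript sketch is purely illustrative, and none of the names (GuardianConsentNotice, noticeIsComplete, the field names) come from the provisions themselves.

```typescript
// One field per disclosure item listed above.
interface GuardianConsentNotice {
  purposeMeansAndScope: string;  // item 1: what is processed, how, and why
  storageAndRetention: string;   // item 2: storage location, retention period,
                                 //         and handling after expiration
  safeguards: string;            // item 3: protective measures in place
  consequencesOfRefusal: string; // item 4: what happens if the guardian refuses
  complaintChannels: string;     // item 5: how to file or report complaints
  correctionAndDeletion: string; // item 6: how to correct and delete the data
  canRefuse: boolean;            // the notice must offer a genuine way to refuse
}

// Consent should only be requested once every disclosure item is present
// and refusal is actually possible.
function noticeIsComplete(n: GuardianConsentNotice): boolean {
  const items = [
    n.purposeMeansAndScope,
    n.storageAndRetention,
    n.safeguards,
    n.consequencesOfRefusal,
    n.complaintChannels,
    n.correctionAndDeletion,
  ];
  return n.canRefuse && items.every((item) => item.trim().length > 0);
}
```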

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain consent of the person responsible for the protection of children’s personal data or an authorised administrator.

If children’s personal data is processed by a third-party processor, the network operator is obliged to carry out a security assessment of the data processor commissioned to process the children’s personal data. It also has to conclude an entrustment agreement with the data processor. The data processor is obliged to support the network operator in fulfilling a guardian’s request to delete a child’s data after termination of the service. Subletting or subcontracting by the data processor is prohibited.

If personal data of children is transferred to a third party, the network operator shall carry out a security assessment of the commissioned person or commission a third party to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children’s personal data under certain circumstances. In any case, they have the right to demand the correction of children’s personal data that has been collected, stored, used or disclosed by a network operator. In addition, legal guardians have the right to withdraw their consent in its entirety.

In the event of actual or potential data breaches, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the violation has or may have serious consequences, the network operator must immediately report the violation to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is impracticable to notify each data subject individually, the network operator shall take appropriate and effective measures to make the notification public. However, the provisions do not define precisely what counts as serious consequences.

In the event that the data breach is caused or observed by a data processor, the data processor is obliged to inform the network operator in good time.

CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google does not have to apply the General Data Protection Regulation’s (GDPR) “Right to be Forgotten” to versions of its search engine outside of the European Union. The ruling is a victory for Google in a case against a fine imposed by the French Commission nationale de l’informatique et des libertés (CNIL) in 2015 in an effort to force the company and other search engines to take down links globally.

As the internet has grown into a worldwide medium with no borders, this case is viewed as a test of whether people can demand a blanket removal of information about themselves from searches without overriding the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also expressed in the judgement that the protection of personal data is not an absolute right.

While this means that companies cannot be forced to delete sensitive information from their search engines outside of the EU upon request, they must take precautions to seriously discourage internet users from going onto non-EU versions of their pages. Furthermore, companies with search engines within the EU will have to closely weigh freedom of speech against the protection of privacy, keeping the current case-by-case approach to deletion requests.

Since the Right to be Forgotten was first established by the CJEU in 2014, Google has received over 3.3 million deletion requests and has complied with the delisting of links from its search engine in 45% of the cases. As it stands, even where deletion requests are honoured, the delisted links can still be accessed by using a VPN to reach non-EU versions of the search engine, thereby circumventing the geoblocking. This is an issue to which a solution has not yet been found.

Data Breach: Millions of patient records available on the Internet

20. September 2019

As reported by the US investigative journalism platform ProPublica and the German broadcaster Bayerischer Rundfunk, millions of highly sensitive patient records were discovered freely accessible on the Internet.

Among the data sets are high-resolution X-ray images, breast cancer screenings, CT scans and other medical images. Most of them are accompanied by personal data such as birth dates, names and information about the patient’s doctor and medical treatment. The data had been accessible for years on unprotected servers.

In Germany, around 13,000 data records are affected, and more than 16 million worldwide, including more than 5 million patients in the USA.

When X-ray or MRI images of patients are taken, they are stored on “Picture Archiving and Communication System” (PACS) servers. If these servers are not sufficiently secured, the data is easy to access. In 2016, Oleg Pianykh, Professor of Radiology at Harvard Medical School, published a study on unsecured PACS servers. He was able to locate more than 2,700 open systems, but the study did not prompt anyone in the industry to act.

The German Federal Office for Information Security (BSI) has now informed authorities in 46 countries. It remains to be seen how they will react to the incident.
