
Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, 19 December, the Court of Justice of the European Union's (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) for personal data transfers to processors situated in third countries.

The case underlying the opinion originates in proceedings initiated by Mr. Maximillian Schrems, who challenged Facebook's business practice of transferring the personal data of its European subscribers to servers located in the United States. That case (Schrems I) led the CJEU on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook had chosen to rely on to legitimize its EU-U.S. data flows, raising arguments similar to those in the Schrems I case. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling (the Schrems II case).

In the newly published opinion, the Advocate General considers the established SCCs valid for commercial transfers, despite the possibility that public authorities in the third country may process the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of the high level of protection is not guaranteed solely by an adequacy decision of the European Commission, but equally by the contractual safeguards the exporter has in place, which must match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers regardless of the third country and the level of protection provided there. In addition, and in light of the Charter, both the controller and the supervisory authority are obliged to suspend any third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, since those only concern the SCCs under Decision 2010/87, taking the question of the Privacy Shield's validity off the table.

While the Advocate General's opinion is not binding, it proposes a legal solution for the cases for which the CJEU is responsible. However, the CJEU's decision on the matter is not expected until early 2020, so the outcome of the case is eagerly awaited.

Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for storing the personal data of tenants that did not provide for the possibility of removing data that was no longer required. Personal data of tenants were stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore still be viewed even though the data were years old and no longer served the purpose of their original collection. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, tax, social security and health insurance data, and bank statements.

After the commissioner had urgently recommended changing the archive system at the first inspection in 2017, the company was still unable in March 2019, more than one and a half years after that first inspection and nine months after the GDPR became applicable, to demonstrate either a cleansing of its database or legal grounds for the continued storage. Although the company had made preparations to remedy the identified deficiencies, these measures did not bring the storage of personal data into a lawful state. The imposition of a fine was therefore required for a violation of Article 25(1) GDPR as well as Article 5 GDPR for the period between May 2018 and March 2019.

The starting point for the calculation of fines is, among other things, the worldwide annual turnover of the affected company in the preceding year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the legally prescribed framework for the assessment of fines for the established data protection violation amounted to approximately 28 million euros.
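To illustrate how such a framework follows from turnover, the short sketch below applies the turnover-based cap of Article 83(4) GDPR (up to 10 million euros or 2% of worldwide annual turnover, whichever is higher). The turnover figure used is an assumption chosen only to reproduce the reported order of magnitude, as the report merely states that it exceeded one billion euros.

```python
# Rough sketch of the GDPR fine ceiling under Article 83(4).
# The turnover figure is an assumption for illustration only; the report
# merely states that Deutsche Wohnen SE's 2018 turnover exceeded EUR 1 billion.

def fine_ceiling(annual_turnover_eur: float,
                 fixed_cap_eur: float = 10e6,
                 turnover_share: float = 0.02) -> float:
    """Upper limit of the fine framework: the higher of the fixed cap
    and the given share of worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_share * annual_turnover_eur)

assumed_turnover_eur = 1.4e9  # hypothetical, consistent with "exceeded one billion euros"
print(f"Fine ceiling: ~EUR {fine_ceiling(assumed_turnover_eur) / 1e6:.0f} million")  # ~28
```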

For the concrete determination of the amount of the fine, the commissioner applied the statutory criteria, taking into account all aggravating and mitigating aspects. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an inadmissible manner over a long period of time weighed particularly heavily against the company. However, the fact that the company had taken initial measures to remedy the unlawful situation and had cooperated well with the supervisory authority in formal terms was taken into account as a mitigating factor. Considering also that no abusive access to the improperly stored data could be established, a fine in the middle range of the prescribed framework was deemed appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.

Apple wants to evaluate “Siri”-recordings again

14. October 2019

Apple wants to evaluate Siri recordings again in the future. After it became public that Apple automatically saved the audio recordings of Siri requests and had some of them evaluated by employees of external companies, the company stopped this practice. Although Apple stated that less than 0.2 % of the queries were actually evaluated, the system received around 10 billion queries per month (as of 2018).

In the future, audio recordings from the Siri voice assistant will be stored and evaluated again, but this time only after the user has consented. This procedure is being tested with the latest beta versions of Apple's iOS software for iPhone and iPad.

Apple itself hopes that many users will agree and thus contribute to the improvement of Siri. A later opt-out is possible at any time, but only for each device individually. In addition, only Apple's own employees, who are, according to Apple, subject to strict confidentiality obligations, will evaluate the recordings. Recordings generated by an unintentional activation of Siri will be deleted completely.

In addition, a delete function for Siri-recordings is to be introduced. Users can then choose in their settings to delete all data recorded by Siri. If this deletion is requested within 24 hours of a Siri request, the respective recordings and transcripts will not be released for evaluation.

However, even if the user does not opt in to the evaluation of his Siri recordings, a computer-generated transcript will continue to be created and kept by Apple for a certain period of time. Although these transcripts are to be anonymized and linked to a random ID, they could still be evaluated, according to Apple.


China publishes provisions on the protection of personal data of children

10. October 2019

On 23 August 2019, the Cyberspace Administration of China published regulations on the cyber protection of personal data of children, which came into force on 1 October 2019. China thus enacted the first rules focusing exclusively on the protection of children’s personal data.

In the regulations, “children” refers to minors under the age of 14. This corresponds to the definition in the national “Information Security Technology – Personal Information Security Specification”.

The provisions regulate activities related to the collection, storage, use, transfer and disclosure of personal data of children through networks located on the territory of China. However, the provisions do not apply to activities conducted outside of China or to similar activities conducted offline.

The provisions set a higher standard of consent than China's Cybersecurity Law. To obtain the consent of a guardian, a network operator has to provide the possibility of refusal and expressly inform the guardian of the following:

  • Purpose, means and scope of collection, storage, use, transfer and disclosure of children’s personal information;
  • Storage location of children’s personal information, retention period and how the relevant information will be handled after expiration of the retention period;
  • Safeguard measures protecting children’s personal information;
  • Consequences of rejection by a guardian;
  • The channels and means of filing or reporting complaints; and
  • How to correct and delete children’s personal information.

The network operator also has to restrict internal access to children’s personal information. In particular, before accessing the information, personnel must obtain consent of the person responsible for the protection of children’s personal data or an authorised administrator.

If children's personal data are processed by a third-party processor, the network operator is obliged to carry out a security assessment of the data processor commissioned to process the data. It also has to conclude an entrustment agreement with the data processor. The data processor is obliged to support the network operator in fulfilling a guardian's request to delete a child's data after termination of the service. Sub-delegation or subcontracting by the data processor is prohibited.

If children's personal data are transferred to a third party, the network operator shall carry out a security assessment of that third party or commission an external agency to carry out such an assessment.

Children or their legal guardians have the right to demand the deletion of children's personal data under certain circumstances. In any case, they have the right to demand the correction of children's personal data that a network operator has collected, stored, used or disclosed. In addition, the legal guardians have the right to withdraw their consent in its entirety.

In the event of actual or potential data breaches, the network operator is obliged to immediately initiate its emergency plan and take remedial action. If the breach has or may have serious consequences, the network operator must immediately report it to the competent authorities and inform the affected children and their legal guardians by e-mail, letter, telephone or push notification. Where it is impracticable to notify each data subject individually, the network operator shall take appropriate and effective measures to make the notification public. However, the rules do not contain a precise definition of "serious consequences".

In the event that the data breach is caused or observed by a data processor, the data processor is obliged to inform the network operator in good time.

CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google will not have to apply the General Data Protection Regulation's (GDPR) "Right to be Forgotten" to its search engines outside of the European Union. The ruling is a victory for Google in a case against a fine imposed by the French Commission nationale de l'informatique et des libertés (CNIL) in 2015, in an effort to force the company and other search engines to take down links globally.

Since the internet has grown into a worldwide medium without borders, this case is viewed as a test of whether people can demand a blanket removal of information about themselves from searches without overriding the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also expressed in the judgement that the protection of personal data is not an absolute right.

While this means that companies are not forced to delist sensitive information from their search engines outside the EU upon request, they must take measures to seriously discourage internet users from switching to non-EU versions of their pages. Furthermore, companies with search engines within the EU will have to continue weighing freedom of speech against the protection of privacy, keeping the current case-by-case approach to deletion requests.

Since the Right to be Forgotten was first established by the CJEU in 2014, Google has received over 3.3 million deletion requests and has complied with the delisting of links from its search engine in 45% of cases. As it stands, even where Google complies with a deletion request, the delisted links can still be accessed by using a VPN to reach non-EU versions of the search engine, thereby circumventing the geoblocking. No solution to this issue has yet been found.

Data Breach: Millions of patient data available on the Internet

20. September 2019

As reported by the US investigative journalism platform ProPublica and the German broadcaster Bayerischer Rundfunk, millions of highly sensitive patient data were discovered freely accessible on the Internet.

Among the data sets are high-resolution X-ray images, breast cancer screenings, CT scans and other medical images. Most of them include personal data such as dates of birth, names and information about the treating doctor and the medical treatment. The data had been accessible on unprotected servers for years.

In Germany, around 13,000 data records are affected, and more than 16 million worldwide, including more than 5 million patients in the USA.

When X-ray or MRI images of patients are taken, they are stored on "Picture Archiving and Communication System" (PACS) servers. If these servers are not sufficiently secured, it is easy to access the data. In 2016, Oleg Pianykh, Professor of Radiology at Harvard Medical School, published a study on unsecured PACS servers. He was able to locate more than 2,700 open systems, but the study did not prompt anyone in the industry to act.

The German Federal Office for Information Security (BSI) has now informed authorities in 46 countries. It remains to be seen how they will react to the incident.

Google strives to reconcile advertising and privacy

27. August 2019

While other browser developers are critical of tracking, Google wants to introduce new standards that continue to enable personalized advertising. With the implementation of the "Privacy Sandbox" and the introduction of a new identity management system, the developer of the Chrome browser wants to bring browsers to a uniform level in the processing of user data and protect users' privacy more effectively.

The suggestions are the first steps of the privacy initiative announced by Google in May. Google has published five ideas. For example, browsers are to manage a “Privacy Budget” that gives websites limited access to user data so that users can be sorted into an advertising target group without being personally identified. Google also plans to set up central identity service providers that offer limited access to user data via an application programming interface (API) and inform users about the information they have passed on.

Measures like Apple's Intelligent Tracking Prevention are not in Google's interest, as Google generates much of its revenue from personalized advertising. In a blog post, Google also said that blocking cookies promotes non-transparent techniques such as fingerprinting. Moreover, without the ability to display personalized advertising, the future of publishers, whose costs are covered by advertising, would be jeopardized. Recent studies have shown that publishers' financing decreases by an average of 52% if advertising loses relevance due to the removal of cookies.

Based on these ideas, the discussion among developers about the future of web browsers and how to deal with users’ privacy should now begin. Google’s long-term goal is a standardization process to which all major browser developers should adhere. So far, Google has had only limited success with similar initiatives.

ICO releases a draft Code of Practice to consult on the Use of Personal Data in Political Campaigning

14. August 2019

The United Kingdom's Information Commissioner's Office (ICO) is consulting on a new framework code of practice regarding the use of personal data in relation to political campaigns.

The ICO states that in any democratic society it is vital for political parties, candidates and campaigners to be able to communicate effectively with voters. Equally vital, though, is that all organisations involved in political campaigning use personal data in a transparent and lawful way that is understood by the people.

With the rise of the internet, political campaigning has become increasingly sophisticated and innovative. Campaigning has changed, using new technologies and techniques to understand and target voters, such as social media, the electoral register or screening names for ethnicity and age. In a statement from June, the ICO addressed the risk that comes with this innovation, which, intended or not, can undermine the democratic process through hidden manipulation based on the processing of personal data that people do not understand.

In this light, the ICO notes that its current guidance is outdated, since it has not been updated since the introduction of the General Data Protection Regulation (GDPR) and does not reflect modern campaigning practices. The framework does not establish new requirements for campaigners; instead, it aims to explain and clarify data protection and electronic marketing laws as they already stand.

Before drafting the framework, the Information Commissioner launched a call for views in October 2018, seeking input from various people and organisations. The framework is intended to take into account the responses the ICO received in that process.

Intended to serve as the basis of a statutory code of practice if the relevant legislation is introduced, the draft framework code of practice is now out for public consultation and will remain open until October 4th.

Hackers steal millions of Bulgarians’ financial data

18. July 2019

After a cyberattack on the Bulgarian tax agency (NRA), millions of taxpayers' financial data have been stolen. It is estimated that most working adults in the country of 7 million are affected, with at least some of their data compromised. The stolen data include names, addresses, income and social security information.

The attack happened in June, but an e-mail from the self-proclaimed perpetrator was sent to Bulgarian media on Monday. It stated that more than 110 databases of the agency had been compromised, with the hacker calling the NRA's cybersecurity a parody. The Bulgarian media were also offered access to the stolen data. One stolen file, e-mailed to the newspaper 24 Chasa, contained up to 1.1 million personal identification numbers together with income, social security and healthcare figures.

The country's finance minister Vladislav Goranov has apologized in parliament and to the Bulgarian citizens, adding that about 3% of the tax agency's database had been affected. He made clear that whoever attempted to exploit the stolen data would face the full force of Bulgarian law.

As a result of this hacking attack, the Bulgarian tax agency now faces a fine of up to 20 million euros from the Commission for Personal Data Protection (CPDP). In addition, the incident has reignited an old debate about Bulgaria's lax cybersecurity standards and their adjustment to modern times.

Record fine by ICO for British Airways data breach

11. July 2019

After a data breach in 2018 that affected 500,000 customers, British Airways (BA) has now been fined a record £183m by the UK's Information Commissioner's Office (ICO). According to the BBC, Alex Cruz, chairman and CEO of British Airways, said he was "surprised and disappointed" by the ICO's initial findings.

The breach resulted from a hacking attack that managed to place a script on the BA website. Unsuspecting users trying to access the BA website were diverted to a fraudulent site, which collected their information, including e-mail addresses, names and credit card details. While BA stated that it would reimburse every affected customer, its owner IAG declared through its chief executive that it would take "all appropriate steps to defend the airline's position".

The ICO said that it was the biggest penalty it had ever handed out and made public under the new rules of the GDPR. "When an organization fails to protect personal data from loss, damage or theft, it is more than an inconvenience," Information Commissioner Elizabeth Denham told the press.

The GDPR allows companies to be fined up to 4% of their annual turnover for data protection infringements. The £183m fine British Airways received equals roughly 1.5% of its worldwide turnover for 2017, well below the possible maximum of 4%.
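For orientation, here is a small back-of-the-envelope sketch based purely on the figures quoted above; the implied turnover is derived from the reported fine and percentage, not taken from BA's published accounts.

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
# The implied turnover is derived from the reported fine and percentage;
# it is not taken from British Airways' published accounts.

fine_gbp = 183e6        # reported ICO fine of GBP 183 million
fine_share = 0.015      # reported: roughly 1.5% of 2017 worldwide turnover
gdpr_max_share = 0.04   # GDPR ceiling: up to 4% of annual turnover

implied_turnover = fine_gbp / fine_share                  # ~GBP 12.2 billion
theoretical_maximum = implied_turnover * gdpr_max_share   # ~GBP 488 million

print(f"Implied 2017 turnover: ~GBP {implied_turnover / 1e9:.1f} billion")
print(f"Theoretical GDPR maximum: ~GBP {theoretical_maximum / 1e6:.0f} million")
```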

BA can still appeal both the findings and the scale of the fine before the ICO's final decision is made.
