Category: General

Hearing on the legal challenge of the SCCs and the EU-US Privacy Shield before the CJEU

17. July 2019

On Tuesday last week, the European Court of Justice (CJEU) held the hearing in case C-311/18, commonly known as “Schrems II”, following a complaint to the Irish Data Protection Commission (DPC) by Maximilian Schrems about the transfer of his personal data from Facebook Ireland to Facebook in the U.S. The case deals with two consecutive questions. The first is whether U.S. law, in particular the Foreign Intelligence Surveillance Act (FISA), which provides a legal basis for national security agencies to access the personal data of citizens of the European Union (EU), violates EU data protection law. If confirmed, this would raise the second question, namely whether the current legal data transfer mechanisms are invalid (we already reported on the background).

If both the EU-US Privacy Shield and the EU Standard Contractual Clauses (SCCs), currently the primarily used transfer mechanisms, were ruled invalid, businesses would probably face a complex and difficult scenario. As Gabriela Zanfir-Fortuna, senior counsel at the Future of Privacy Forum, said, the case could have a far greater impact than the first Schrems/EU-US Safe Harbor case, because this time it could affect not only data transfers from the EU to the U.S. but transfers from the EU to all countries around the world where international data transfers are based on the SCCs.

Facebook's lawyer, Paul Gallagher, argued along the same lines. He told the CJEU that if the SCCs were held invalid, “the effect on trade would be immense.” He added that not all U.S. companies are even covered by FISA, which is what would oblige them to provide law enforcement agencies with EU personal data. In particular, Facebook could not be held responsible for unduly handing personal data over to national security agencies, as there was no evidence of it doing so.

Eileen Barrington, counsel for the US government, assured the court, referring to a “hypothetical scenario” in which the US would tap data streams from a cable in the Atlantic, that this was not about “undirected” mass surveillance but about “targeted” collection of data. This, she said, was a lesson learned from the Snowden revelations, after which the US wanted to regain the trust of Europeans: only suspicious material would be filtered out using particular selectors. She also had a message for the European sense of security: “It has been proven that there is an essential benefit to the signals intelligence of the USA – for the security of American as well as EU citizens.”

The crucial factor for the outcome of the proceedings is likely to be how effective the CJEU considers the legal remedies available to EU data subjects to be. Throughout the hearing, there were serious doubts about this. The monitoring of non-US citizens' data is essentially based on a presidential directive and an executive order, i.e. on government orders rather than formal laws. EU citizens, moreover, are none the wiser, since, as many critics conclude, they do not know whether they are actually under surveillance or not. There also remains the issue of the independence of the ombudsperson the US committed itself to establishing in the Privacy Shield agreement: he or she may be independent of the intelligence agencies, but most likely not of the government.

Henrik Saugmandsgaard Øe, the Advocate General responsible for the case, intends to present his opinion, which is not binding on the judges, on December 12th. The court's decision is then expected in early 2020. According to the CJEU judge and judge-rapporteur in the case, Thomas von Danwitz, digital services and networking would in any case be considerably compromised if the CJEU were to declare the current content of the SCCs ineffective.

Record fine by ICO for British Airways data breach

11. July 2019

After a data breach in 2018 that affected 500,000 customers, British Airways (BA) has now been fined a record £183m by the UK's Information Commissioner's Office (ICO). According to the BBC, Alex Cruz, chairman and CEO of British Airways, said he was “surprised and disappointed” by the ICO's initial findings.

The breach was caused by a hacking attack that managed to inject a script into the BA website. Unsuspecting users trying to access the BA website were diverted to a fraudulent site, which harvested their information, including e-mail addresses, names and credit card details. While BA stated that it would reimburse every affected customer, its owner IAG declared through its chief executive that it would take “all appropriate steps to defend the airline's position”.

The ICO said it was the biggest penalty it had ever handed out and made public under the new rules of the GDPR. “When an organization fails to protect personal data from loss, damage or theft, it is more than an inconvenience,” ICO Commissioner Elizabeth Denham told the press.

In fact, the GDPR allows companies to be fined up to 4% of their annual turnover for data protection infringements. By comparison, the £183m fine British Airways received equals roughly 1.5% of its worldwide turnover for 2017, well below the possible maximum of 4%.
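As a back-of-the-envelope check, the turnover implied by the reported percentage can be derived from the fine itself (the turnover figure below is inferred from the 1.5% stated above, not an official company figure):

```python
# Sketch: relate the ICO fine to the GDPR turnover cap.
fine = 183_000_000                 # £183m fine announced by the ICO

# Worldwide 2017 turnover implied by the reported 1.5%
# (an inference, not BA's officially reported figure)
implied_turnover = fine / 0.015

# GDPR ceiling: up to 4% of annual worldwide turnover
max_fine = 0.04 * implied_turnover

print(f"Implied turnover: £{implied_turnover / 1e9:.1f}bn")
print(f"Theoretical maximum fine at 4%: £{max_fine / 1e6:.0f}m")
```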

BA can still appeal against both the findings and the scale of the fine before the ICO makes its final decision.

China: Tourists' mobile phones are scanned by an app

8. July 2019

Foreign tourists who want to enter the Chinese province of Xinjiang by land are spied on via an app.
For the first time, journalists of the Süddeutsche Zeitung, Vice's Motherboard and various other media outlets, in cooperation with IT experts of the Ruhr University Bochum, have succeeded in decrypting a Chinese surveillance software that also targets foreigners.

It has been known for some time that the Chinese authorities use apps to monitor the Uighur residents of Xinjiang province (we reported). What is new is that foreign tourists and business travellers coming from Kyrgyzstan to Xinjiang by land have to hand in their mobile phones at the border, where the Android app “Fengcai” (“collecting bees”) is then installed on them. They are not explicitly informed about this.

The app gets access to the contacts, the calendar, the SMS messages, the location and the call lists and transmits them to a computer of the border police. In addition, the app scans the phone for more than 70,000 files that are suspicious from the Chinese government's point of view. Many of the listed files relate to extremist content connected to the so-called Islamic State, but harmless religious content and files related to Taiwan, Tibet or the Dalai Lama are also on the list. If the app discovers anything, it emits a warning tone and thereby alerts the border police.
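According to the researchers' analysis, the app reportedly flags files by comparing their hashes against a built-in list. A minimal sketch of such a hash-based scan (the use of MD5, the function name and the placeholder hash are illustrative assumptions, not the app's actual code):

```python
import hashlib
from pathlib import Path

# Illustrative blocklist; the real app reportedly ships ~70,000 entries.
SUSPICIOUS_HASHES = {
    "5f4dcc3b5aa765d61d8327deb882cf99",  # MD5 of the string "password"
}

def scan_directory(root: str) -> list[Path]:
    """Return all files under `root` whose MD5 digest is on the blocklist."""
    flagged = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.md5(path.read_bytes()).hexdigest()
            if digest in SUSPICIOUS_HASHES:
                flagged.append(path)
    return flagged
```

A real scanner would stream large files in chunks rather than reading them whole, but the matching logic stays the same.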

The app also scans the phone to determine which apps the user has installed and even extracts usernames. Several antivirus firms have already updated their products to identify the app as malware.

Neither the Chinese authorities nor the company that developed the app reacted to a request for comment.

Category: General · Personal Data

Texas amends Data Breach Notification Law

2. July 2019

The Governor of Texas, Greg Abbott, recently signed House Bill 4390 (HB 4390), which modifies the state's current data breach notification law and introduces an advisory council (the “Texas Privacy Protection Advisory Council”) charged with studying data privacy laws in Texas, other states and other relevant jurisdictions.

Prior to the amendment, businesses had to disclose data breaches to the data subjects “as quickly as possible”. The bill now sets a concrete time period for notifying individuals whose sensitive personal information has been acquired by an unauthorized person: individual notice must be provided within 60 days of discovering the breach.

If more than 250 residents of Texas are affected by a data breach, the Texas Attorney General must also be notified within 60 days. Such a notification must include:
– A detailed description of the nature and circumstances of the data breach;
– The number of the affected residents at that time;
– The measures taken regarding the breach and any measures the responsible person intends to take after the notification;
– Information on whether the law enforcement is engaged in investigating the breach.

The amendments take effect on January 1, 2020.

Category: General · USA

Italian DPA fines Facebook

The Italian Data Protection Authority (Garante per la protezione dei dati personali) fined Facebook over the Cambridge Analytica scandal, which dates back to 2015 and was uncovered in 2018. The scandal is connected to the presidential campaign of the current US president, Donald Trump.

The Garante imposed a fine of EUR 1,000,000 for the misuse of data of more than 200,000 Italian Facebook users and their Facebook friends. According to the Garante, the misused data was not transferred to Cambridge Analytica, which a Facebook spokesman also confirmed. Nevertheless, the high fine was imposed.

The fine is still based on the old Italian data protection law, because the GDPR, which now applies throughout Europe, was not yet in force at the time of the misuse.

Facebook has to answer for the scandal not only in Italy; legal consequences are also looming in the USA.


Consumers should know how much their data is worth

27. June 2019

US Senators Mark R. Warner (Democrat) and Josh Hawley (Republican) want Facebook, Google and Co. to disclose exactly how much their users' data, measured in dollars and cents, is worth to them.

Last Sunday, the two senators announced their intention for the first time on a US talk show: every three months, each user is to receive an overview of which data has been collected and stored and how the respective provider values it. In addition, the aggregated value of all user data is to be reported annually to the US Securities and Exchange Commission. In this report, the companies are to disclose how they store, process and protect data and how, and with which partner companies, they generate revenue from the data. All companies with more than 100 million monthly users would be affected.

The value of user data has risen enormously in recent years, yet companies have so far protected their internal calculations as trade secrets. In addition, there is no recognized method for quantifying the value of user data; it only becomes apparent when a company is sold or valued in an initial public offering (IPO). In the case of the WhatsApp takeover it was $55 per user; in the case of Skype, $200.

But one can doubt the significance of these figures. A further indication can be found in advertising revenues, which companies disclose quarterly. At the end of 2018, Facebook earned around $6 per user worldwide, while Amazon earned $752 per user. These figures are likely to rise in the future. “For years, social media companies have told consumers that their products are free to the user. But that's not true – you are paying with your data instead of your wallet,” said Senator Warner. “But the overall lack of transparency and disclosure in this market have made it impossible for users to know what they're giving up, who else their data is being shared with, or what it's worth to the platform. […]” Experts believe it is important for consumers to know the value of their data, because only when you know the value of a good can you properly appraise it.
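The per-user figures quoted above are simple average-revenue-per-user (ARPU) calculations: period revenue divided by the user base. A minimal sketch of the arithmetic (the input numbers are illustrative assumptions, not the companies' reported figures):

```python
def arpu(revenue: float, users: float) -> float:
    """Average revenue per user: period revenue divided by user count."""
    return revenue / users

# Illustrative only: $13.8bn revenue spread over 2.3bn monthly users
print(arpu(13_800_000_000, 2_300_000_000))  # → 6.0
```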

On Monday, Warner and Hawley plan to introduce the Designing Accounting Safeguards to Help Broaden Oversight And Regulations on Data (DASHBOARD) Act in the Senate. It remains to be seen whether their plans will meet with the approval of the other senators.

German Data Protection Authority of Baden-Württemberg fines an employee of a public body

24. June 2019

According to an announcement by the LfDI Baden-Württemberg, one of the 16 German state Data Protection Authorities (DPAs), a first fine of 1,400 Euro has been imposed on an employee of a public body. The police officer unlawfully retrieved personal data available to him through his job solely for private purposes. According to the DPA's statement, this was the first fine imposed on an employee of a public body since the EU General Data Protection Regulation (GDPR) became applicable.

The police officer used his official user ID to request the owner data relating to the license plate of a casual private acquaintance, without any reference to his official duties. Using the personal data obtained in this way, he then made a second enquiry with the Federal Network Agency, requesting not only the data subject's personal data but also the fixed-line and mobile numbers. Without the data subject's request or consent, the police officer finally used the mobile number obtained to contact the data subject by phone.

By filing the aforementioned requests for private purposes and using the mobile number obtained in this way to make private contact, the police officer autonomously processed personal data for non-official purposes. His department therefore cannot be held responsible, as the action was not taken in the course of his official duties. As a consequence, the police officer is responsible for this breach of the GDPR as an individual: the public body is not responsible, as it cannot be subject to sanctions under the State Data Protection Act, and the police officer himself is not classified as a public body within the meaning of this law.

Taking into account that it was the first such violation of the data protection laws, a fine of 1,400 Euro pursuant to Article 83(5) GDPR was considered appropriate. The case shows, however, that even an employee of a public body may become subject to a fine if he or she unlawfully processes personal data for purely private purposes.

Category: GDPR · General

Germany: Data of smart home devices as evidence in court?!

11. June 2019

According to a draft resolution for the upcoming conference of interior ministers of the 16 German federal states, data from smart home devices are to be admitted as evidence in court. The ministers of the federal states believe that the digital traces could help to solve crimes in the future, especially capital crimes and terrorist threats.

The interior ministers want to dispel constitutional concerns, because the data in question is of great interest to the security authorities. According to the draft resolution, judicial approval will be sufficient in the future. However, domestic politicians expect criticism and resistance from the data protection commissioners of both the federal states and the federal government.

Smart home devices are technical devices such as televisions, refrigerators or voice assistants that are connected to the Internet. Grouped under the term Internet of Things (IoT), they can be controlled via smartphone and make daily life easier for the user. In the process, large amounts of data are stored and processed.

We have already reported several times about smart home devices, including the fact that in the USA data from smart home devices have already helped to solve crimes (in German).

It cannot be denied that data from smart home devices can (under certain circumstances) help to solve crimes, but it must not be overlooked that, due to the technical design, a 100% reliable statement cannot be made. A simple example: whether the resident was actually at home at the time in question, was still on his way home, or merely wanted to give the impression of being at home while in fact being on the other side of the world cannot be determined from smart home data alone. The ability to control lighting and heating via smartphone, for instance, allows the user to do so from anywhere at any time.

In addition, it should be taken into consideration that such interventions, or the mere possibility of intervention, may violate a person’s right to informational self-determination, and it is precisely the protection of this constitutionally protected right that data protection is committed to.

Update: The 210th Conference of interior ministers has since come to an end, and the admission of smart home data as evidence in court was rejected. The resolutions of the conference can be found here (in German).

Royal family uses GDPR to protect their privacy

22. May 2019

Last week, Prince Harry and Meghan Markle claimed another victory in the royal family's never-ending struggle with paparazzi photographers, securing “a substantial sum” in damages from an agency that had released intimate photos of the Oxfordshire home rented by the Duke and Duchess of Sussex to the media. In a statement, Splash News apologized and acknowledged that the situation represented “an error of judgement”.

The paparazzi agency Splash News took photos and footage of the couple's former Cotswolds home, including their living room, dining area and bedroom, using a helicopter, and promptly sold them to various news outlets. Prince Harry's lawyers argued that this constituted a breach of his right to privacy under Art. 7 and 8 ECHR as well as a breach of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).

Considering the strategy of the Duke's lawyers, it looks as though the royal family has found a potentially attractive alternative to claims of defamation or invasion of privacy: in contrast to such claims, a claimant relying on data protection law needs to prove neither that a statement is defamatory and meets the threshold of serious harm to reputation, nor that the information in question is private.

In any case, the (new) European data protection legislation grants all data subjects, regardless of their position and/or fame, a right to respect for their private and family lives and to the protection of their personal data. In particular, Article 5 GDPR requires organisations to handle personal data (such as names, pictures and stories relating to individuals) fairly and in a transparent manner, and to use it only for legitimate purposes.

Moreover, when using pictures and footage of an individual's private or even intimate sphere, an organization needs a specific legal basis, such as a contract, the individual's consent, or the ability to argue that using the photos and footage was “in the public interest” or covered by a “legitimate interest”. As contract and consent can be ruled out here, the only bases that might be considered are a public interest or a legitimate interest of the organization itself. Taking into account the means by which these photos and footage of the Duke and Duchess were obtained, neither of these interests can outweigh the interest in protecting the rights and freedoms of individuals' private and intimate sphere.

In light of this case, it seems quite likely that the European data protection regime has changed the way in which celebrities and the courts enforce the heavily contested threshold of whether or not the public is allowed to see and be informed about certain parts and aspects of famous people's lives.


The global competition for Artificial Intelligence – Is it Time to Regulate Now?

21. May 2019

This year’s edition of the European Identity & Cloud Conference 2019 took place last week.
In the context of this event, various questions relevant from a data protection perspective arose. Dr. Karsten Kinast, Managing Director of KINAST Attorneys at Law and Fellow Analyst of the organizer KuppingerCole, gave a keynote speech on the question of whether internationally uniform regulations should be created in the context of a global competition for artificial intelligence (AI). Dr. Kinast outlined the controversial debate about the danger of future AI on the one hand and the resulting legal problems and solutions on the other. At present, there is no form of AI that understands the concrete content of its processing. Moreover, AI has not yet been able to draw any independent conclusions from a processing operation or even base autonomous decisions on it. Furthermore, from today’s perspective it is not even known how such a synthetic intelligence could be created.
For this reason, it is not primarily a question (as a result) of developing a code of ethics in which AIs can unfold as independent subjects. Rather, from today’s perspective, it would be a matter of a far more profane view of responsibilities.

The entire lecture can be found here.
