Germany: Telecommunications provider receives a 9.5 Million Euro GDPR fine

16. December 2019

The German Federal Commissioner for Data Protection and Freedom of Information (BfDI) has imposed a fine of 9.55 million euros on the major telecommunications provider 1&1 Telecom GmbH (1&1). This is the second multimillion-euro fine that the Data Protection Authorities in Germany have imposed. The first fine of this magnitude (14.5 million euros) was imposed last month on a real estate company.

According to the BfDI, the fine was imposed because of an inadequate authentication procedure in 1&1’s customer service department: any caller to 1&1’s customer service could obtain extensive personal customer data merely by providing a customer’s name and date of birth. The particular case that was brought to the data protection authority’s attention concerned a caller’s request for the new mobile phone number of an ex-partner.

The BfDI found that this authentication procedure violates Art. 32 GDPR, which obliges companies to take appropriate technical and organisational measures to systematically protect the processing of personal data.

After the BfDI had pointed out the deficient procedure to 1&1, the company cooperated with the authorities. In a first step, the company changed the two-factor authentication procedure in its customer service department to a three-factor procedure. Furthermore, it is working on an enhanced authentication system in which each customer will receive a personal service PIN.

In its statement, the BfDI explained that the fine was necessary because the violation posed a risk to the personal data of all of 1&1’s customers. However, because of the company’s cooperation with the authorities, the BfDI set the fine at the lower end of the scale.

1&1 has deemed the fine “absolutely disproportionate” and has announced that it will challenge the BfDI’s penalty notice in court.

Category: GDPR · General · German Law · Personal Data

India updates privacy bill

12. December 2019

The latest update of the Indian Personal Data Protection Bill is part of India’s broader efforts to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies to provide anonymised personal data, as well as other non-personal data, to help deliver government services and frame privacy policies. The draft defines “personal data” as information that can help to identify a person, including characteristics, traits and any other features of a person’s identity. “Sensitive personal data” also includes financial and biometric data. According to the draft, such “sensitive” data may be transferred outside India for processing, but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and to display a verification sign publicly. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could affect the way companies process, store and transfer Indian consumers’ data, and could therefore cause difficulties for top technology companies.

Dutch DPA issued a statement regarding cookie consent

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has recently issued a statement on compliance with the rules on cookie consent. According to the statement, the DPA reviewed 175 websites and e-commerce platforms to see whether they meet the requirements for the use of cookies. It found that almost half of the websites and almost all e-commerce platforms do not meet the requirements for cookie consent.

The data protection authority has contacted the companies concerned and requested them to adjust their cookie usage.

In its statement, the Data Protection Authority also refers to the “Planet49” judgment of the Court of Justice of the European Union (“CJEU”) and clarifies that pre-ticked boxes do not satisfy the obligation to obtain the user’s consent. Likewise, a user merely scrolling down the website is not equivalent to consenting to the use of cookies. Cookies that enable websites to track their users always require explicit consent.

Lastly, the DPA recalls that cookie walls, which prevent users who have not consented to the use of cookies from accessing the website, are not permitted.

Category: EU · GDPR · The Netherlands

Advocate General’s opinion on “Schrems II” is delayed

11. December 2019

The Court of Justice of the European Union (CJEU) Advocate General’s opinion in the case C-311/18 (‘Facebook Ireland and Schrems’) will be released on December 19, 2019. Originally, the CJEU announced that the opinion of the Advocate General in this case, Henrik Saugmandsgaard Øe, would be released on December 12, 2019. The CJEU did not provide a reason for this delay.

The prominent case deals with the complaint to the Irish Data Protection Commission (DPC) by privacy activist and lawyer Maximilian Schrems and the transfer of his personal data from Facebook Ireland Ltd. to Facebook Inc. in the U.S. under the European Commission’s controller-to-processor Standard Contractual Clauses (SCCs).

Perhaps the most consequential question that the High Court of Ireland set before the CJEU is whether transfers of personal data from the EU to the U.S. under the SCCs violate the rights of individuals under Articles 7 and/or 8 of the Charter of Fundamental Rights of the European Union (Question No. 4). The decision of the CJEU in “Schrems II” will also have ramifications for the parallel case T-738/16 (‘La Quadrature du net and others’). The latter case poses the question whether the EU-U.S. Privacy Shield for data transfers from the EU to the U.S. sufficiently protects the rights of EU individuals. If it does not, the European Commission would face a “Safe Harbor” déjà vu, having approved the new Privacy Shield in its adequacy decision of 2016.

The CJEU is not bound by the opinion of the Advocate General (AG), but in some cases the AG’s opinion may be a weighty indicator of the CJEU’s final ruling. The Court’s final decision is expected in early 2020.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four different cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the EU-U.S. Privacy Shield Framework. According to the FTC, Global Data and TDARX continued to claim participation in the EU-U.S. Privacy Shield after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to complete the annual verification that statements about their Privacy Shield practices were accurate. Also, according to the complaints, they did not affirm that they would continue to apply Privacy Shield protections to personal information collected during their participation in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by any government or self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protection to personal information collected during participation in the program. Otherwise, they are required to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the framework was established in 2016, the FTC has initiated a total of 21 enforcement measures in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and will be open to public comment for 30 days. The FTC will then decide whether to make the proposed consent orders final.

LGPD – Brazil’s upcoming Data Protection Law

28. November 2019

In August 2018, the National Congress of Brazil passed a new General Data Protection Law (“Lei Geral de Proteção de Dados” or “LGPD”). This law is slated to come into effect in August 2020. Prior to the LGPD, data protection in Brazil was primarily enforced through a collection of various legal frameworks, including the country’s Civil Rights Framework for the Internet (Internet Act) and its Consumer Protection Code.

The new legislation creates a completely new general framework for the processing of personal data of individuals in Brazil, regardless of where the data processor is located. Brazil has also established its own Data Protection Authority to enforce the law. Although the Data Protection Authority will initially be tied to the Presidency of the Federative Republic of Brazil, it is set to become autonomous in the long term, in about two years.

Like the GDPR, the new framework applies extraterritorially, which means that the law will apply to any individual or organisation, private or public, that processes or collects personal data in Brazil, regardless of where the processor is based. The LGPD does not apply to data processing for strictly personal, academic, artistic or journalistic purposes.

Although the LGPD is largely influenced by the GDPR, the two frameworks also differ in several respects. For instance, they define personal data differently: the LGPD’s definition is broad and covers any information relating to an identified or identifiable natural person. Furthermore, the LGPD does not permit cross-border transfers based on the controller’s legitimate interest. And while the GDPR sets a 72-hour deadline for data breach notification, the LGPD’s deadline is only loosely defined, to name just a few differences.

Category: General · Personal Data

Austrian data protection authority imposes 18 million euro fine

22. November 2019

The Austrian Data Protection Authority (DPA) has imposed a fine of 18 million euros on Österreichische Post AG (Austrian Postal Service) for violations of the GDPR.

Among other things, the company had collected data on the “political affinity” of 2.2 million customers, thereby violating the GDPR. This information was intended to enable political parties to send targeted election advertising to Austrian residents.

In addition, the company collected data on the frequency of parcel deliveries and on customers’ probability of relocating, for use in direct marketing.

The penalty is not yet final. Österreichische Post AG, half of which is owned by the Austrian state, can appeal the decision before the Federal Administrative Court. The company has already announced its intention to take legal action.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned about the risks of using such technology in the public sector, the CNIL made it clear that the use of facial recognition carries vast political and societal implications and risks. In its report, the CNIL explicitly stated that the software can yield very biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of unlawful use of the technology, the CNIL set out three main requirements in its report. It recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines around the use of facial recognition in the future, it has also made clear that it will fulfill its role by providing support and counsel on the legal and methodological issues that may arise from the experimental use of facial recognition.

Category: EU · French DPA · GDPR · General

Health data transferred to Google, Amazon and Facebook

18. November 2019

Websites specialising in health topics transfer information about their users to Google, Amazon and Facebook, as the Financial Times reports.

The transferred information is obtained through cookies and includes users’ medical symptoms and conditions.

According to the Financial Times report, the transfer takes place without the express consent of the data subjects, contrary to data protection law in the UK. Beyond the legal obligations in the UK, the website operators’ use of these cookies also contradicts the requirements of the GDPR.

Under the GDPR, the processing of health data falls under Art. 9 GDPR and is prohibited by default, meaning that the processing of health data is forbidden unless the data subject has given explicit consent.

The report is also interesting in light of the CJEU’s cookie judgment (we reported). According to that judgment, consent must be obtained for the use of each cookie.

Accordingly, the website operators’ practices will (hopefully) change in order to comply with the new case law.


Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for the storage of tenants’ personal data that did not provide for the possibility of removing data that was no longer required. Personal data of tenants were stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore be viewed, even though some of it was years old and no longer served the purpose of its original collection. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, tax, social security and health insurance data, and bank statements.

After the commissioner had urgently recommended changing the archive system during the first inspection in 2017, the company was unable, in March 2019, more than one and a half years after that first inspection and nine months after the GDPR came into force, to demonstrate either a cleansing of its database or legal grounds for the continued storage. Although the company had made preparations to remedy the identified deficiencies, these measures did not bring the storage of personal data into compliance with the law. The imposition of a fine for the period between May 2018 and March 2019 was therefore compelled by a violation of Article 25(1) GDPR as well as Article 5 GDPR.

The starting point for the calculation of fines is, among other things, the worldwide annual turnover of the affected company in the previous year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the legally prescribed framework for assessing the fine for the established data protection violation amounted to approximately 28 million euros.

For the concrete determination of the amount of the fine, the commissioner applied the legal criteria, taking into account all aggravating and mitigating factors. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an inadmissible manner over a long period of time weighed particularly heavily. The fact that the company had taken initial measures to remedy the unlawful situation and had cooperated well with the supervisory authority in formal terms was taken into account as a mitigating factor, as was the fact that no abusive access to the stored data could be established. A fine in the middle range of the prescribed framework was therefore appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.
