Data Leak at South African IT firm exposes over 1 Million Web Browsing Records

18. December 2019

Security researchers at vpnMentor recently discovered an unsecured and unencrypted database owned by the South African information and communications technology (ICT) company Conor. The breached database consisted of daily logs of user activity by customers of Internet Service Providers (ISPs) that used web filtering software built by Conor.

The leak exposed users’ internet traffic and activity, along with their personally identifiable information and other highly sensitive, private details. For two months, it revealed activity logs such as website URLs, IP addresses, index names and MSISDN codes, which identify mobile users on a specific network. The exposed browsing activity included highly sensitive details such as attempts to visit pornography websites, social media accounts, online storage services including iCloud, and messaging apps such as WhatsApp. In total, more than 890 GB of data and over 1 million records were exposed.

“Because the database gave access to a complete record of each user’s activity in a session, our team was able to view every website they visited – or attempted to visit. We could also identify each user,” the vpnMentor team explained in their statement. “For an ICT and software development company not to protect this data is incredibly negligent. Conor’s lapse in data security could create real-world problems for the people exposed.”

Such an incident could cause Conor significant reputational damage and loss of integrity. In addition, the leak exposed how the company’s filter system works and how it can be circumvented, which could render the product ineffective against attempts to bypass it and ultimately redundant. As a result, Conor may lose business, since clients may no longer feel they can trust the company and the values it promotes.

Irish DPC updates Guidance on Data Processing’s Legal Bases

17. December 2019

The Irish Data Protection Commission (DPC) has updated its guidance on the legal bases for personal data processing. It focuses on data processing under the European General Data Protection Regulation (GDPR) as well as data processing requirements under the European Law Enforcement Directive.

The updated guidance aims to make companies more aware of their reasons for processing personal data and of the need to choose the right legal basis, as well as to help data subjects determine whether their data is being processed lawfully.

The guidance focuses on the different legal bases in Art. 6 GDPR, namely consent, contract, legal obligation, vital interests, public task and legitimate interests. The Irish DPC states that controllers not only have to choose the right legal basis, but also have to understand the obligations that come with it, which is why the DPC wanted to go into further detail.

Overall, the guidance is intended to aid both controllers and data subjects. It is meant to support a better understanding of the terminology as well as of the legal requirements the GDPR sets out for processing personal data.

Germany: Telecommunications provider receives a 9.5 Million Euro GDPR fine

16. December 2019

The German Federal Commissioner for Data Protection and Freedom of Information (BfDI) has imposed a fine of 9.55 million euros on the major telecommunications services provider 1&1 Telecom GmbH (1&1). This is the second multimillion-euro fine that the Data Protection Authorities in Germany have imposed. The first fine of this magnitude (14.5 million euros) was imposed last month on a real estate company.

According to the BfDI, the reason for the fine was an inadequate authentication procedure in 1&1’s customer service department: any caller to 1&1’s customer service could obtain extensive personal customer data simply by providing a customer’s name and date of birth. The particular case brought to the Data Protection Authority’s attention involved a caller requesting the new mobile phone number of an ex-partner.

The BfDI found that this authentication procedure stands in violation of Art. 32 GDPR, which sets out a company’s obligation to take appropriate technical and organisational measures to systematically protect the processing of personal data.

After the BfDI had pointed out the deficient procedure to 1&1, the company cooperated with the authorities. In a first step, the company changed the two-factor authentication procedure in its customer service department to a three-step authentication procedure. Furthermore, it is working on a new, enhanced authentication system in which each customer will receive a personal service PIN.
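The underlying weakness was a purely knowledge-based check that relied on data many third parties may know. As a minimal, purely illustrative sketch (not 1&1’s actual system; all names and fields are hypothetical), the following Python fragment shows how a customer-service lookup might refuse to disclose account data unless the caller also supplies a factor that only the account holder should possess, such as a personal service PIN:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomerRecord:
    name: str
    date_of_birth: str   # e.g. "1985-04-12"
    service_pin: str     # hypothetical personal service PIN

def caller_may_access_account(record: CustomerRecord,
                              given_name: str,
                              given_dob: str,
                              given_pin: Optional[str]) -> bool:
    """Allow access only if every verification step succeeds.

    Name and date of birth alone (the procedure the BfDI objected to)
    are deliberately insufficient; an additional factor that only the
    account holder should know is required as well.
    """
    if given_name.strip().lower() != record.name.lower():
        return False
    if given_dob != record.date_of_birth:
        return False
    # Third step: the personal service PIN (or a comparable extra factor).
    return given_pin is not None and given_pin == record.service_pin
```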

In his statement, the BfDI explained that the fine was necessary because the violation posed a risk to the personal data of all customers of 1&1. But because of the company’s cooperation with the authorities, the BfDI set the fine at the lower end of the scale.

1&1 has deemed the fine “absolutely disproportionate” and has announced that it will file a suit against the BfDI’s penalty notice.

India updates privacy bill

12. December 2019

The new update of the Indian Personal Data Protection Bill is part of India’s broader efforts to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies to provide anonymized personal data, as well as other non-personal data, in order to help deliver governmental services and frame policies. The draft defines “personal data” as information that can help identify a person, including characteristics, traits and any other features of a person’s identity. “Sensitive personal data” additionally includes financial and biometric data. According to the draft, such “sensitive” data can be transferred outside India for processing, but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and display a verification sign publicly. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could affect the way companies process, store and transfer Indian consumers’ data. Therefore, it could cause some difficulties for top technology companies.

Dutch DPA issues a statement regarding cookie consent

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has recently issued a statement regarding compliance with the rules on cookie consent. According to the statement, the DPA reviewed 175 websites and e-commerce platforms to check whether they meet the requirements for the use of cookies. It found that almost half of the websites and nearly all e-commerce platforms do not meet the requirements for cookie consent.

The data protection authority has contacted the companies concerned and requested them to adjust their cookie usage.

In its statement, the Data Protection Authority also refers to the “Planet49” case of the Court of Justice of the European Union (“CJEU”) and clarifies that pre-ticked boxes do not comply with the obligation to obtain the user’s consent. In addition, a user merely scrolling down a website is not equivalent to consenting to the use of cookies. Cookies which enable websites to track their users always require explicit consent.

Lastly, the DPA recalls that cookie walls, which prevent users who have not consented to the use of cookies from accessing the website, are not permitted.
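To illustrate how these rules translate into practice, here is a minimal, hypothetical sketch (not taken from the DPA’s statement or any specific website) of a consent check in which tracking cookies are only set after an explicit, affirmative opt-in, while a pre-ticked default or mere scrolling is never treated as consent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    # Only an explicit, affirmative action counts; a pre-ticked default
    # or mere scrolling must never be recorded as consent.
    clicked_accept: bool = False

def may_set_tracking_cookies(consent: Optional[ConsentRecord]) -> bool:
    """Tracking cookies require explicit, recorded consent."""
    return consent is not None and consent.clicked_accept

def cookies_for_page_view(consent: Optional[ConsentRecord], scrolled: bool) -> dict:
    """Decide which cookies may be set on a page view.

    Strictly necessary cookies can be set without consent; tracking
    cookies only after an explicit opt-in. The 'scrolled' flag is
    deliberately ignored, since scrolling is not consent.
    """
    return {
        "necessary": True,
        "tracking": may_set_tracking_cookies(consent),
    }
```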

Category: EU · GDPR · The Netherlands

Advocate General’s opinion on “Schrems II” is delayed

11. December 2019

The Court of Justice of the European Union (CJEU) Advocate General’s opinion in the case C-311/18 (‘Facebook Ireland and Schrems’) will be released on December 19, 2019. Originally, the CJEU announced that the opinion of the Advocate General in this case, Henrik Saugmandsgaard Øe, would be released on December 12, 2019. The CJEU did not provide a reason for this delay.

The prominent case deals with the complaint to the Irish Data Protection Commission (DPC) by privacy activist and lawyer Maximilian Schrems and the transfer of his personal data from Facebook Ireland Ltd. to Facebook Inc. in the U.S. under the European Commission’s controller-to-processor Standard Contractual Clauses (SCCs).

Perhaps the most consequential question that the High Court of Ireland has referred to the CJEU is whether transfers of personal data from the EU to the U.S. under the SCCs violate the rights of individuals under Articles 7 and/or 8 of the Charter of Fundamental Rights of the European Union (Question No. 4). The CJEU’s decision in “Schrems II” will also have ramifications for the parallel case T-738/16 (‘La Quadrature du net and others’), which poses the question whether the EU-U.S. Privacy Shield for data transfers from the EU to the U.S. sufficiently protects the rights of EU individuals. If it does not, the European Commission would face a “Safe Harbor” déjà vu, having approved the new Privacy Shield in its 2016 adequacy decision.

The CJEU is not bound by the opinion of the Advocate General (AG), but in some cases the AG’s opinion may be a weighty indicator of the CJEU’s final ruling. The final decision by the Court is expected in early 2020.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four different cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the EU-U.S. Privacy Shield framework. According to the FTC, Global Data and TDARX continued to claim participation in the EU-U.S. Privacy Shield after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to complete the annual review of whether the statements about their Privacy Shield practices were accurate. Also, according to the complaints, they did not affirm that they would continue to apply Privacy Shield protections to personal information collected during participation in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by any government or self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protection to personal information collected during participation in the program. Otherwise, they are required to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the framework was established in 2016, the FTC has initiated a total of 21 enforcement measures in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and will be open to public comment for 30 days. The FTC will then decide whether to make the proposed consent orders final.

LGPD – Brazil’s upcoming Data Protection Law

28. November 2019

In August 2018, the National Congress of Brazil passed a new General Data Protection Law (“Lei Geral de Proteção de Dados” or “LGPD”). The law is slated to come into effect in August 2020. Prior to the LGPD, data protection in Brazil was primarily governed by a varied collection of legal frameworks, including the country’s Civil Rights Framework for the Internet (Internet Act) and the Consumer Protection Code.

The new legislation creates a completely new general framework for the processing of personal data of individuals in Brazil, regardless of where the data processor is located. Brazil has also established its own Data Protection Authority (DPA) to enforce the law. Although the DPA will initially be tied to the Presidency of the Federative Republic of Brazil, it is expected to become autonomous in the long term, in about two years.

Like the GDPR, the new framework has extraterritorial application, which means that the law will apply to any individual or organization, private or public, that processes or collects personal data in Brazil, regardless of where the processor is based. The LGPD does not apply to data processing for strictly personal, academic, artistic or journalistic purposes.

Although the LGPD is largely influenced by the GDPR, the two frameworks also differ in several respects. For instance, they define personal data differently: the LGPD’s definition is broad and covers any information relating to an identified or identifiable natural person. Furthermore, the LGPD does not permit cross-border transfers based on the controller’s legitimate interest, and while the GDPR sets a 72-hour deadline for data breach notification, the LGPD defines this deadline only loosely, to name just a few differences.

Category: General · Personal Data

Austrian data protection authority imposes 18 million euro fine

22. November 2019

The Austrian Data Protection Authority (DPA) has imposed a fine of 18 million euros on Österreichische Post AG (Austrian Postal Service) for violations of the GDPR.

The company had, among other things, collected data on the “political affinity” of 2.2 million customers, and thus violated the GDPR. With this information, political parties were meant to be able to send targeted election advertising to Austrian residents.

In addition, the company collected data on the frequency of parcel deliveries and on customers’ probability of relocating, so that this information could be used for direct marketing.

The penalty is not yet final. Österreichische Post AG, half of which belongs to the Austrian state, can appeal the decision before the Federal Administrative Court. The company has already announced its intention to take legal action.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition has vast political and societal implications and risks. In its report, the CNIL explicitly stated that the software can yield strongly biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of unlawful use of the technology, the CNIL set out three main requirements in its report and recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability, and if no less intrusive method is applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on the way the software is used.

While the CNIL wishes to draw red lines for the future use of facial recognition, it has also made clear that it will fulfill its role by providing support on issues that may arise and by giving counsel on the legal and methodological aspects of using facial recognition in an experimental stage.

Category: EU · French DPA · GDPR · General