Category: General

Austrian Regional Court grants an Austrian man 800€ in GDPR compensation

20. December 2019

The Austrian Regional Court, Landesgericht Feldkirch, has ruled that the major Austrian postal service Österreichische Post (ÖPAG) has to pay an Austrian man 800 Euros in compensation for violating the GDPR (LG Feldkirch, Beschl. v. 07.08.2019 – Az.: 57 Cg 30/19b – 15). It is one of the first rulings in Europe in which a civil court has granted a data subject compensation based on a GDPR violation. Parallel to this court ruling, ÖPAG is facing an 18 million Euro fine from the Austrian Data Protection Authority.

Based on people’s statements in anonymised surveys, ÖPAG had created marketing groups and used algorithms to calculate the probability that people with certain socioeconomic and regional backgrounds hold particular political affinities. ÖPAG then assigned customers to these marketing groups and thus also stored data about their calculated political affinities. Among these customers was the plaintiff in this case.

The court ruled that this combination constitutes “personal data revealing political opinions” within the meaning of Art. 9 GDPR. Since ÖPAG neither obtained the plaintiff’s consent to process his sensitive data on political opinions nor informed him about the processing itself, ÖPAG violated the plaintiff’s individual rights.

While the plaintiff had demanded 2,500 Euros in compensation from ÖPAG, the court, after weighing the circumstances of the individual case, granted him only 800 Euros as compensation for non-material damage.

The case was appealed and will be tried at the Higher Regional Court Innsbruck.

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, 19 December, the European Court of Justice’s (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) for personal data transfers to processors situated in third countries.

The case on which the opinion builds originates in proceedings initiated by Mr. Maximilian Schrems, who challenged Facebook’s business practice of transferring the personal data of its European subscribers to servers located in the United States. The case (Schrems I) led the CJEU on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook chose to rely on to legitimize its EU-U.S. data flows, using arguments similar to those raised in Schrems I. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling (the Schrems II case).

In the newly published opinion, the Advocate General considers the SCCs valid for commercial transfers, despite the possibility of public authorities in the third country processing the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of the high level of protection is guaranteed not only by an adequacy decision of the European Commission, but equally by the contractual safeguards the exporter has in place, which must match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers regardless of the third country and its level of protection. In addition, and in light of the Charter, the controller as well as the supervisory authority are obliged to suspend any third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, which only concern the SCCs under Decision 2010/87, taking the question of the validity of the Privacy Shield off the table.

While the Advocate General’s opinion is not binding, it represents a suggested legal solution for the cases before the CJEU. The CJEU’s decision on the matter is not expected until early 2020, however, leaving the outcome of the case eagerly awaited.

Facebook collects location data despite deactivation

At the request of several US senators, Facebook has admitted that it continues to collect location data even if the user has previously deactivated this feature.

Even when the feature is deactivated, location data is still collected, for example through IP address mapping or user activity. This includes, for example, users tagging their own location at a certain restaurant or special place, but also being tagged by friends in a photo that contains a location tag.

In the letter that Senator Josh Hawley published on Twitter, Facebook states that it has only the best intentions in collecting the data. According to the statement, this is the only way, for example, to place personalized ads or to inform a user when someone logs in to their account from a completely different location than usual.

While Facebook states that the location data, based e.g. on the IP address, does not indicate an exact location but only, for example, the postcode, this still means that there is no way for users to opt out of the collection of location data.


Data Leak of South African IT firm exposes over 1 Million Web Browsing Records

18. December 2019

Security researchers at vpnMentor recently discovered an unsecured and unencrypted database owned by the South African information and communications technology (ICT) company Conor. The breached database consisted of daily logs of user activity by customers of Internet Service Providers (ISPs) that used web filtering software built by Conor.

The leak exposed these users’ entire internet traffic and activity, along with their personally identifying information and highly sensitive private data. For two months it revealed activity logs such as website URLs, IP addresses, index names and MSISDN codes, which identify mobile users on a specific network. The breach included highly sensitive web browsing activity, such as attempts to visit pornography websites, social media accounts, online storage including iCloud, and messaging apps such as WhatsApp. In total, over 890 GB of data and more than 1 million records were exposed.

“Because the database gave access to a complete record of each user’s activity in a session, our team was able to view every website they visited – or attempted to visit. We could also identify each user,” the vpnMentor team explained in their statement. “For an ICT and software development company not to protect this data is incredibly negligent. Conor’s lapse in data security could create real-world problems for the people exposed.”

Such an incident could cause Conor significant reputational damage and loss of integrity. In addition, it exposed how Conor’s filter system works and ways to circumvent it, which could render the product ineffective against attempts to bypass it and thus obsolete. As a result, Conor may lose business, since clients may no longer feel they can trust the company and the values it professes.

Irish DPC updates Guidance on Data Processing’s Legal Bases

17. December 2019

The Irish Data Protection Commission (DPC) has updated its guidance on the legal bases for personal data processing. It covers data processing under the European General Data Protection Regulation (GDPR) as well as data processing requirements under the European Law Enforcement Directive.

The main aims of the updated guidance are to make companies more aware of their reasons for processing personal data and of the need to choose the right legal basis, as well as to ensure that data subjects are able to determine whether their data is being processed lawfully.

The guidance focuses on the different legal bases in Art. 6 GDPR, namely consent, contract, legal obligation, vital interests, public task and legitimate interests. The Irish DPC states that controllers not only have to choose the right legal basis, but also have to understand the obligations that come with the chosen one, which is why the DPC wanted to go into further detail.

Overall, the guidance is made to aid both controllers and data subjects. It consists of a way to support a better understanding of the terminology, as well as the legal requirements the GDPR sets out for processing personal data.

Germany: Telecommunications provider receives a 9.55 Million Euro GDPR fine

16. December 2019

The German Federal Commissioner for Data Protection and Freedom of Information (BfDI) has imposed a fine of 9.55 Million Euro on the major telecommunications services provider 1&1 Telecom GmbH (1&1). This is the second multimillion Euro fine that the Data Protection Authorities in Germany have imposed. The first fine of this magnitude (14.5 Million Euro) was imposed last month on a real estate company.

According to the BfDI, the reason for the fine was an inadequate authentication procedure in 1&1’s customer service department: any caller to 1&1’s customer service could obtain extensive personal customer data merely by providing a customer’s name and date of birth. The particular case brought to the Data Protection Authority’s attention involved a caller requesting the new mobile phone number of an ex-partner.

The BfDI found that this authentication procedure stands in violation of Art. 32 GDPR, which sets out a company’s obligation to take appropriate technical and organisational measures to systematically protect the processing of personal data.

After the BfDI had pointed out the deficient procedure to 1&1, the company cooperated with the authorities. As a first step, the company changed the two-factor authentication procedure in its customer service department to a three-step authentication procedure. Furthermore, it is working on a new, enhanced authentication system in which each customer will receive a personal service PIN.

In his statement, the BfDI explained that the fine was necessary because the violation posed a risk to the personal data of all customers of 1&1. But because of the company’s cooperation with the authorities, the BfDI set the fine at the lower end of the scale.

1&1 has deemed the fine “absolutely disproportionate” and has announced that it will file suit against the BfDI’s penalty notice.

India updates privacy bill

12. December 2019

The new update of the Indian Personal Data Protection Bill is part of India’s broader efforts to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies to provide anonymized personal data, as well as other non-personal data, in order to help deliver governmental services and formulate privacy policies. The draft defines “personal data” as information that can help identify a person and that includes characteristics, traits and any other features of a person’s identity. “Sensitive personal data” also includes financial and biometric data. According to the draft, such “sensitive” data can be transferred outside India for processing, but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and display a verification sign publicly. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could affect the way companies process, store and transfer Indian consumers’ data. Therefore, it could cause some difficulties for top technology companies.

LGPD – Brazil’s upcoming Data Protection Law

28. November 2019

In August 2018, the National Congress of Brazil passed a new General Data Protection Law (“Lei Geral de Proteção de Dados” or “LGPD”). The law is slated to come into effect in August 2020. Prior to the LGPD, data protection in Brazil was primarily enforced through a patchwork of legal frameworks, including the country’s Civil Rights Framework for the Internet (Internet Act) and the Consumer Protection Code.

The new legislation creates a completely new general framework for the use of personal data processed on individuals in Brazil, regardless of where the data processor is located. Brazil has also established its own Data Protection Authority to enforce the law. Although the Data Protection Authority will initially be tied to the Presidency of the Federative Republic of Brazil, it is to become autonomous in the long term, in about two years.

Like the GDPR, the new framework has extraterritorial application: the law will apply to any individual or organization, private or public, that processes or collects personal data in Brazil, regardless of where the processor is based. The LGPD does not apply to data processing for strictly personal, academic, artistic or journalistic purposes.

Although the LGPD is largely influenced by the GDPR, the two frameworks also differ in several respects. For instance, they define personal data differently: the LGPD’s definition is broad and covers any information relating to an identified or identifiable natural person. Furthermore, the LGPD does not permit cross-border transfers based on the controller’s legitimate interest. And while the GDPR sets the deadline for data breach notification at 72 hours, the LGPD’s deadline is only loosely defined, to name just a few differences.


CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Particularly concerned with the risks of using such technology in the public sector, the CNIL made it clear that the use of facial recognition carries vast political as well as societal implications and risks. In its report, the CNIL explicitly stated that the software can yield very biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of unlawful use of the technology, the CNIL set out three main requirements in its report. It recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and purpose, etc.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines around the future use of facial recognition, it has also made clear that it will fulfill its role by offering support on issues that may arise, giving counsel on the legal and methodological use of facial recognition in an experimental stage.


Apple wants to evaluate “Siri”-recordings again

14. October 2019

Apple wants to evaluate Siri recordings again in the future. After it became public that Apple automatically saved the audio recordings of Siri entries and had some of them evaluated by employees of external companies, the company stopped this practice. Although Apple stated that less than 0.2% of the queries were actually evaluated, the system received around 10 billion queries per month (as of 2018).

In the future, audio recordings from the Siri language assistant will be stored and evaluated again, this time, however, only after the user has consented. This procedure will be tested with the latest beta versions of Apple’s iOS software for iPhone and iPad.

Apple itself hopes that many users will agree and thus contribute to the improvement of Siri. A later opt-out is possible at any time, but only for each device individually. In addition, only Apple’s own employees, who are, according to Apple, subject to strict confidentiality obligations, will evaluate the recordings. Recordings generated by an unintentional activation of Siri will be completely deleted.

In addition, a delete function for Siri recordings is to be introduced. Users can then choose in their settings to delete all data recorded by Siri. If this deletion is requested within 24 hours of a Siri request, the respective recordings and transcripts will not be released for evaluation.

However, even if the user does not opt in to the evaluation of his Siri recordings, a computer-generated transcript will continue to be created and kept by Apple for a certain period of time. Although these transcripts are to be anonymized and linked to a random ID, according to Apple they could still be evaluated.
