Category: GDPR

German Data Protection Authority of Baden-Württemberg fines an employee of a public body

24. June 2019

According to an announcement of the LfDI Baden-Württemberg, one of the 16 German State Data Protection Authorities (DPAs), a first fine of 1,400 euros has been imposed on an employee of a public body. The police officer unlawfully retrieved personal data in the context of his job solely for private purposes. According to the DPA’s statement, this was the first fine imposed on an employee of a public body since the EU General Data Protection Regulation (GDPR) became applicable.

The police officer used his official user ID to request the owner data relating to the license plate of a casual private acquaintance, without any reference to his official duties. Using the personal data obtained in this way, he then carried out a second enquiry with the Federal Network Agency, requesting not only the personal data of the data subject but also the fixed-line and mobile numbers. Without any official request or consent from the data subject, the police officer finally used the mobile number obtained in this way to contact the injured party by phone.

By filing the aforementioned requests for private purposes and using the mobile number obtained in this way to make private contact, the police officer processed personal data autonomously and for non-official purposes. His department therefore cannot be held responsible, as the action was not taken in the course of his official duties. As a consequence, the police officer is responsible for this breach of the GDPR as an individual: the public body itself cannot be subject to sanctions under the State Data Protection Act, and the police officer does not qualify as a public body within the meaning of that law.

Taking into account that this was the first such violation of data protection law, a fine of 1,400 euros pursuant to Article 83 (5) GDPR was considered appropriate. The case shows, however, that even an employee of a public body may become subject to a fine if that person unlawfully processes personal data for purely private purposes.

Category: GDPR · General

FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

The Federal Trade Commission (FTC) announced that it had taken action against several companies that pretended to be compliant with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, falsely claimed on its website to participate in the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. have proposed a settlement agreement. The proposal prohibits SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government, self-regulatory or standardization organization. The proposed agreement will be published in the Federal Register and subject to public comment for 30 days; afterwards, the FTC will determine whether to make the proposed consent order final.

The FTC has also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove from their websites, privacy policies and other public documents any statements claiming participation in a safe harbor agreement, and warned that it would take appropriate legal action if they failed to act within 30 days.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies to participate in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. To become a certified participant, a company must be verified by a designated third party, known as an APEC-approved Accountability Agent, as meeting the requirements of the CBPR program.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority Agencia Española de Protección de Datos (AEPD) has imposed a fine of 250.000 EUR on the organisers of the two Spanish professional football leagues for data protection infringements.

The organiser, the Liga Nacional de Fútbol Profesional (LFP), operates an app called “La Liga”, which aims to uncover unlicensed screenings of matches broadcast on pay-TV. For this purpose, the app recorded a sample of the ambient sound during match times to detect live game transmissions and combined this with the user’s location data. Privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 (1) GDPR. Users must explicitly approve the use, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD takes the view that La Liga must warn the user before each data processing operation involving the microphone. In its resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember, each time they use the La Liga application, what they did and did not agree to.

Furthermore, the AEPD holds that La Liga violated Art. 7 (3) GDPR, according to which users must be able to withdraw their consent to the use of their personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it, arguing that the AEPD has not made the necessary effort to understand how the technology works. According to La Liga, the technology generates only a particular acoustic fingerprint containing just 0.75% of the recorded information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that cannot be reversed to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of June 30.
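The irreversibility argument can be illustrated with a minimal sketch (all names and numbers here are invented; La Liga has not published its actual algorithm): a coarse feature vector is extracted from the audio, almost all of the signal is discarded, and the result is passed through a one-way hash.

```python
import hashlib

def acoustic_fingerprint(samples, buckets=8):
    """Reduce an audio window to a coarse energy ranking, then hash it.

    Only the relative ordering of a handful of energy buckets survives,
    so neither voices nor conversations can be reconstructed, and the
    SHA-256 step makes the resulting code irreversible by design.
    """
    n = max(1, len(samples) // buckets)
    # Per-bucket energy: discards almost all of the raw signal.
    energies = [sum(s * s for s in samples[i * n:(i + 1) * n])
                for i in range(buckets)]
    # Keep only the rank ordering so small amplitude changes don't matter.
    ranked = sorted(range(buckets), key=lambda i: energies[i])
    # One-way hash: the original sound cannot be recovered from this code.
    return hashlib.sha256(bytes(ranked)).hexdigest()
```

Matching a broadcast would then reduce to comparing such codes against reference fingerprints taken from the licensed feed, rather than to analysing the recorded sound itself.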

Belgian DPA imposes first fine since GDPR

11. June 2019

On 28 May 2019, the Belgian Data Protection Authority (DPA) imposed its first fine since the General Data Protection Regulation (GDPR) came into force: the Belgian DPA fined a Belgian mayor 2.000 EUR for the misuse of personal data.

The Belgian DPA received a complaint from data subjects alleging that their personal data, collected for local administrative purposes, had been further used by the mayor for election campaign purposes. The parties were then heard by the Litigation Chamber of the Belgian DPA. Finally, the Belgian DPA ruled that the mayor’s use of the complainants’ personal data violated the purpose limitation principle of the GDPR, since the data had originally been collected for a different purpose that was incompatible with the purpose for which the mayor used it.

In deciding on the amount of the fine, the Belgian DPA took into account the limited number of data subjects, the nature, gravity and duration of the infringement, resulting in a moderate sum of 2.000 EUR. Nevertheless, the decision conveys the message that compliance with the GDPR is the responsibility of each data controller, including public officials.

CNIL fines French real estate company for violating the GDPR

7. June 2019

The French Data Protection Authority “Commission Nationale de l’Informatique et des Libertés” (CNIL) issued a 400.000 EUR fine against the French real estate company “Sergic” for violating the GDPR.
Sergic specializes in real estate development, purchase, sale, rental and property management, and operates the website www.sergic.com, which allows rental candidates to upload the documents necessary for preparing their file.

In August 2018, a Sergic user contacted the CNIL, reporting that from his personal space on the website he had unencrypted access to other users’ uploaded files simply by slightly changing the URL. On September 7, 2018, an online check revealed that rental candidates’ uploaded documents were indeed freely accessible to others without prior authentication. Among the documents were copies of identity cards, health cards, tax notices and divorce judgements. The CNIL informed Sergic of this security incident and personal data breach on the same day. It emerged that Sergic had been aware of the flaw since March 2018 and, even though it had initiated IT developments to correct it, the final fix was not deployed until September 17, 2018.

Based on the investigation, the responsible CNIL body found two violations of the GDPR. Firstly, Sergic had failed to fulfil its obligations under Art. 32 GDPR, which obliges controllers to implement appropriate technical and organizational measures to ensure an appropriate level of security for personal data. This includes, for example, a procedure ensuring that personal documents cannot be accessed without prior authentication of the user. The CNIL also took into account the time the company took to correct the error.
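The kind of check the CNIL is referring to can be sketched in a few lines (a simplified illustration, not Sergic’s actual code; the store and names are invented): a document is served only after verifying that the caller is logged in and actually owns it, so that guessing another identifier in the URL exposes nothing.

```python
# Hypothetical document store: maps each document id to its owning user.
DOCUMENT_OWNERS = {"doc-101": "alice", "doc-102": "bob"}

def fetch_document(doc_id, authenticated_user):
    """Serve a document only after an authentication and ownership check."""
    if authenticated_user is None:
        raise PermissionError("login required")      # no anonymous access
    owner = DOCUMENT_OWNERS.get(doc_id)
    if owner is None:
        raise LookupError("unknown document")
    if owner != authenticated_user:
        # A manipulated URL must not expose another candidate's file.
        raise PermissionError("not the document owner")
    return f"contents of {doc_id}"
```

Without the ownership comparison, any visitor could enumerate document identifiers, which is essentially the flaw reported to the CNIL.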

Secondly, the CNIL found that Sergic kept all the documents sent by candidates in an active database, even though the candidates had not obtained rental accommodation and the time required to allocate housing had long passed. Under the GDPR, the controller must delete data without delay once it is no longer necessary for the purposes for which it was collected or otherwise processed and no other purpose justifies its storage in an active database.

The CNIL imposed a fine of € 400.000 and decided to make its sanction public due to inter alia the seriousness of the breach, the lack of due diligence by the company and the fact that the documents revealed intimate aspects of people’s lives.

Category: Data breach · French DPA · GDPR

Royal family uses GDPR to protect their privacy

22. May 2019

Last week Prince Harry and Meghan Markle claimed another victory in the royal family’s never-ending struggle with paparazzi photographers, securing “a substantial sum” in damages from an agency that released to the media intimate photos of the Oxfordshire home the Duke and Duchess of Sussex rented. In a statement, Splash News apologized and acknowledged that the situation represented “an error of judgement”.

The paparazzi agency “Splash News” took photos and footage of the couple’s former Cotswolds home — including their living room, dining area, and bedroom — using a helicopter, and promptly sold the material to various news outlets. Prince Harry’s lawyers argued that this constituted a breach of his right to privacy under Art. 7 and 8 ECHR as well as a breach of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).

Considering the strategy of the Duke’s lawyers, it looks as if the royal family has found a potentially attractive alternative to claims of defamation or invasion of privacy. In contrast to such claims, a claimant relying on data protection law needs to prove neither that a statement is defamatory and meets the threshold of serious harm to reputation, nor that the information is private.

However, the (new) European data protection legislation grants all data subjects, regardless of their position and/or fame, a right to respect for their private and family lives and to protection of their personal data. In particular, Article 5 GDPR requires organisations to handle personal data (such as names, pictures and stories relating to a person) fairly and in a transparent manner, and to use it only for legitimate purposes.

Moreover, when obtaining pictures and footage of an individual’s private or even intimate sphere, the organization using such material needs a specific legal basis, such as a contract, the individual’s consent, or an argument that the use is “in the public interest” or serves a “legitimate interest”. As a contract and consent can be ruled out here, the only bases that might be considered are a public interest or a legitimate interest of the organization itself. Taking into account the means and manner by which these photos and footage of the Duke and Duchess were created, neither of these interests can outweigh the interest in protecting the rights and freedoms of individuals’ private and intimate spheres.

In light of this case, it seems quite likely that the European data protection regime has changed the way celebrities and the courts handle the heavily contested question of which parts and aspects of famous people’s lives the public is allowed to see and be informed about.

Public availability of house images on Google Street View raises legal concerns

21. May 2019

In recent years, the science of data analytics has dramatically improved the ability to analyse raw data and draw conclusions from it. Data analytics techniques can reveal trends and patterns that can be used to optimize processes and increase the overall efficiency of a business or system. However, there is an obvious tension between the security and privacy of big data and its widespread use.

Google Street View is a popular Google service used by millions of people every day to plan trips, explore tourist destinations and more.

In 2017, two university researchers, Łukasz Kidziński of Stanford University in California and Kinga Kita-Wojciechowska of the University of Warsaw in Poland, used Street View images of people’s houses to determine how likely the residents are to be involved in a car accident.

The researchers worked with an undisclosed insurance company and analysed 20.000 random addresses of clients who had taken out car insurance. They collected information from the insurance company’s database, such as age, sex, zip code and claim history, and linked it with Street View images of the policyholders’ residences. It turned out that a policyholder’s residence is a surprisingly good predictor of the likelihood that he or she will be involved in a car accident.

Subsequently, the researchers fed those results into a data analytics algorithm, which improved its predictive power by 2%. They also noted that the accuracy of the algorithm could be further improved using larger data sets.

Insurance companies rely on data to predict risk, and from this perspective the results of the research are impressive, but they are also disturbing. This new use of the technology is an important step towards improving risk prediction models. However, bearing the results in mind, some interesting questions regarding data protection come up: Did the policyholders give their consent to this activity? Could the insurance company use individuals’ data this way given Europe’s strict privacy legislation? “The consent given by the clients to the company to store their addresses does not necessarily mean a consent to store information about the appearance of their houses,” said Kidziński and Kita-Wojciechowska.

Studies such as these raise data protection questions about the power of data analysis and how the information is collected and shared.
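The linkage step the researchers describe, joining insurer records with a feature derived from the street-level image of each address, can be sketched as follows; all records and feature names here are invented for illustration:

```python
# Invented insurer records and image-derived features, keyed by address.
policyholders = [
    {"address": "1 Oak St", "age": 34, "claims": 1},
    {"address": "2 Elm St", "age": 51, "claims": 0},
]
image_features = {
    "1 Oak St": {"house_condition": 0.4},
    "2 Elm St": {"house_condition": 0.9},
}

def link_records(policyholders, image_features):
    """Join each policyholder record with the features of its address."""
    linked = []
    for record in policyholders:
        features = image_features.get(record["address"], {})
        linked.append({**record, **features})
    return linked
```

A risk model trained on the linked table then sees the image-derived column, which is precisely where the consent question arises: the clients agreed to the storage of their addresses, not of information about the appearance of their houses.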

Twitter shared location data on iOS devices

15. May 2019

Twitter recently published a statement admitting that the app shared location data on iOS devices even if the user had not turned on the “precise location” feature.

The problem appeared when a user used more than one Twitter account on the same iOS device. If he or she had opted into the “precise location” feature for one account, it was also turned on for the other accounts, even if the user had not opted in on those. The real-time location information was then passed on to trusted partners of Twitter. However, through technical measures, only the postcode or an area of five square kilometres was shared with the partners. Twitter account names and other “Unique Account IDs” that would reveal the identity of the user were allegedly not transmitted.
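Twitter has not published how this coarsening works, but the effect it describes can be illustrated with a minimal sketch (the grid approach and cell size are assumptions, purely for illustration): a precise coordinate is snapped to a coarse grid cell, so nearby users become indistinguishable.

```python
def coarsen_location(lat, lon, cell_km=5.0):
    """Snap a precise coordinate to the corner of a coarse grid cell.

    One degree of latitude is roughly 111 km, so the cell size in
    degrees is cell_km / 111. All points inside the same cell map to
    the same coarse coordinate, so the precise position is never shared.
    """
    cell_deg = cell_km / 111.0
    return (round(lat / cell_deg) * cell_deg,
            round(lon / cell_deg) * cell_deg)
```

For example, two users a street apart would report the same coarse coordinate, while users in different districts would not.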

According to Twitter’s statement, they have fixed the problem and informed the affected users: “We’re very sorry this happened. We recognize and appreciate the trust you place in us and are committed to earning that trust every day”.

Morrisons is Allowed to Appeal Data Protection Class Action

29. April 2019

The British supermarket chain WM Morrison Supermarkets PLC (“Morrisons”) has been granted permission by the Supreme Court to appeal the data protection class action brought against it and to challenge the judgment on all grounds. The case is important as it is the first class action filed in the UK over a data breach, and its outcome may affect the number of such actions.

An employee who worked as a senior IT auditor for Morrisons copied the payroll data of almost 100,000 employees onto a USB stick and published it on a file-sharing website. He then reported the breach anonymously to three newspapers. The employee himself was sentenced to eight years in prison for various offences.

5,518 employees filed a class action lawsuit against Morrisons over the breach, claiming both primary and vicarious liability on the part of the company. The High Court dismissed all primary liability claims under the Data Protection Act (“DPA”), as it concluded that the employee had acted independently of Morrisons in violating the DPA.

However, the court found Morrisons vicariously liable for its employee’s actions, although the DPA does not explicitly provide for vicarious liability. The company appealed the decision.

The Court of Appeal dismissed the appeal and upheld the ruling that the company is vicariously liable for its employee’s data breach, even though the company itself was cleared of any misconduct.

In the forthcoming appeal, the Supreme Court will have to examine, among other things, whether vicarious liability exists under the DPA and whether the Court of Appeal was wrong to conclude that the employee disclosed the data in the course of his employment.

EDPS investigates into contractual agreements between EU institutions and Microsoft

10. April 2019

The European Data Protection Supervisor (EDPS) is the supervisory authority for all EU institutions and therefore responsible for their compliance with data protection law. It is currently investigating the compliance of contractual agreements between EU institutions and Microsoft, as the various institutions use Microsoft products and services to conduct their day-to-day business, including the processing of huge amounts of personal data.

The EDPS refers to a Data Protection Impact Assessment carried out last November by the Dutch Ministry of Justice and Security (we reported), which concluded that Microsoft collects and stores personal data of Office users on a large scale without informing them.

Wojciech Wiewiórowski, Assistant EDPS, said: “New data protection rules for the EU institutions and bodies came into force on 11 December 2018. Regulation 2018/1725 introduced significant changes to the rules governing outsourcing. Contractors now have direct responsibilities when it comes to ensuring compliance. However, when relying on third parties to provide services, the EU institutions remain accountable for any data processing carried out on their behalf. They also have a duty to ensure that any contractual arrangements respect the new rules and to identify and mitigate any risks. It is with this in mind that the contractual relationship between the EU institutions and Microsoft is now under EDPS scrutiny.”

The investigation is intended to reveal which products and systems are currently in use and whether the existing contractual agreements comply with current data protection law, especially the GDPR.

Category: EU · GDPR · General