Category: GDPR

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority Agencia Española de Protección de Datos (AEPD) has imposed a fine of 250.000 EUR on the organiser of the two Spanish professional football leagues for data protection infringements.

The organiser, the Liga Nacional de Fútbol Profesional (LFP), operates an app called “La Liga”, which aims to uncover unlicensed screenings of matches broadcast on pay-TV. For this purpose, the app recorded samples of the ambient sound during match times to detect live game transmissions and combined this audio with location data. Privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 paragraph 1 GDPR. Users must explicitly approve the use, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD is of the opinion that La Liga has to warn the user again before each instance of data processing via the microphone. In its resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember what they agreed to each time they used the La Liga application and what they did not agree to.

Furthermore, the AEPD is of the opinion that La Liga has violated Art. 7 paragraph 3 GDPR, according to which the user must be able to withdraw his or her consent to the use of personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary effort to understand how the technology works. According to La Liga, the technology is designed to produce only a particular acoustic fingerprint. This fingerprint contains only 0.75% of the information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that cannot be reversed to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of June 30.
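
What La Liga describes is, in effect, a standard acoustic-fingerprinting pipeline: reduce the audio to a few spectral landmarks, discard everything else, and hash the compact result. The Python sketch below illustrates the principle only; the frame size, peak count and function names are illustrative assumptions, not La Liga’s actual implementation.

import hashlib
import numpy as np

def fingerprint(samples: np.ndarray, frame_size: int = 4096, peaks_per_frame: int = 3) -> str:
    """Reduce raw audio to a few spectral peaks per frame, then hash the result."""
    peak_bins = []
    for start in range(0, len(samples) - frame_size, frame_size):
        spectrum = np.abs(np.fft.rfft(samples[start:start + frame_size]))
        # Keep only the strongest frequency bins; the rest of the signal
        # (including anything resembling speech) is discarded at this step.
        peak_bins.extend(np.argsort(spectrum)[-peaks_per_frame:].tolist())
    # One-way hash: the resulting alphanumeric code cannot be reversed to the original sound.
    return hashlib.sha256(np.array(peak_bins, dtype=np.int32).tobytes()).hexdigest()

# Example: one second of synthetic audio at 44.1 kHz.
print(fingerprint(np.random.randn(44100).astype(np.float32)))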

Belgian DPA imposes first fine since GDPR

11. June 2019

On 28 May 2019, the Belgian Data Protection Authority (DPA) imposed its first fine since the General Data Protection Regulation (GDPR) came into force. The Belgian DPA fined a Belgian mayor 2.000 EUR for the misuse of personal data.

The Belgian DPA had received a complaint from data subjects alleging that their personal data, collected for local administrative purposes, had been further used by the mayor for election campaign purposes. The parties were then heard by the Litigation Chamber of the Belgian DPA. Finally, the Belgian DPA ruled that the mayor’s use of the plaintiffs’ personal data violated the purpose limitation principle of the GDPR, since the personal data had originally been collected for a different purpose and the mayor’s further use was incompatible with that purpose.

In deciding on the amount of the fine, the Belgian DPA took into account the limited number of data subjects as well as the nature, gravity and duration of the infringement, resulting in a moderate sum of 2.000 EUR. Nevertheless, the decision conveys the message that compliance with the GDPR is the responsibility of every data controller, including public officials.

CNIL fines French real estate company for violating the GDPR

7. June 2019

The French Data Protection Authority “Commission Nationale de l’Informatique et des Libertés” (CNIL) has imposed a fine of 400.000 euros on the French real estate company “Sergic” for violating the GDPR.
Sergic specializes in real estate development, purchase, sale, rental and property management and operates the website www.sergic.com, which allows rental candidates to upload the documents necessary for preparing their file.

In August 2018, a Sergic user contacted the CNIL, reporting that from his personal space on the website he could access other users’ uploaded files, unencrypted, simply by slightly changing the URL. On September 7, 2018, an online check confirmed that documents uploaded by rental candidates were freely accessible to others without prior authentication. Among the documents were copies of identity cards, health cards, tax notices and divorce judgements. The CNIL informed Sergic of this security incident and personal data breach on the same day. It became apparent that Sergic had been aware of the vulnerability since March 2018 and, even though it had initiated IT developments to correct it, the final fix was not deployed until September 17, 2018.

Based on the investigation, the responsible CNIL body found two violations of the GDPR. Firstly, Sergic had failed to fulfil its obligations under Art. 32 GDPR, which obliges controllers to implement appropriate technical and organizational measures to ensure an appropriate level of security for personal data. This includes, for example, a procedure ensuring that personal documents cannot be accessed without prior authentication of the user. The CNIL also took into account the length of time the company took to correct the error.
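
The measure the CNIL describes boils down to verifying both authentication and ownership before serving a document, so that tampering with the URL cannot expose someone else’s file. The following is a minimal sketch of such a check; the framework, route and ownership lookup are illustrative assumptions, not Sergic’s actual stack.

from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # required for session support

# Toy ownership index; a real application would query its database.
DOCUMENT_OWNERS = {"doc-1001": "user-42", "doc-1002": "user-7"}

@app.route("/documents/<doc_id>")
def get_document(doc_id: str):
    user_id = session.get("user_id")
    if user_id is None:
        abort(401)  # not logged in: no access without prior authentication
    if DOCUMENT_OWNERS.get(doc_id) != user_id:
        abort(403)  # logged in but not the owner: URL tampering is rejected
    return send_file(f"uploads/{doc_id}.pdf")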

Secondly, the CNIL found that Sergic kept all the documents submitted by candidates in an active database, even for candidates who had not obtained rental accommodation, and thus for longer than the time required to allocate housing. According to the GDPR, the controller is obliged to delete data without undue delay once they are no longer necessary in relation to the purposes for which they were collected or otherwise processed and no other purpose justifies keeping them in an active database.

The CNIL imposed a fine of € 400.000 and decided to make its sanction public due, inter alia, to the seriousness of the breach, the company’s lack of due diligence and the fact that the documents revealed intimate aspects of people’s lives.

Category: Data Breach · French DPA · GDPR

Royal family uses GDPR to protect their privacy

22. May 2019

Last week, Prince Harry and Meghan Markle claimed another victory in the royal family’s never-ending struggle with paparazzi photographers, securing “a substantial sum” in damages from an agency that had released to the media intimate photos of the Oxfordshire home rented by the Duke and Duchess of Sussex. In a statement, Splash News apologized and acknowledged that the situation represented “an error of judgement”.

The paparazzi agency “Splash News” had taken photos and footage of the couple’s former Cotswolds home — including their living room, dining area, and bedroom — using a helicopter, and promptly sold them to various news outlets. Prince Harry’s lawyers argued that this constituted a breach of his right to privacy under Art. 7 and 8 ECHR as well as a breach of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).

Considering the strategy of the Duke’s lawyers, it looks like the royal family has found a potentially attractive alternative to claims of defamation or invasion of privacy: in contrast to such claims, a claimant relying on data protection law needs to prove neither that a statement is defamatory and meets the threshold for serious harm to reputation, nor that the information is private.

However, the (new) European data protection legislation grants all data subjects, regardless of their position and/or fame, a right to respect for their private and family lives and to the protection of their personal data. In particular, Article 5 GDPR requires organisations to handle personal data (such as names, pictures and stories relating to individuals) fairly and in a transparent manner, and to use it only for a legitimate purpose.

Moreover, when obtaining pictures and footage of an individual’s private or even intimate sphere, the organisation using such material needs a specific legal basis, such as a contract, the individual’s consent, or the ability to argue that the use was “in the public interest” or served a “legitimate interest”. As a contract and consent can be ruled out here, the only bases that might be considered are a public interest or a legitimate interest of the organisation itself. Taking into account the means by which these photos and footage of the Duke and Duchess were obtained, neither of these interests can outweigh the interest in protecting the rights and freedoms of individuals’ private and intimate spheres.

In light of this case, it seems likely that the European data protection regime has changed the way celebrities and the courts deal with the heavily contested question of which parts and aspects of famous people’s lives the public is allowed to see and be informed about.

Public availability of house images on Google Street View raises legal concerns

21. May 2019

In recent years, the science of data analytics has dramatically improved our ability to analyse raw data and draw conclusions from it. Data analytics techniques can reveal trends and patterns that can be used to optimize processes and increase the overall efficiency of a business or system. However, there is an obvious tension between the widespread use of big data and its security and privacy.
Google Street View is a quite popular Google service used by millions of people every day to plan trips, explore touristic destinations and more.
In 2017, two university researchers, Łukasz Kidziński of Stanford University in California and Kinga Kita-Wojciechowska of the University of Warsaw in Poland, used Street View images of people’s houses to determine how likely the residents are to be involved in a car accident.
The researchers worked with an undisclosed insurance company and analysed 20.000 random addresses of clients who had taken out car insurance with that company. They collected information from the insurance company’s database, such as age, sex, zip code and claim history, and linked that information with Street View images of each policyholder’s residence. It turned out that a policyholder’s residence is a surprisingly good predictor of the likelihood that he or she will be involved in a car accident.
Subsequently, the researchers fed those image-derived variables into a data analytics model, improving its predictive power by 2%. They also noted that the model’s accuracy could be further improved with larger data sets.
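
On synthetic data, the study’s approach can be pictured as comparing a risk model trained on classic policyholder features against one that adds an image-derived house feature. Everything below is a simulated assumption for illustration only, not the researchers’ actual dataset or algorithm.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000  # mirrors the 20.000 addresses analysed in the study

# Classic insurance features: stand-ins for age, sex, claim history.
X_base = rng.normal(size=(n, 3))
# A single feature extracted from the Street View image of the home
# (e.g. a house-condition score), weakly linked to risk by construction.
x_house = rng.normal(size=(n, 1))
logits = X_base @ np.array([0.8, -0.3, 0.5]) + 0.25 * x_house[:, 0]
y = (logits + rng.logistic(size=n)) > 0  # simulated accident outcomes

for name, X in [("base features", X_base),
                ("base + house feature", np.hstack([X_base, x_house]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    proba = LogisticRegression().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"{name}: AUC = {roc_auc_score(y_te, proba):.3f}")
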
Insurance companies rely on data to predict risk, and from this perspective the results of the research are impressive, but they are also disturbing. This new use of the technology is an important step towards improving risk prediction models. However, bearing in mind the results of the research, some interesting questions regarding data protection come up: Did the policyholders give their consent to this activity? Could the insurance company use individuals’ data this way given Europe’s strict privacy legislation? “The consent given by the clients to the company to store their addresses does not necessarily mean a consent to store information about the appearance of their houses,” said Kidziński and Kita-Wojciechowska.
Studies such as these raise data protection questions about the power of data analysis and about how information is collected and shared.

Twitter shared location data on iOS devices

15. May 2019

Twitter recently published a statement admitting that its app shared location data on iOS devices even if the user had not turned on the “precise location” feature.

The problem occurred when a user used more than one Twitter account on the same iOS device. If he or she had opted into the “precise location” feature for one account, it was also turned on when using another account, even if the user had not opted in for that account. The real-time location information was then passed on to trusted partners of Twitter. However, through technical measures, only the postcode or an area of five square kilometres was passed on to the partners. Twitter handles or other “unique account IDs” that could reveal the identity of the user were allegedly not transmitted.
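
The coarsening Twitter describes can be pictured as snapping a precise GPS fix to a grid cell of roughly five square kilometres before anything is shared. The grid size and rounding scheme below are illustrative assumptions, not Twitter’s actual method.

def coarsen_location(lat: float, lon: float, cell_deg: float = 0.02) -> tuple[float, float]:
    """Snap coordinates to a coarse grid before sharing.

    0.02 degrees of latitude is about 2.2 km, so one cell covers on the order
    of five square kilometres (longitude cells shrink away from the equator).
    """
    return (round(lat / cell_deg) * cell_deg,
            round(lon / cell_deg) * cell_deg)

# Example: a precise fix in central London collapses to its cell's corner.
print(coarsen_location(51.50735, -0.12776))  # approximately (51.5, -0.12)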

According to Twitter’s statement, they have fixed the problem and informed the affected users: “We’re very sorry this happened. We recognize and appreciate the trust you place in us and are committed to earning that trust every day”.

Morrisons is Allowed to Appeal Data Protection Class Action

29. April 2019

The British supermarket chain WM Morrison Supermarkets PLC (“Morrisons”) has been granted permission by the Supreme Court to appeal the data protection class action brought against it and to challenge the judgment on all grounds. The case is important as it is the first class action filed in the UK over a data breach, and its outcome may affect the number of class actions for data breaches.

An employee who worked as a senior IT auditor for Morrisons copied the payroll data of almost 100,000 employees onto a USB stick and published it on a file-sharing website. He then reported the breach anonymously to three newspapers. The employee himself was sentenced to eight years in prison for various offences.

5,518 employees filed a class action lawsuit against Morrisons over the breach, claiming both primary and vicarious liability on the part of the company. The High Court dismissed all primary liability claims under the Data Protection Act (“DPA”), as it concluded that the employee had acted independently of Morrisons in violating the DPA.

However, the court found that Morrisons was vicariously liable for its employee’s actions, although the DPA does not explicitly provide for vicarious liability. The company appealed the decision.

The Court of Appeal dismissed the appeal and upheld the High Court’s ruling that the company is vicariously liable for its employee’s data breach, even though the company itself was cleared of any misconduct.

In the forthcoming appeal, the Supreme Court will have to examine, among other things, whether vicarious liability exists under the DPA and whether the Court of Appeal’s conclusion that the employee disclosed the data in the course of his employment was incorrect.

EDPS investigates into contractual agreements between EU institutions and Microsoft

10. April 2019

The European Data Protection Supervisor (EDPS) is the supervisory authority for all EU institutions and is therefore responsible for their compliance with data protection laws. It is currently investigating the compliance of contractual agreements between EU institutions and Microsoft, as the various institutions use Microsoft products and services to conduct their day-to-day business, including the processing of huge amounts of personal data.

The EDPS refers to a Data Protection Impact Assessment carried out last November by the Dutch Ministry of Justice and Security (we reported), which concluded that Microsoft collects and stores personal data of Office users on a large scale without informing them.

Wojciech Wiewiórowski, Assistant EDPS, said: “New data protection rules for the EU institutions and bodies came into force on 11 December 2018. Regulation 2018/1725 introduced significant changes to the rules governing outsourcing. Contractors now have direct responsibilities when it comes to ensuring compliance. However, when relying on third parties to provide services, the EU institutions remain accountable for any data processing carried out on their behalf. They also have a duty to ensure that any contractual arrangements respect the new rules and to identify and mitigate any risks. It is with this in mind that the contractual relationship between the EU institutions and Microsoft is now under EDPS scrutiny.”

The investigation should reveal which products and systems are currently in use and whether the existing contractual agreements are compliant with current data protection laws, especially the GDPR.

Category: EU · GDPR · General

CNIL publishes model regulation on access control through biometric authentication at the workplace

9. April 2019

The French data protection authority CNIL has published a model regulation specifying the conditions under which devices for access control through biometric authentication may be introduced at the workplace.

Pursuant to Article 4 paragraph 14 of the General Data Protection Regulation (GDPR), biometric data are personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person. According to Article 9 paragraph 4 GDPR, the member states of the European Union may introduce or maintain additional conditions, including restrictions, with regard to the processing of biometric data.

The basic requirement under the model regulation is that the controller proves that biometric data processing is necessary. To this end, the controller must explain why the use of other means of identification or other organisational and technical safeguards would not be appropriate to achieve the required level of security.

Moreover, the choice of biometric types must be specifically explained and documented by the employer. This also includes the justification for the choice of one biometric feature over another. Processing must be carried out for the purpose of controlling access to premises classified by the company as restricted or of controlling access to computer devices and applications.

Furthermore, the CNIL’s model regulation describes which types of personal data may be collected, which storage periods and conditions apply, and which specific technical and organisational measures must be taken to guarantee the security of personal data. In addition, the CNIL states that, before implementing the data processing, the controller must always carry out an impact assessment of the risks to the rights and freedoms of individuals. This assessment must be repeated every three years to keep it up to date.

The data protection authority also points out that the model regulation does not exempt controllers from compliance with the GDPR, since it is not intended to replace the GDPR’s provisions but to supplement and specify them.

German Court’s Decision on the Right of Access

Just recently, a German Labour Court (LAG Baden-Württemberg) decided on the scope of Article 15 of the European General Data Protection Regulation (GDPR) with regard to the information that must be provided to the data subject when such a claim is made.

The decision literally reflects the wording of Art. 15 (1) GDPR which, amongst other things, requires information on the following (see the sketch after the list):

  • the purposes of data processing,
  • the categories of personal data concerned,
  • the recipients or categories of recipient to whom the personal data have been or will be disclosed,
  • where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period,
  • where the personal data are not collected from the data subject, any available information as to their source.
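
Read as a checklist, Art. 15 (1) GDPR maps naturally onto a structured record a controller could assemble when answering an access request. The field names and example values below are illustrative assumptions, not a statutory or court-mandated schema.

from dataclasses import dataclass, field

@dataclass
class AccessRequestResponse:
    purposes: list[str]        # purposes of the data processing
    data_categories: list[str] # categories of personal data concerned
    recipients: list[str]      # per the LAG: individual recipients, not just categories
    storage_period: str        # envisaged period, or the criteria used to determine it
    sources: list[str] = field(default_factory=list)  # if data not collected from the subject

# Hypothetical example response.
response = AccessRequestResponse(
    purposes=["payroll processing"],
    data_categories=["performance data", "contact data"],
    recipients=["external payroll provider (hypothetical)"],
    storage_period="duration of employment plus statutory retention periods",
)
print(response)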

In contrast to the previous view of the local data protection authorities, which – in the context of information about recipients of personal data – deem it sufficient that the data controller discloses categories of recipients, the LAG Baden-Württemberg also obliged the data controller to provide the data subject with information about each individual recipient.

In addition, the LAG Baden-Württemberg ordered the data controller to provide the data subject with a copy of all his personal performance data. However, the court did not comment on the extent of the copies to be made. It is therefore questionable whether, in addition to information from the systems used in the company, copies of all e-mails containing personal data of the data subject must also be made available.

Since the court has allowed an appeal to the Federal Labour Court (BAG) on this issue, it remains to be seen whether this approach will stand after the Federal Labour Court’s decision.
