CNIL publishes action plan on targeted online advertising

3. July 2019

On 29th June, the French data protection authority CNIL published its 2019-2020 action plan, which aims to set rules for targeted online advertising and guide companies in their compliance efforts.

The Action Plan consists of two main steps. First, new cookie guidelines will be published in July 2019. The previous cookie guidance dates back to 2013; CNIL has stated that it is no longer valid and will be repealed due to the stricter consent requirements of the GDPR. In order to comply with the new cookie guidelines, companies will be given a transitional period of 12 months. During this period, it will still be possible to treat further browsing of a website as consent to the use of cookies. However, CNIL requires that even during this transition period cookies are only set after consent has been obtained.

As a second major step, working groups composed of CNIL officials and stakeholders from the adtech ecosystem will be formed to develop practical approaches to obtaining consent. The draft recommendations developed on the basis of these discussions will be published by CNIL at the end of 2019 or, at the latest, at the beginning of 2020 in order to make them available for public consultation. CNIL will then implement the final version of the recommendations after a period of six months.

The reason for preparing the Action Plan was that CNIL received numerous complaints about online marketing practices from individuals, non-profit organisations and associations. In 2018, 21% of the complaints received related to these issues. At the same time, CNIL received numerous questions from industry professionals trying to better understand their GDPR obligations.

Texas amends Data Breach Notification Law

2. July 2019

The Governor of Texas, Greg Abbott, recently signed House Bill 4390 (HB 4390), which modifies the state’s current data breach notification law and introduces an advisory council (the “Texas Privacy Protection Advisory Council”) charged with studying data privacy laws in Texas, other states and other relevant jurisdictions.

Prior to the amendment, businesses had to disclose data breaches to the affected data subjects “as quickly as possible”. The bill now sets a concrete time period for notifying individuals whose sensitive personal information was acquired by an unauthorized person: individual notice must be provided within 60 days of discovering the breach.

If more than 250 residents of Texas are affected by a data breach, the Texas Attorney General must also be notified within 60 days. Such a notification must include:
– A detailed description of the nature and circumstances of the data breach;
– The number of affected residents at that time;
– The measures taken regarding the breach and any measures the responsible person intends to take after the notification;
– Information on whether law enforcement is investigating the breach.

The amendments take effect on January 1, 2020.


Italian DPA fines Facebook

The Italian Data Protection Authority Garante (Garante per la protezione dei dati personali) fined Facebook over the Cambridge Analytica scandal of 2015, which came to light in 2018. The Cambridge Analytica scandal is connected to the presidential campaign of the current US president, Donald Trump.

The Garante has imposed a fine of EUR 1.000.000 for the improper use of data of more than 200.000 Italian Facebook users and their Facebook friends. According to the Garante, the misused data was not transferred to Cambridge Analytica, which a Facebook spokesman also confirmed. Nevertheless, the high fine was imposed.

The fine is still based on the old Italian data protection law, because the GDPR, which now applies throughout Europe, was not yet in force at the time of the misuse.

Facebook has to answer for the scandal not only in Italy; legal consequences are also looming in the USA.

 

Consumers should know how much their data is worth

27. June 2019

US Senators Mark R. Warner (Democrat) and Josh Hawley (Republican) want Facebook, Google and co. to disclose exactly how much their users’ data, measured in dollars and cents, is worth to them.

Last Sunday, the two senators announced their intention for the first time on a US talk show: every three months, each user is to receive an overview of which data has been collected and stored and how the respective provider values it. In addition, the aggregated value of all user data is to be reported annually to the US Securities and Exchange Commission. In this report, the companies are to disclose how they store, process and protect data, and how and with which partner companies they generate revenue with the data. All companies with more than 100 million users per month would be affected.

The value of user data has risen enormously in recent years; so far, companies have protected their internal calculations as trade secrets. In addition, there is no recognized method for quantifying the value of user data; it only becomes apparent when a company is sold or valued by means of an initial public offering (IPO). In the case of the WhatsApp takeover it was $55 per user; in the case of Skype it was $200.

But one can doubt the significance of these figures. A further indication can be the advertising revenues, which companies disclose per quarter. At the end of 2018, Facebook earned around $6 per user worldwide, while Amazon earned $752 per user. These figures are likely to rise in the future. “For years, social media companies have told consumers that their products are free to the user. But that’s not true – you are paying with your data instead of your wallet,” said Senator Warner. “But the overall lack of transparency and disclosure in this market have made it impossible for users to know what they’re giving up, who else their data is being shared with, or what it’s worth to the platform. […]” Experts believe it is important for consumers to know the value of their data, because only those who know the value of a good are able to appreciate it.

On Monday, Warner and Hawley plan to introduce the Designing Accounting Safeguards to Help Broaden Oversight And Regulations on Data (DASHBOARD) Act in the Senate for its first reading. It remains to be seen whether their plans will meet with the approval of the other senators.

German Data Protection Authority of Baden-Württemberg fines an employee of a public body

24. June 2019

According to an announcement of the LfDI Baden-Württemberg, which is one of the 16 German State Data Protection Authorities (DPAs), a first fine of 1,400 Euro has been imposed on an employee of a public body. The police officer unlawfully retrieved personal data in the context of his job solely for private purposes. According to the DPA’s statement, this was the first fine imposed on an employee of a public body since the EU General Data Protection Regulation (GDPR) became applicable.

The police officer used his official user ID to request the owner data relating to the license plate of a casual private acquaintance, without any reference to his official duties. Using the personal data obtained in this way, he then carried out a second enquiry with the Federal Network Agency, through which he obtained not only further personal data of the data subject, but also the data subject’s fixed-line and mobile numbers. Without any official request or consent of the data subject, the police officer finally used the mobile number obtained in this way to contact the injured party by phone.

By filing the aforementioned requests for private purposes and using the mobile number obtained in this way to make private contact, the police officer autonomously processed personal data for purposes unrelated to his official duties. Therefore, the police officer’s department cannot be held responsible, as the action was not taken in the course of his official duties. As a consequence, the police officer is responsible for this breach of the GDPR as an individual: neither is the public body responsible, as it cannot be subject to sanctions under the State Data Protection Act, nor is the police officer himself classified as a public body within the meaning of this law.

Taking into account that it was the first such violation of the data protection laws, a fine of 1,400 Euro pursuant to Article 83 paragraph 5 GDPR was considered appropriate. However, this case shows that even an employee of a public body may be subject to a fine if he or she unlawfully processes personal data for purely private purposes.


FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

The Federal Trade Commission (FTC) announced that it had taken action against several companies that falsely claimed to be compliant with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, falsely claimed on its website to participate in the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. have proposed a settlement agreement. The proposal prohibits SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government or by any self-regulatory or standardization organization. The proposed agreement will be published in the Federal Register and will be subject to public comment for 30 days. Afterwards, the FTC will decide whether to make the proposed consent order final.

The FTC has also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove from their websites, privacy policies or other public documents any statements claiming participation in a Safe Harbor agreement. If the companies fail to take action within 30 days, the FTC warned that it would take appropriate legal action.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies to participate in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. To become a certified participant, a company must have a designated third party, known as an APEC-approved Accountability Agent, verify and confirm that it meets the requirements of the CBPR program.

CNIL fines translation company for violating the French Data Protection Act

19. June 2019

The French Data Protection Authority (CNIL) recently fined UNIONTRAD COMPANY €20,000 for excessive video surveillance of employees.

UNIONTRAD COMPANY is a small French translation company with nine employees. Between 2013 and 2017, several employees complained that they were being filmed at their workstations. The CNIL alerted the company twice to the rules for installing cameras in the workplace, in particular that employees must not be filmed continuously and that they must be informed about the cameras in place.

In an audit carried out on the company’s premises in February 2018, the CNIL discovered, among other things, that the camera in the office of six translators filmed them constantly, that no sufficient information about the cameras had been provided, and that the computer workstations were not secured by a password.

In July 2018, the President of the CNIL issued a formal notice to the company, asking it inter alia to move the camera so that employees are no longer filmed constantly, to inform the employees about the cameras, and to implement appropriate security measures for access to the computer workstations.

A second audit in October 2018 showed that the company had not taken any action to remedy the violations. The CNIL therefore imposed a fine of €20,000, taking into account the size and financial situation of the company.

Spanish DPA imposes fine on Spanish football league

13. June 2019

The Spanish data protection authority Agencia Española de Protección de Datos (AEPD) has imposed a fine of 250.000 EUR on the organiser of the two Spanish professional football leagues for data protection infringements.

The organiser, the Liga Nacional de Fútbol Profesional (LFP), operates an app called “La Liga”, which aims to uncover unlicensed screenings of games broadcast on pay-TV. For this purpose, the app recorded samples of the ambient sound during match times to detect live game transmissions and combined this with the user’s location data, as privacy-ticker already reported.

The AEPD criticized that the intended purpose of the collected data had not been made sufficiently transparent, as required by Art. 5 paragraph 1 GDPR. Users must approve the use explicitly, and the authorization for microphone access can also be revoked in the Android settings. However, the AEPD is of the opinion that La Liga has to warn the user again each time data is processed via the microphone. In the resolution, the AEPD points out that the nature of mobile devices makes it impossible for users to remember, each time they use the La Liga application, what they agreed to and what they did not agree to.

Furthermore, the AEPD is of the opinion that La Liga has violated Art. 7 paragraph 3 GDPR, according to which the user must be able to withdraw his consent to the use of his personal data at any time.

La Liga rejects the sanction as unjust and will appeal against it. It argues that the AEPD has not made the necessary efforts to understand how the technology works. It explains that the technology used is designed to produce only one particular acoustic fingerprint. This fingerprint contains only 0.75% of the information; the remaining 99.25% is discarded, making it technically impossible to interpret human voices or conversations. The fingerprint is also converted into an alphanumeric code (hash) that cannot be reversed to the original sound. Nevertheless, the operators of the app have announced that they will remove the controversial feature as of June 30.
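
To illustrate the kind of irreversibility La Liga describes, here is a minimal sketch, assuming a hypothetical fingerprinting step: an audio clip is reduced to a very small, lossy feature vector, and only a one-way hash of that vector is kept, so the stored alphanumeric code cannot be turned back into sound. The extract_fingerprint helper, the band count and the sample rate below are illustrative assumptions, not La Liga’s actual implementation.

```python
import hashlib
import numpy as np

def extract_fingerprint(samples: np.ndarray, bands: int = 32) -> bytes:
    """Reduce an audio clip to a tiny, lossy band-energy signature (illustrative only)."""
    spectrum = np.abs(np.fft.rfft(samples))  # magnitude spectrum of the clip
    band_means = [band.mean() for band in np.array_split(spectrum, bands)]
    signature = np.array(band_means)
    # Quantise to 8-bit values: almost all of the original signal is discarded here,
    # so speech content cannot be recovered from the signature.
    quantised = (255 * signature / (signature.max() + 1e-9)).astype(np.uint8)
    return quantised.tobytes()

def fingerprint_code(samples: np.ndarray) -> str:
    """Keep only a one-way SHA-256 hash of the fingerprint, i.e. an alphanumeric code."""
    return hashlib.sha256(extract_fingerprint(samples)).hexdigest()

# Hypothetical five-second ambient recording at 16 kHz (random noise as a stand-in)
audio = np.random.randn(5 * 16000)
print(fingerprint_code(audio))  # 64 hex characters; not reversible to the audio
```

Whether this matches how the app actually worked is precisely what the AEPD and La Liga dispute; the sketch only shows why a heavily reduced, hashed fingerprint cannot be converted back into a recorded conversation.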

Belgian DPA imposes first fine since GDPR

11. June 2019

On 28 May 2019, the Belgian Data Protection Authority (DPA) imposed its first fine since the General Data Protection Regulation (GDPR) came into force. The Belgian DPA fined a Belgian mayor 2.000 EUR for the misuse of personal data.

The Belgian DPA had received a complaint from data subjects alleging that their personal data, collected for local administrative purposes, had been further used by the mayor for election campaign purposes. The parties were then heard by the Litigation Chamber of the Belgian DPA. Finally, the Belgian DPA ruled that the mayor’s use of the complainants’ personal data violated the purpose limitation principle of the GDPR, since the personal data had originally been collected for a different purpose and the campaign use was incompatible with that purpose.

In deciding on the amount of the fine, the Belgian DPA took into account the limited number of data subjects, the nature, gravity and duration of the infringement, resulting in a moderate sum of 2.000 EUR. Nevertheless, the decision conveys the message that compliance with the GDPR is the responsibility of each data controller, including public officials.

Germany: Data of smart home devices as evidence in court?!

According to a draft resolution for the upcoming conference of interior ministers of the 16 German federal states, data from smart home devices are to be admitted as evidence in court. The ministers of the federal states believe that the digital traces could help to solve crimes in the future, especially capital crimes and terrorist threats.

The interior ministers want to dispel constitutional concerns, because the data in question is of great interest to the security authorities. According to the draft resolution, judicial approval will be sufficient in the future. However, domestic policy politicians expect criticism and resistance from the data protection commissioners of both the federal states and the federal government.

Smart home devices are technical devices such as televisions, refrigerators or voice assistants that are connected to the Internet. Also grouped under the term Internet of Things (IoT), they can be controlled via smartphone and make daily life easier for the user. In the process, a lot of data is stored and processed.

We have already reported several times about smart home devices, including the fact that in the USA data from smart home devices have already helped to solve crimes (in German).

It cannot be denied that data from smart home devices can, under certain circumstances, help to solve crimes, but it should not be overlooked that, due to the technical design, a 100% reliable statement cannot be made. A simple example: whether the resident was actually at home at the time in question, was still on his way home, or merely wanted to give the impression of being at home while in fact being on the other side of the world, cannot be determined on the basis of smart home data alone. The ability to control lighting and heating via smartphone, for example, allows the user to do so from anywhere at any time.

In addition, it should be taken into consideration that such interventions, or even the mere possibility of intervention, may violate a person’s right to informational self-determination, and it is precisely the protection of this constitutionally guaranteed right to which data protection is committed.

Update: The 210th Conference of the interior ministers has since come to an end, and the admission of smart home data as evidence in court has been rejected. The resolutions of the conference can be found here (in German).
