Category: Privacy policy

California voters approve new privacy legislation CPRA

20 November 2020

On 3 November 2020, Californian citizens voted on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot (we reported). As polls leading up to the vote had suggested, California voters approved the new privacy legislation, also known as “Prop 24”, with 56.2% voting Yes and 43.8% voting No. Most provisions of the CPRA will enter into force on 1 January 2021 and will become applicable to businesses on 1 January 2023. By and large, it will only apply to information collected from 1 January 2022.

The CPRA will considerably complement and expand the privacy rights of California citizens. Among other things, the amendments include:

  • Broadening the term “sale” of personal information to “sale or share” of personal information,
  • Adding new requirements to qualify as a “service provider” and newly defining the term “contractor”,
  • Defining the term “consent”,
  • Introducing the category of “sensitive information”, including a consumer right to limit the use of such information,
  • Introducing the concept of “profiling” and granting consumers the right to opt out of the use of their personal information for automated decision-making,
  • Granting consumers the right to correct inaccurate information,
  • Granting consumers the right to data portability, and
  • Establishing the California Privacy Protection Agency (CPPA) with a broad scope of responsibilities and enforcement powers.

Ensuring compliance with the CPRA will require proper preparation. Affected businesses will have to review existing processes or implement new ones in order to guarantee the newly added consumer rights, meet the contractual requirements for service providers and contractors, and demonstrate compliance with the new legislation as a whole.

In an interview after the passage of the CPRA, Alastair Mactaggart, the initiator of the CCPA and the CPRA, commented that

Privacy legislation is here to stay.

He hopes that California’s privacy legislation will be a model for other states, or even the U.S. Congress, to follow, so that consumers in other parts of the country are offered the same privacy rights that Californians now enjoy.

H&M receives record-breaking €35 million GDPR fine in Germany

21 October 2020

At the beginning of October, the Hamburg Data Protection Commissioner (“HmbBfDI”) imposed a record-breaking €35,258,707.95 GDPR fine on the German branch of the Swedish clothing retail giant H&M. It is the highest fine for a GDPR violation that a German data protection authority has ever issued.

Since 2014, the management of the H&M service centre in Nuremberg had extensively monitored the private lives of its employees in various ways. After employees returned from holidays or sick leave, team leaders conducted so-called “Welcome Back Talks”, in which they recorded the employees’ holiday experiences, symptoms of illness, and medical diagnoses. Some H&M supervisors compiled a broad database on their employees’ private lives, recording details about family issues and religious beliefs gleaned from one-on-one talks and even corridor conversations. The records were highly detailed, were updated over time, and in some cases were shared with up to 50 other managers throughout the company. Supervisors also used this personal data to create profiles of their employees and to base future employment decisions and measures on this information. The clandestine data collection only became known as a result of a configuration error in 2019, when the notes were accessible company-wide for a few hours.

After the discovery, H&M executives presented the HmbBfDI with a comprehensive concept for improving data protection at the Nuremberg service centre. It includes the appointment of a new data protection coordinator, monthly data protection status updates, stronger communication of whistleblower protection, and a consistent process for granting data subject rights. Furthermore, H&M apologised to its employees and paid the affected people considerable compensation.

With its secret monitoring system at the service centre in Nuremberg, H&M severely violated the GDPR principles of lawfulness, fairness, and transparency of processing pursuant to Art. 5(1)(a) and Art. 6 GDPR, because it had no legal basis for collecting this personal data from its employees. The HmbBfDI commented on the magnitude of the fine in his statement, saying that “the size of the fine imposed is appropriate and suitable to deter companies from violating the privacy of their employees”.

German State Data Protection Commissioner imposes €1.2 million GDPR fine

1 July 2020

The German State Data Protection Commissioner of Baden-Württemberg (“LfDI Ba-Wü”) imposed a GDPR fine of €1,240,000 on the German statutory health insurance provider AOK Baden-Württemberg (“AOK”). The fine was the result of the health insurer’s lack of technical and organisational measures pursuant to Art. 32 GDPR. It is the highest fine the LfDI Ba-Wü has ever imposed.

Between 2015 and 2019, the AOK organised lotteries on various occasions and collected personal data from the participants, including their contact details and current health insurance affiliations. The AOK intended to use the data of the lottery participants for advertising purposes, insofar as the participants consented to this. To ensure the security of processing, the AOK implemented internal guidelines and data protection training for its staff as technical and organisational measures. However, these measures were not sufficient to comply with Art. 32 GDPR, because AOK staff used the personal data of more than 500 lottery participants for advertising purposes without their prior consent.

Following the investigation by the LfDI Ba-Wü, the AOK immediately stopped all marketing activities in order to review its internal policies and processes against the requirements of the GDPR. The LfDI Ba-Wü explained that, in determining the size of the fine, it considered the following mitigating factors:

  • the cooperation of the AOK with the Data Protection Authority,
  • the fact that the AOK as a statutory health insurance provider is an important part of the German healthcare system, and
  • the burdens the current coronavirus pandemic places on the healthcare system.

Finally, the Commissioner pointed out that technical and organisational measures must be regularly adjusted to the actual conditions of each processing activity in order to ensure an adequate level of data protection in the long term.

The video-conferencing service Zoom and its data security issues

20 April 2020

Amid the coronavirus crisis, the video communications service Zoom has gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has been facing a great deal of public criticism for its weaknesses in data security and its lack of awareness in data protection matters.

Basic data security weaknesses came to light one after another, starting in March 2020:

  • Zoom had to admit that it was falsely advertising full end-to-end encryption for all shared content such as video, audio, and screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online tech magazine reported that Zoom leaked thousands of its users’ email addresses and photos to strangers.
  • Video conferences that users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images and spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data about its users with Facebook than stated in Zoom’s privacy policy, even when users did not sign in with a Facebook account. Zoom has since released an update that terminates this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without their knowledge. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting participants who subscribed to a LinkedIn sales prospecting service (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature (a sketch of how such detection might work in a browser follows this list).
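
How such a check might work in a web client can be sketched with the standard Page Visibility API. The sketch below is purely illustrative: Zoom’s actual feature was built into its meeting clients, the 30-second threshold is taken from the reports, and the reportInattentive() hook is a hypothetical stand-in for notifying the host.

```typescript
// Illustrative sketch only: approximating an "attention tracking" signal in
// a browser with the standard Page Visibility API. The threshold matches the
// reported 30 seconds; reportInattentive() is a hypothetical hook, not
// Zoom's actual implementation or API.
const THRESHOLD_MS = 30_000;
let hiddenSince: number | null = null;

function reportInattentive(awayMs: number): void {
  // A real meeting client would notify the host here; this sketch just logs.
  console.log(`Attendee looked away for ${Math.round(awayMs / 1000)}s`);
}

document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    hiddenSince = Date.now(); // the meeting window lost the user's attention
  } else if (hiddenSince !== null) {
    const awayMs = Date.now() - hiddenSince;
    if (awayMs > THRESHOLD_MS) reportInattentive(awayMs);
    hiddenSince = null; // the user is back; reset the timer
  }
});
```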

Zoom’s security and privacy issues have led various public authorities and companies around the world to ban their employees from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. The plan includes freezing the introduction of new features, enlarging the cybersecurity team, and engaging outside help from security advisors.

Greek Data Protection Authority releases guidance on cookies

16 March 2020

On 25 February 2020, the Hellenic Data Protection Authority (DPA) published guidance on cookies and other tracking tools. The Authority had previously found that Greek websites and service providers were largely failing to comply with the rules on the use of cookies and other trackers set out in the ePrivacy Directive and the GDPR, and reaffirmed by the European Court of Justice’s Planet49 ruling.

The guidance states that it covers HTTP/S cookies, Flash cookies, HTML5 local storage, device fingerprinting, operating system identifiers, and hardware identifiers.

The Greek DPA reiterated that providers are generally obliged to obtain the user’s consent if they use any tracking tools, irrespective of whether personal data is actually being processed. It also noted that technically necessary trackers are exempt from the consent requirement. Furthermore, the guidance goes into detail on how information can be provided and consent obtained on websites specifically.
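
As an illustration of that consent-first approach, the following minimal sketch gates a non-essential tracker behind an explicit opt-in while technically necessary functionality keeps working without consent. The storage key, function names, and analytics URL are hypothetical examples, not part of the DPA’s guidance.

```typescript
// Minimal sketch of a consent gate for non-essential trackers: technically
// necessary cookies may be set without consent, while any tracker is loaded
// only after an explicit opt-in. Names and URLs are hypothetical.
const CONSENT_KEY = "tracking-consent";

function hasTrackingConsent(): boolean {
  // Only an explicit, stored opt-in counts as consent; absence means "no".
  return localStorage.getItem(CONSENT_KEY) === "granted";
}

// Wire recordConsentChoice(true / false) to the consent banner's buttons.
function recordConsentChoice(granted: boolean): void {
  localStorage.setItem(CONSENT_KEY, granted ? "granted" : "denied");
  if (granted) loadAnalytics();
}

function loadAnalytics(): void {
  // The tracking script is injected only after the user has opted in.
  const script = document.createElement("script");
  script.src = "https://example.com/analytics.js"; // hypothetical tracker
  document.head.appendChild(script);
}

// On page load: technically necessary functionality runs unconditionally;
// everything else waits until consent has been recorded.
if (hasTrackingConsent()) loadAnalytics();
```

A real implementation would also timestamp the stored choice for accountability and offer an equally easy way to withdraw consent.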

Lastly, the Authority has given Greek website providers a grace period of two months to implement the provisions of this guidance and thereby become compliant with the European rules on tracking tools.

German Robert Koch Institute discusses mobile phone tracking to slow the spread of the coronavirus

9 March 2020

According to a news report in the German newspaper “Der Tagesspiegel”, a small group of scientists at the Robert Koch Institute (RKI) and other institutions is currently discussing evaluating and matching movement data from mobile phones in order to trace people infected with the coronavirus (COVID-19).

The scientists, who are trying to slow the spread of the disease, point to the problem that questioning infected people about whom they have been in contact with is time-consuming and imprecise. Evaluating and matching mobile phone data could be more accurate and could speed up the process of identifying infected people, which could be essential for saving lives.

In a comment, the German Federal Commissioner for Data Protection, Ulrich Kelber, noted that this procedure may raise serious data protection issues, especially with regard to the legal basis for the processing and its proportionality under the GDPR.

More US states are pushing ahead with new privacy legislation

3 January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, and is the first step in the United States towards regulating data privacy on the Internet. Currently, the US has no general federal consumer data privacy law comparable to the privacy laws of EU countries or the supranational European GDPR.

Now, several other US states have taken inspiration from the CCPA and are in the process of bringing forward their own state legislation on consumer privacy protection on the Internet, including

  • The Massachusetts Data Privacy Law “S-120”,
  • The New York Privacy Act “S5642”,
  • The Hawaii Consumer Privacy Protection Act “SB 418”,
  • The Maryland Online Consumer Protection Act “SB 613”, and
  • The North Dakota Bill “HB 1485”.

Like the CCPA, most of these new privacy laws define the term “personal information” broadly and aim to protect consumer data by strengthening consumer rights.

However, the various proposals differ in the scope of the consumer rights they grant. All of them give consumers a “right to access” their data held by businesses. Most will also include a “right to delete”, but only some give consumers a private “right of action” for violations.

There are also differences in which businesses the privacy laws will cover. In some states, the proposed laws will apply to all businesses, while in others they will only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, a federal US privacy law becomes increasingly likely in the near future. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren’s proposal, Senators Cantwell, Schatz, Klobuchar, and Markey’s proposal, and Senator Wicker’s proposal).

India updates privacy bill

12 December 2019

The new update of the Indian Personal Data Protection Bill is part of India’s broader efforts to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies for anonymised personal data, as well as other non-personal data, to help it deliver government services and formulate policies. The draft defines “personal data” as information that can help identify a person and that includes characteristics, traits, and any other features of a person’s identity. “Sensitive personal data” additionally includes financial and biometric data. According to the draft, such “sensitive” data may be transferred outside India for processing but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and display a verification sign publicly. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could affect the way companies process, store, and transfer Indian consumers’ data, and could therefore cause difficulties for top technology companies.

FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

24 June 2019

The Federal Trade Commission (FTC) announced that it had taken action against several companies that falsely claimed compliance with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, falsely claimed on its website to participate in the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and from Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. proposed a settlement agreement. The proposal would prohibit SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government or by any self-regulatory or standards organization. The proposed agreement will be published in the Federal Register and will be subject to public comment for 30 days, after which the FTC will decide whether to make the proposed consent order final.

The FTC also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove any statements claiming participation in a safe harbor agreement from their websites, privacy policies, and other public documents, and warned that it would take appropriate legal action if they failed to do so within 30 days.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies to participate in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. For a company to become a certified participant, a designated third party, known as an APEC-approved Accountability Agent, must verify and confirm that the company meets the requirements of the CBPR program.

Apple advises app developers to reveal or remove screen recording code

12 February 2019

After a TechCrunch investigation revealed that numerous apps were recording users’ screen activity, Apple called on app developers to remove, or at least disclose, their screen recording code.

TechCrunch’s investigation revealed that many large companies commission Glassbox, a customer experience analytics firm, to view their users’ screens and thereby follow keyboard entries and understand how users interact with their apps. It turned out that during session replay, some fields that should have been masked were not, so certain sensitive data, such as passport numbers and credit card numbers, could be seen. Furthermore, none of the apps examined informed users that their screens were being recorded: no specific consent was obtained, and no reference to screen recording was made in the apps’ privacy policies.
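
To make the masking concept concrete, the sketch below redacts the values of sensitive form fields before a recorded event is handed to any uploader, shown here for a web form. It is a generic illustration, not Glassbox’s or Apple’s actual API; the selector, the event shape, and the placeholder are assumptions.

```typescript
// Generic sketch of client-side masking for session replay: the values of
// sensitive fields are redacted before a recorded event leaves the browser.
// This illustrates the masking concept only; it is not a real SDK's API.
const SENSITIVE_SELECTOR =
  "input[type=password], input[autocomplete=cc-number], [data-mask]";

interface ReplayEvent {
  field: string;
  value: string;
}

function toReplayEvent(input: HTMLInputElement): ReplayEvent {
  const sensitive = input.matches(SENSITIVE_SELECTOR);
  return {
    field: input.name,
    // Redact to a constant placeholder so neither content nor length leaks.
    value: sensitive ? "***" : input.value,
  };
}

document.addEventListener("input", (event) => {
  const target = event.target;
  if (target instanceof HTMLInputElement) {
    const replayEvent = toReplayEvent(target);
    console.log(replayEvent); // a real recorder would buffer and upload this
  }
});
```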

Based on these findings, Apple immediately asked the app developers to remove or properly disclose the analytics code that enables screen recording. Apple’s App Store Review Guidelines require apps to request explicit user consent and to provide a clear visual indication when recording, logging, or otherwise making a record of user activity. In addition, Apple expressly prohibits covert recording without the consent of app users.

According to TechCrunch, Apple has already pointed out to some app developers that they have broken its rules. One developer was even explicitly asked, with reference to the App Store Guidelines, to remove the code from its app, and was given less than a day to do so; otherwise, Apple would remove the app from the App Store.
