
German State Data Protection Commissioner imposes €1.2 million GDPR fine

1 July 2020

The German State Data Protection Commissioner of Baden-Württemberg (“LfDI Ba-Wü”) imposed a GDPR fine of €1,240,000 on the German statutory health insurance provider AOK Baden-Württemberg (“AOK”). The fine was a result of the health insurer’s lack of technical and organisational measures pursuant to Art. 32 GDPR. It is the highest fine the LfDI Ba-Wü has ever imposed.

Between 2015 and 2019, the AOK organised lotteries on various occasions and collected personal data of the participants, including their contact details and current health insurance affiliations. The AOK intended to use the participants’ data for advertising purposes, insofar as they had consented to this. To ensure the security of processing, the AOK implemented internal guidelines and data protection training of its staff as technical and organisational measures. However, these measures were not sufficient to comply with Art. 32 GDPR, because AOK staff used the personal data of more than 500 lottery participants for advertising purposes without their prior consent.

Following the investigation by the LfDI Ba-Wü, the AOK immediately stopped all marketing activities in order to review its internal policies and processes against the requirements of the GDPR. The LfDI Ba-Wü explained that in determining the amount of the fine, it considered the following mitigating factors:

  • the cooperation of the AOK with the Data Protection Authority,
  • the fact that the AOK as a statutory health insurance provider is an important part of the German healthcare system, and
  • the burdens that the current coronavirus pandemic places on the healthcare system.

Finally, the Commissioner pointed out that technical and organisational measures must be regularly adjusted to the actual conditions of each processing activity, in order to ensure an adequate level of data protection in the long term.

The video conferencing service Zoom and its data security issues

20 April 2020

Amidst the coronavirus crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has faced a great deal of public criticism because of its weaknesses in data security and its lack of awareness in data protection matters.

Basic data security weaknesses came to light little by little, starting in March 2020:

  • Zoom had to admit that it had falsely advertised providing full end-to-end encryption for all shared content such as video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and microphone hijacking and the theft of login credentials.
  • An online tech magazine reported that Zoom had leaked thousands of its users’ email addresses and photos to strangers.
  • Video conferences that users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images and spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without a legal basis and without notifying users of this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data about its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with a Facebook account. Zoom has since released an update that ends this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users’ knowledge. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members who subscribed to a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

Zoom’s security and privacy issues have led various public authorities and companies around the world to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. The plan includes freezing the introduction of new features, enlarging the cybersecurity team and engaging outside security advisors.

Greek Data Protection Authority releases Guidance on Cookies

16 March 2020

On 25 February 2020, the Hellenic Data Protection Authority (DPA) published guidance on cookies and other tracking tools. The Authority had previously found that Greek websites and service providers were largely failing to comply with the rules on the use of cookies and other trackers set out in the ePrivacy Directive and the GDPR and reaffirmed by the European Court of Justice’s Planet49 ruling.

The guidance applies to HTTP/S cookies, Flash cookies, HTML5 local storage, device fingerprinting, OS identifiers, and material identifiers.

The Greek DPA reiterated that providers are generally obliged to obtain the user’s consent if they use any tracking tools, irrespective of whether personal data is being processed. It also noted that technically necessary trackers are exempt from the consent requirement. Furthermore, the guidance goes into detail on how information and consent can be presented on websites specifically.
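The general rule the DPA reiterates can be sketched in code: a tracker may be set without consent only if it is technically necessary; every other category requires a prior, explicit opt-in. The following is a minimal, hypothetical illustration; the category names and the shape of the `consent` object are assumptions for the sake of the example, not taken from the guidance itself.

```javascript
// Hypothetical sketch of consent gating for trackers, following the rule
// that technically necessary trackers need no consent while all others
// require a prior opt-in. Category names and the `consent` shape are
// illustrative assumptions.

const NECESSARY = "necessary";   // e.g. session or load-balancing cookies
const ANALYTICS = "analytics";   // e.g. web analytics trackers
const MARKETING = "marketing";   // e.g. advertising trackers

// Returns true only for necessary trackers or categories the user has
// explicitly opted in to; the absence of a recorded choice counts as "no".
function maySetTracker(category, consent) {
  if (category === NECESSARY) return true;
  return consent[category] === true;
}
```

Under this scheme, a website would call such a check before setting any non-essential cookie or loading any tracking script, and would default to not tracking when no choice has been recorded.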

Lastly, the Authority has given Greek website providers a grace period of two months to implement the provisions of this guidance and thereby become compliant with the European rules on tracking tools.

German Robert Koch Institute discusses mobile phone tracking to slow the spread of the Coronavirus

9 March 2020

According to a news report in the German newspaper “Der Tagesspiegel”, a small group of scientists at the Robert Koch Institute (RKI) and other institutions is currently discussing evaluating and matching movement data from mobile phones to detect people infected with the Coronavirus (COVID-19).

The scientists, who are trying to slow the spread of the disease, point to the problem that questioning infected people about whom they have been in contact with is time-consuming and imprecise. Evaluating and matching mobile phone data could be more accurate and could speed up the identification of infected people, which could be essential for saving lives.

In a comment, the German Federal Commissioner for Data Protection, Ulrich Kelber, noted that this procedure may raise serious data protection issues, especially regarding the legal basis for the processing and its proportionality under the GDPR.

More US States are pushing ahead with new Privacy Legislation

3 January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, as the first major step in the United States towards regulating consumer data privacy on the Internet. Currently, the US has no general federal consumer data privacy law comparable to the privacy laws of EU countries or the supranational European GDPR.

Several other US states have now taken inspiration from the CCPA and are in the process of bringing forward their own state legislation on consumer privacy protections on the Internet, including

  • The Massachusetts Data Privacy Law “S-120”,
  • The New York Privacy Act “S5642”,
  • The Hawaii Consumer Privacy Protection Act “SB 418”,
  • The Maryland Online Consumer Protection Act “SB 613”, and
  • The North Dakota Bill “HB 1485”.

Like the CCPA, most of these new privacy laws define “Personal Information” broadly and aim to protect consumer data by strengthening consumer rights.

However, the various proposals differ in the scope of the consumer rights they grant. All of them give consumers a ‘right to access’ their data held by businesses. Most of these states will also provide a ‘right to delete’, but only some give consumers a private ‘right of action’ for violations.

There are further differences in which businesses the privacy laws will cover. In some states, the proposed laws will apply to all businesses, while in others they will only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, a federal US privacy law becomes increasingly likely in the near future. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren’s proposal, Senators Cantwell, Schatz, Klobuchar and Markey’s proposal, and Senator Wicker’s proposal).

India updates privacy bill

12 December 2019

The latest update of the Indian Personal Data Protection Bill is part of India’s broader effort to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies to provide anonymised personal data, as well as other non-personal data, to help deliver governmental services and formulate policies. The draft defines “personal data” as information that can help to identify a person, including characteristics, traits and any other features of a person’s identity. “Sensitive personal data” also includes financial and biometric data. According to the draft, such “sensitive” data may be transferred outside India for processing but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and publicly display a verification sign. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could change the way companies process, store and transfer Indian consumers’ data, and could therefore create difficulties for major technology companies.

FTC takes action against companies claiming to participate in EU-U.S. Privacy Shield and other international privacy agreements

24 June 2019

The Federal Trade Commission (FTC) announced that it had taken action against several companies that falsely claimed compliance with the EU-U.S. Privacy Shield and other international privacy agreements.

According to the FTC, SecureTest, Inc., a background screening company, falsely claimed on its website to participate in the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield. These framework agreements allow companies to transfer consumer data from member states of the European Union and Switzerland to the United States in accordance with EU or Swiss law.

In September 2017, the company applied to the U.S. Department of Commerce for Privacy Shield certification. However, it did not take the necessary steps to be certified as compliant with the framework agreements.

Following the FTC’s complaint, the FTC and SecureTest, Inc. have proposed a settlement agreement. The proposal prohibits SecureTest from misrepresenting its participation in any privacy or security program sponsored by a government or by a self-regulatory or standardisation organisation. The proposed agreement will be published in the Federal Register and subject to public comment for 30 days. Afterwards, the FTC will decide whether to make the proposed consent order final.

The FTC has also sent warning letters to 13 companies that falsely claimed to participate in the U.S.-EU Safe Harbor and U.S.-Swiss Safe Harbor frameworks, which were replaced in 2016 by the EU-U.S. and Swiss-U.S. Privacy Shield frameworks. The FTC asked the companies to remove from their websites, privacy policies and other public documents any statements claiming participation in a Safe Harbor agreement. If the companies fail to take action within 30 days, the FTC warned, it will take appropriate legal action.

The FTC also sent warning letters with the same request to two companies that falsely claimed in their privacy policies to participate in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. The APEC CBPR system is an initiative to improve the protection of consumer data moving between APEC member countries through a voluntary but enforceable code of conduct implemented by participating companies. To become a certified participant, a company must have its compliance with the CBPR program requirements verified and confirmed by a designated third party, known as an APEC-approved Accountability Agent.

Apple advises app developers to reveal or remove code for screen recording

12 February 2019

After a TechCrunch investigation revealed that numerous apps were recording screen usage, Apple called on app developers to remove, or at least disclose, the screen recording code.

TechCrunch’s investigation revealed that many large companies commission Glassbox, a customer experience analytics firm, to view their users’ screens, track keyboard entries and understand how users interact with their apps. It turned out that during session replays, some fields that should have been masked were not, so that sensitive data such as passport numbers and credit card numbers could be seen. Furthermore, none of the apps examined informed users that the screen was being recorded during use; no specific consent was obtained, nor did the apps’ privacy policies make any reference to screen recording.

Based on these findings, Apple immediately asked the app developers to remove or properly disclose the analytics code that enables them to record screen usage. Apple’s App Store Review Guidelines require apps to request explicit user consent and to provide a clear visual indication when recording, logging or otherwise making a record of user activity. In addition, Apple expressly prohibits covert recording without the consent of the app users.

According to TechCrunch, Apple has already pointed out to some app developers that they have broken Apple’s rules. One developer was even explicitly asked, with reference to the App Store Guidelines, to remove the code from its app. The developer was given less than a day to comply; otherwise, Apple would remove the app from the App Store.


Google changes Privacy Policy due to GDPR

19 December 2018

As is widely known by now, the General Data Protection Regulation (GDPR) came into force earlier this year to standardise data protection regulation in the EU. As a result, Google will update its terms of service and privacy policy to comply with the GDPR.

The company has started to notify users in the European Economic Area (EEA) and Switzerland about some upcoming changes. They will come into effect on January 22, 2019.

The most important change, also legally, concerns the data controller: Google Ireland Limited will become the so-called “data controller” responsible for the information of European and Swiss users. It will therefore be in charge of responding to user requests and ensuring compliance with the GDPR. At present, these services are provided by Google LLC, based in the U.S.

For website operators, this means that they may also have to adapt their own privacy policies accordingly, for example if they use Google Analytics.

Apart from that, there are no changes to current settings and services.

How can a company that transfers data to a non-European company ensure the data protection standard required by the General Data Protection Regulation (GDPR)?

21 March 2018

A trade deal between two companies often involves the incidental transfer of large amounts of personal data. From 25 May 2018, the new GDPR regulates data flows in the European Economic Area (EEA), which consists of all EU member states plus Iceland, Liechtenstein and Norway. The future status of Great Britain will primarily be that of a third country.

By contrast, business relationships with companies from non-EU or non-EEA states (like the USA, China, …) cannot automatically guarantee the data protection standard of the GDPR. Especially since the European Court of Justice (ECJ) overturned the EU-US “Safe Harbour” agreement, every company that transfers data across the Atlantic is obliged to ensure data protection itself. In its communication of 10 January 2017, the European Commission (EC) recommends the use of so-called standard contractual clauses (SCCs) or binding corporate rules (BCRs) when an EU-based company transfers personal data to a non-EU-based company or a non-EU-based entity of its corporate group.

This has a wide impact on the trade deals made every day across Europe with third-country companies. The EU recommends that data protection go hand in hand with trade deals in order to maintain the relatively high level of data protection based on Article 8 of the Charter of Fundamental Rights of the European Union. In particular, as long as the EU’s ePrivacy Regulation is not yet in force, every company has to ensure the GDPR standard itself by implementing a privacy policy in which transfers of data to third countries are disclosed.

In conclusion, a company that trades with third-country companies needs to conclude a specific data protection agreement with its trading partner and inform its clients through its privacy policy.
