Category: Privacy policy

Canada’s new privacy policy: Bill C-27

29. June 2022

On June 16th, 2022, the Canadian Federal Government introduced a new privacy bill, named Bill C-27 (a reworking of Bill C-11). Among its main goals are strengthening the role of the Privacy Commissioner and establishing a special Data Protection Tribunal. Furthermore, it proposes new regulations regarding artificial intelligence. If passed, the act would replace Part 1 of the current PIPEDA (Personal Information Protection and Electronic Documents Act) with the new CPPA (Consumer Privacy Protection Act). Bill C-27 still needs to undergo reviews by various committees and is not expected to come into force until after the summer.

The Office of the Privacy Commissioner enforces Canada’s federal privacy laws and counsels individuals on the protection of their personal data and their rights. Under the new bill, the Commissioner will be able to make recommendations about penalties to the Tribunal, along with other authorities.

If the Bill comes into force, the Data Protection Tribunal’s powers will be expanded. Its decisions will be binding and final, and may be enforced as if they were orders of a superior court. The Tribunal may also review the recommendations made by the Privacy Commissioner, but is not bound to follow them in any way.

Another important innovation brought by Bill C-27 is the clarification of the concept of legitimate interest: it has been added as an exception to consent where it outweighs potential adverse effects on the data subject.

All data regarding children are now considered sensitive and must be treated as such by organizations and corporations. This means introducing higher standards for handling such data and limiting the right to collect it.

The concepts of de-identification and anonymization have been aligned with global standards.

Finally, along with Bill C-27 the Government aims to introduce the new Artificial Intelligence and Data Act, creating a framework for high-impact AI systems. Its goals are to regulate international and interprovincial commerce in AI systems by introducing common requirements across Canada, and to prohibit conduct in relation to AI systems that may result in harm to individuals or their interests. A new working definition of “AI system” is also given.

Lastly, the Act provides for the creation of a new AI and Data Commissioner within a ministry, who will support the enforcement of the Act across Canada.

CJEU considers representative actions admissible

29. April 2022

According to a press release of the European Court of Justice (CJEU), consumer protection associations can bring legal proceedings against companies for breaches of data protection law.

This is the conclusion the Court reached in a decision on proceedings brought by the Federation of German Consumer Organisations (vzbv), which had challenged Facebook’s privacy policy. Accordingly, a consumer protection association may bring legal proceedings against the person allegedly responsible for an infringement of the laws protecting personal data, even in the absence of a mandate conferred on it for that purpose and independently of the infringement of specific rights of the data subjects. The vzbv is an institution entitled to bring legal proceedings under the GDPR because it pursues an objective in the public interest.

Specifically, the case concerns third-party games on Facebook, for which users must agree to the use of their data in order to play. According to the association, Facebook did not inform the data subjects in a precise, transparent and understandable form about the use of their data, as prescribed by the General Data Protection Regulation (GDPR). The Federal Court of Justice in Germany (BGH) had already come to this conclusion in May 2020; however, it was not considered sufficiently clear whether the association could bring legal proceedings in this case.

The EU Advocate General had also previously concluded, in a legally non-binding opinion, that the association can bring legal proceedings.

The CJEU has now confirmed this view, so the BGH must make the final decision in the case of vzbv vs. Facebook. Importantly, this decision also opens the door to similar collective actions against other companies.

Google Play Store apps soon obliged to provide privacy notices

20. August 2021

On the Android Developers Blog, Google has announced further details of the upcoming new safety section in its Play Store. It aims to present the security of the offered apps in a simple way, giving users deeper insight into the privacy and security practices of developers. This should allow users to see what data an app may collect and why, even before installation. To achieve this, apps in the Google Play Store will be required to publish the corresponding information in the safety section.

The new summary will be displayed to users on an app’s store listing page. It is intended to highlight details such as:

  • What type of data is collected and shared, e.g. location, contacts, name, email address, financial information,
  • How the data will be used, e.g. for app functionality or personalization,
  • Whether the data collection is optional or mandatory for the use of an app,
  • Security practices, e.g. data encryption,
  • Compliance with the family policy,
  • Validation from an independent source against a global security standard.

To support the safety section, policy changes are being made which should lead to greater transparency for users. Thus, all developers will be required to provide a privacy notice; previously, only apps that collected personal and sensitive user data had to do so. This applies to all apps published on Google Play, including Google’s own apps.

Developers will be able to submit their information to the Google Play Console for review starting in October. By April 2022 at the latest, the safety section must be approved for their apps, as the new section is scheduled to be rolled out and become visible to users in Q1 2022.

Aside from sharing additional information for developers on how to get prepared, Google has also assured that more guidance will be released over the next few months.

WhatsApp’s privacy policy update halted

22. January 2021

The first indications that WhatsApp would change its terms of service and privacy policy appeared at the beginning of December 2020. Earlier this year, users received the update notice when launching the app on their device. It stated that the new terms concern additional information on how WhatsApp processes user data and how businesses can use Facebook-hosted services to store and manage their WhatsApp chats. The terms had to be accepted by February 8th, 2021, to continue using the chat service; otherwise, the deletion of the account was suggested, since it would not be possible to use WhatsApp without accepting the changes. The notice caused all sorts of confusion and criticism, because it mistakenly led many users to believe that the agreement allows WhatsApp to share all collected user data with its parent company Facebook, which has faced repeated privacy controversies in the past.

Users’ fears in this regard are not entirely unfounded. As a matter of fact, outside the EU, WhatsApp user data has already been flowing to Facebook since 2016 – for advertising purposes, among other things. For the EU and the United Kingdom, however, different rules apply and no such data transfer takes place.

The negative coverage and user reactions prompted WhatsApp to hastily note that the changes explicitly do not affect EU users. Niamh Sweeney, director of policy at WhatsApp, said via Twitter that it remained the case that WhatsApp did not share European user data with Facebook for the purpose of improving Facebook’s products or ads.

However, since the topic continued to stir emotions, WhatsApp felt compelled to provide clarification with a tweet and a FAQ. The statements make it clear once again that the changes relate to optional business features and provide further transparency about how the company collects and uses data. The end-to-end encryption, which ensures chat content is only visible to the participating users, will not be changed. Moreover, the new update does not expand WhatsApp’s ability to share data with Facebook.

Nevertheless, despite all efforts, WhatsApp has not managed to explain the changes in an understandable way, and it has even had to accept huge user churn in recent days; interest in messenger alternatives has increased enormously. Eventually, the public backlash led to an official announcement that the controversial update will be delayed until May 15th, 2021. In light of the misinformation and concern, users are to be given more time to review the policy on their own and understand WhatsApp’s privacy and security principles.

California Voters approve new Privacy Legislation CPRA

20. November 2020

On November 3rd, 2020, Californian citizens were able to vote on the California Privacy Rights Act of 2020 (“CPRA”) in a state ballot (we reported). As polls leading up to the vote had already suggested, California voters approved the new privacy legislation, also known as “Prop 24”, with 56.2% Yes votes to 43.8% No votes. Most provisions of the CPRA will enter into force on 1 January 2021 and will become applicable to businesses on 1 January 2023. By and large, it will only apply to information collected from 1 January 2022 onward.

The CPRA will complement and expand privacy rights of California citizens considerably. Among others, the amendments will include:

  • Broadening the term “sale” of personal information to “sale or share” of private information,
  • Adding new requirements to qualify as a “service provider” and defining the term “contractor” anew,
  • Defining the term “consent”,
  • Introducing the category of “Sensitive Information”, including a consumer’s Right to limit the use of “Sensitive Information”,
  • Introducing the concept of “Profiling” and granting consumers the Right to Opt-out of the use of the personal information for Automated Decision-Making,
  • Granting consumers the Right to correct inaccurate information,
  • Granting consumers the Right to Data Portability, and
  • Establishing the California Privacy Protection Agency (CalPPA) with a broad scope of responsibilities and enforcement powers.

Ensuring compliance with the CPRA will require proper preparation. Affected businesses will have to review existing processes or implement new processes in order to guarantee the newly added consumer rights, meet the contractual requirements with service providers/contractors, and show compliance with the new legislation as a whole.

In an interview after the passage of the CPRA, Alastair Mactaggart, the initiator of the CCPA and the CPRA, commented that

Privacy legislation is here to stay.

He hopes that California Privacy legislation will be a model for other states or even the U.S. Congress to follow, in order to offer consumers in other parts of the country the same Privacy rights as there are in California now.

H&M receives record-breaking 35 Million Euro GDPR Fine in Germany

21. October 2020

At the beginning of October, the Hamburg Data Protection Commissioner (“HmbBfDI”) imposed a record-breaking 35,258,707.95 Euro GDPR fine on the German branch of the Swedish clothing-retail giant H&M. It is the highest fine a German Data Protection Authority has ever issued for a GDPR violation.

Since 2014, the management of the H&M service centre in Nuremberg had extensively monitored the private lives of their employees in various ways. Following employees’ holidays and sick leaves, team leaders would conduct so-called “Welcome Back Talks” in which they recorded employees’ holiday experiences, symptoms of illnesses and medical diagnoses. Some H&M supervisors compiled a broad database of their employees’ private lives, recording details on family issues and religious beliefs from one-on-one talks and even corridor conversations. The records were highly detailed, were updated over time, and in some cases were shared with up to 50 other managers throughout the company. The supervisors also used this personal data to create profiles of their employees and to base future employment decisions and measures on this information. The clandestine data collection only became known as a result of a configuration error in 2019, when the notes were accessible company-wide for a few hours.

After the discovery, the H&M executives presented the HmbBfDI with a comprehensive concept for improving data protection at their Nuremberg sub-branch. This includes the appointment of a new Data Protection coordinator, monthly Data Protection status updates, more strongly communicated whistleblower protection and a consistent process for granting data subject rights. Furthermore, H&M has apologised to their employees and paid those affected considerable compensation.

With their secret monitoring system at the service centre in Nuremberg, H&M severely violated the GDPR principles of lawfulness, fairness, and transparency of processing pursuant to Art. 5 no. 1 lit. a) and Art. 6 GDPR because they did not have a legal basis for collecting these Personal Data from their employees. The HmbBfDI commented in his statement on the magnitude of the fine saying that “the size of the fine imposed is appropriate and suitable to deter companies from violating the privacy of their employees”.

German State Data Protection Commissioner imposes 1.2 million € GDPR fine

1. July 2020

The German State Data Protection Commissioner of Baden-Württemberg (“LfDI Ba-Wü”) imposed a GDPR fine of 1,240,000 Euro on the German statutory health insurance provider AOK Baden-Württemberg (“AOK”). The fine was the result of the health insurer’s lack of technical and organisational measures pursuant to Art. 32 GDPR. It is the highest fine the LfDI Ba-Wü has ever imposed.

Between 2015 and 2019, the AOK organised lotteries on various occasions and collected personal data of the participants, including their contact details and current health insurance affiliations. The AOK wanted to use the data of the lottery participants for advertising purposes, insofar as the participants had given their consent. To ensure the security of processing, the AOK implemented internal guidelines and data protection training of their staff as technical and organisational measures. However, these measures were not sufficient to comply with Art. 32 GDPR, because AOK staff used the personal data of more than 500 lottery participants for advertising purposes without their prior consent.

Following the investigation by the LfDI Ba-Wü, the AOK immediately stopped all marketing activities in order to review its internal policies and processes against the requirements of the GDPR. The LfDI Ba-Wü explained that, in determining the extent of the fine, it considered the following mitigating factors:

  • the cooperation of the AOK with the Data Protection Authority,
  • the fact that the AOK as a statutory health insurance provider is an important part of the German healthcare system, and
  • the burdens the current Corona pandemic places on the healthcare system.

Finally, the Commissioner pointed out that technical and organisational measures must be regularly adjusted to the actual conditions of each processing activity, in order to ensure an adequate level of data protection in the long term.

The Video-conference service Zoom and its Data Security issues

20. April 2020

Amidst the Corona crisis, the video communications service Zoom has gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself “the leader in modern enterprise video communications”. However, the company has faced a lot of public criticism because of its weaknesses in data security and lack of awareness in data protection matters.

Basic data security weaknesses unfolded little by little starting in March 2020:

  • Zoom had to admit that it had wrongly advertised providing full end-to-end encryption for all shared content such as video, audio or screen sharing.
  • Security experts revealed several bugs that could have allowed webcam and mic hijacking and the theft of login credentials.
  • An online tech magazine reported that Zoom leaked thousands of its users’ email addresses and photos to strangers.
  • Video-conferences which users did not protect with a password enabled “Zoombombing”, a phenomenon in which strangers hijacked video calls and disrupted them by posting pornographic and racist images as well as spamming the conversations with threatening language. In response, Zoom introduced the Waiting Room feature and additional password settings.

At the same time, Zoom’s data privacy practices came under scrutiny:

  • Zoom shared web analytics data with third-party companies for advertising purposes without having a legal basis or notifying users about this practice. In response to criticism, Zoom revised its privacy policy and now declares that it does not share data from meetings for advertising.
  • The company also shared more analytics data of its users with Facebook than stated in Zoom’s privacy policy, even if the user did not sign in with their Facebook account. Zoom introduced an update that ended this sharing.
  • The New York Times revealed that Zoom used a data mining feature that matched Zoom users’ names and email addresses to their LinkedIn profiles without the users knowing about it. Zoom then enabled automatic sharing of the matched LinkedIn profiles with other meeting members that were subscribers of a LinkedIn service for sales prospecting (“LinkedIn Sales Navigator”). In response to criticism, Zoom removed this feature permanently.
  • Zoom offered a feature called Attention Tracking, which let the meeting’s host know when an attendee had clicked away from the meeting window for more than 30 seconds. Zoom has since disabled the feature.

The security and privacy issues of Zoom have led various public authorities and companies around the world to ban their workers from using the service.

On 1 April 2020, Zoom’s founder and CEO Eric S. Yuan announced a 90-day plan to significantly improve the company’s data security in an effort to build greater trust with its users. The plan includes freezing the introduction of new features, enlarging the cybersecurity team and engaging outside security advisors.

Greek Data Protection Authority releases Guidance on Cookies

16. March 2020

On 25 February 2020, the Hellenic Data Protection Authority (DPA) published guidance on Cookies and other tracking tools. Previously, the Authority had found that Greek websites and service providers were largely failing to comply with the rules on the use of Cookies and other trackers set out in the ePrivacy Directive and the GDPR, and reaffirmed by the European Court of Justice’s ruling in Planet49.

The guidance states that it applies to HTTP/S Cookies, Flash Cookies, HTML5 local storage, device fingerprinting, OS identifiers, and material identifiers.

The Greek DPA reiterated that providers are generally obliged to obtain the user’s consent if they use any tracking tools – irrespective of whether personal data is being processed. It also noted that technically necessary trackers are exempt from the consent requirement. Furthermore, the guidance goes into detail on how information and consent can specifically be implemented on websites.

Lastly, the Authority has given Greek website providers a grace period of two months to implement the provisions of this guidance and thereby become compliant with the European rules on tracking tools.

German Robert-Koch-Institute discusses mobile phone tracking to slow down the spreading of the Coronavirus

9. March 2020

According to a news report by the German newspaper “Der Tagesspiegel”, a small group of scientists at the Robert-Koch-Institute (RKI) and other institutions are currently discussing the evaluation and matching of movement data from mobile phones to detect people infected with the Coronavirus (COVID-19).

The scientists, who are trying to slow down the spreading of the disease, point out that questioning infected people about whom they have been in contact with is time-consuming and imprecise. Evaluating and matching mobile phone data may be more accurate and could speed up the process of identifying infected people, which could be essential for saving lives.

In a comment, the German Federal Commissioner for Data Protection Ulrich Kelber noted that this procedure may raise major data protection issues, especially with regard to the legal basis for the processing and its proportionality under the GDPR.
