Category: Data Protection

Belarus passes first personal data protection law

27. May 2021

Last month, on April 2nd, the Belarusian House of Representatives adopted the draft law “On the Protection of Personal Data” in its second reading. The law was passed on May 7th. It is the first Belarusian legal act specifically dedicated to data protection.

The law aims to regulate relations arising from the processing of individuals’ personal data and to ensure the protection of such data and of the rights and freedoms of individuals whose data is processed. It stipulates that

Processing of personal data must be commensurate with the stated purposes of its processing and ensure at all stages a fair balance between the interests of all persons concerned.

The provisions concern in detail, inter alia:

  • definition of the categories of personal data as well as principles and conditions of their processing, with and without the use of automated means
  • determination of the process for cross-border transfer of personal data; in particular, such transfers are prohibited if the foreign country does not provide an adequate level of protection of data subjects’ rights
  • determination of data subject rights and of the obligations of public authorities, legal entities and natural persons when processing personal data, particularly with regard to the appointment of a Data Protection Officer and data breach notifications
  • establishment of additional safeguards against arbitrary and uncontrolled collection, storage, use, dissemination, provision and other processing of personal data
  • the procedure for establishing an authority empowered to protect data subject rights, and that authority’s competence; its establishment is assigned to the Council of Ministers of the Republic of Belarus together with the Operations and Analysis Center under the President of the Republic of Belarus, within three months after the official publication of the law
  • liability for violation of the provisions.

The purpose of adopting this law is to ensure an adequate level of protection of personal data and to support the development of business, trade and economic relations of the Republic of Belarus with other countries.

The main provisions of the law shall enter into force six months after its official publication.

Google Play Store to require new privacy information

25. May 2021

In a blog post published on May 6th, 2021, by Suzanne Frey, VP, Product, Android Security and Privacy, Google announced a new policy that will require developers to provide more privacy and security information about their apps. These details will be made available to users in a new “safety section” in the Google Play Store starting in 2022. The announcement comes a few months after Apple began displaying similar privacy information in their App Store.

The new “safety section” will require Android app developers to explain what kind of data is collected by their apps: for example, whether the app collects personal information, such as name, username or email, and whether it collects information directly from the phone, such as approximate or exact location, contacts, or media (photos, videos, audio files). Developers must also disclose how the app uses the data, for example to improve app functionality and personalization. The section will also include information about security features, such as encryption, and about compliance with Google’s policy for apps aimed at children and families.
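Purely for illustration, the kind of disclosure the safety section asks for can be thought of as a structured declaration like the hypothetical sketch below; the field names are invented for this sketch and do not reflect Google’s actual declaration format in the Play Console.

```typescript
// Illustrative only: a hypothetical shape for the disclosures described above.
interface SafetySectionDeclaration {
  dataCollected: {
    category: "personal_info" | "location" | "contacts" | "media";
    examples: string[];          // e.g. ["name", "email"] or ["approximate location"]
    purposes: string[];          // e.g. ["app functionality", "personalization"]
    optional: boolean;           // whether the user can opt out of this collection
  }[];
  securityPractices: {
    encryptedInTransit: boolean;       // data is protected with encryption on the wire
    userCanRequestDeletion: boolean;   // users can ask for their data to be deleted
  };
  followsFamiliesPolicy: boolean;      // compliance with Google's policy for kids/family apps
}

// Example declaration for a hypothetical app that collects an email address
// and approximate location to personalize content.
const exampleDeclaration: SafetySectionDeclaration = {
  dataCollected: [
    { category: "personal_info", examples: ["email"], purposes: ["account management"], optional: false },
    { category: "location", examples: ["approximate location"], purposes: ["personalization"], optional: true },
  ],
  securityPractices: { encryptedInTransit: true, userCanRequestDeletion: true },
  followsFamiliesPolicy: false,
};
```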

The new policy won’t be in effect for a few months in order to give developers enough time to implement the changes. Developers can begin declaring the new information in the fourth quarter of 2021. Users will be able to see the information on Google Play starting in the first quarter of 2022, and all new and existing apps will have to declare the information starting in the second quarter of 2022.

The changes seem designed to allow app developers to better explain to customers whether they can trust an app with their data, rather than working to make apps more data-efficient.

Microsoft Cloud Services will store and process EU data within the EU

7. May 2021

On May 7th, 2021, Brad Smith, Microsoft’s President and Chief Legal Officer, announced in a blogpost that Microsoft will enable its EU commercial and public sector customers to store all their data in the EU. Microsoft calls this policy the “EU Data Boundary”, and it will apply across all of Microsoft’s core business cloud services, such as Azure, Microsoft 365 and Dynamics 365. Microsoft is the first big cloud provider to take such a step. The transition is intended to be completed by the end of 2022.

This move can be seen as a reaction to the Court of Justice of the European Union’s (CJEU) “Schrems II” ruling of July 2020 (please see our blogpost), in which the CJEU ruled that the “EU-US Privacy Shield” does not provide sufficient protection and therefore invalidated the agreement. The “Privacy Shield” was a framework for regulating the transatlantic exchange of personal data for commercial purposes between the EU and the USA.

However, the CJEU has clarified that server location and standard contractual clauses alone are not sufficient to meet the requirements of the General Data Protection Regulation (GDPR). This is because under U.S. law such as the “CLOUD Act”, U.S. law enforcement agencies have the power to compel U.S.-based technology companies to hand over requested data stored on servers, regardless of whether the data is stored in the U.S. or on foreign soil. So even with Microsoft’s proposed changes, U.S. authorities would still be able to access EU citizens’ personal data stored in the EU.

Microsoft believes it has found a way to address access by U.S. intelligence agencies: their right of access could be technically circumvented if customers effectively protect their data in the cloud themselves. To do this, customers would encrypt the data with a cryptographic key that they, and not Microsoft, manage, so that Microsoft would be unable to hand over the keys to U.S. intelligence agencies. Microsoft also states that it is going above and beyond with its “Defending your Data” measures (please see our blogpost) to protect its customers’ data.
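To illustrate the general idea of customer-held keys, here is a minimal sketch of client-side encryption (not Microsoft’s actual implementation) using Node’s built-in crypto module: the data is encrypted locally with a key only the customer holds, so the provider stores nothing but ciphertext.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// The customer generates and keeps this key; the cloud provider never sees it.
const customerKey = randomBytes(32); // 256-bit AES key

// Encrypt locally before uploading: the provider only ever stores ciphertext.
function encryptForUpload(plaintext: Buffer, key: Buffer) {
  const iv = randomBytes(12); // unique nonce per object
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}

// Decrypt locally after download, again without involving the provider.
function decryptAfterDownload(blob: { iv: Buffer; ciphertext: Buffer; tag: Buffer }, key: Buffer) {
  const decipher = createDecipheriv("aes-256-gcm", key, blob.iv);
  decipher.setAuthTag(blob.tag);
  return Buffer.concat([decipher.update(blob.ciphertext), decipher.final()]);
}

// Usage: only the encrypted blob would be sent to the cloud service.
const blob = encryptForUpload(Buffer.from("personal data of an EU customer"), customerKey);
console.log(decryptAfterDownload(blob, customerKey).toString());
```

The design point is that the provider cannot hand over what it never possesses: without the customer’s key, the stored blobs are unreadable.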

These measures by Microsoft are a step towards GDPR-compliant use of cloud applications, but whether they are sufficient to meet the high requirements of the GDPR may be doubted, given the far-reaching powers of the US intelligence agencies. The option for users to encrypt their data themselves and keep the keys should help to comply with EU data protection standards, but it must also be implemented in practice. Microsoft will have to educate its customers accordingly.

The GDPR-compliant transfer of personal data of EU citizens to the US remains uncertain territory, although further positive signals can be observed. For example, the new U.S. administration under President Joe Biden recently showed itself open to concluding a new comprehensive data protection agreement with the EU.

Portuguese DPA Orders Suspension of U.S. Data Transfers by National Institute of Statistics

29. April 2021

On April 27, 2021, the Portuguese Data Protection Authority “Comissão Nacional de Proteção de Dados” (CNPD) ordered the National Institute of Statistics (INE) to suspend any international data transfers of personal data to the U.S., as well as other countries without an adequate level of protection, within 12 hours.

The INE collects different kinds of data from Portuguese residents through the 2021 Census surveys and transfers it to Cloudflare, Inc. (Cloudflare), a service provider in the U.S. that assists with the surveys’ operation. EU Standard Contractual Clauses (SCCs) are in place with the U.S. service provider to legitimize the data transfers.

After receiving numerous complaints, the CNPD started an investigation into the INE’s data transfers to third countries outside of the EU. In the course of the investigation, the CNPD concluded that Cloudflare is directly subject to U.S. surveillance laws, such as FISA 702, for national security purposes. These U.S. surveillance laws impose a legal obligation on companies like Cloudflare to give U.S. public authorities unrestricted access to the personal data of their customers and users without informing the data subjects.

In its decision to suspend the INE’s international data transfers, the CNPD referred to the Schrems II ruling of the Court of Justice of the European Union. The CNPD is of the opinion that personal data transferred to the U.S. by the INE was not afforded a level of data protection essentially equivalent to that guaranteed under EU law, because further safeguards would have to be put in place to meet requirements essentially equivalent to those imposed under EU law by the principle of proportionality. In the absence of such safeguards, surveillance by the U.S. authorities is not limited to what is strictly necessary, and the SCCs alone therefore do not offer adequate protection.

The CNPD also highlighted that, according to the Schrems II ruling, data protection authorities are obliged to suspend or prohibit data transfers, even when those transfers are based on the European Commission’s SCCs, if there are no guarantees that these can be complied with in the recipient country. The fact that Cloudflare also receives a considerable amount of sensitive data in relation to its services for the INE further influenced the CNPD’s decision to suspend the transfers.

Irish DPC launches investigation into Facebook data leak

26. April 2021

On April 14th, 2021, Ireland’s Data Protection Commission (DPC) announced it had launched an investigation into Facebook’s data leak reported earlier this month (please see our blog post here). The inquiry was initiated of the Irish DPC’s own volition pursuant to section 110 of the Irish Data Protection Act. It comes after a dataset of 533 million Facebook users worldwide was made available on the internet.

The Irish DPC indicated in a statement that, “having considered the information provided by Facebook Ireland regarding this matter to date, the DPC is of the opinion that one or more provisions of the GDPR and/or the Data Protection Act 2018 may have been, and/or are being, infringed in relation to Facebook Users’ personal data”. The Irish DPC further stated that they had engaged with Facebook Ireland in relation to this reported issue, raising queries in relation to GDPR compliance, to which Facebook Ireland furnished a number of responses.

The launch of an investigation by the Irish authorities is significant because Ireland is home to Facebook’s European headquarters, which makes the Irish DPC the lead regulator within the European Union on all matters related to the company. However, Ireland’s data watchdog has faced criticism from privacy advocates for being too slow with its GDPR investigations into large tech companies; in fact, the inquiry comes after the European Commission intervened to apply pressure on Ireland’s data protection commissioner.

In a statement on the inquiry shared through multiple media outlets, Facebook announced that it is “cooperating fully with the DPC in its enquiry, which relates to features that make it easier for people to find and connect with friends on our services. These features are common to many apps and we look forward to explaining them and the protections we have put in place.”

EPRS publishes report on post-Brexit EU-UK Data Transfer Mechanisms

20. April 2021

On April 9th, 2021, the European Parliamentary Research Service (EPRS) published a report on data transfers in the private sector between the EU and the U.K. following Brexit.

The report reviews and assesses trade dealings, adequacy challenges and transfer instruments under the General Data Protection Regulation (GDPR). It is intended to inform regulatory and business decisions, and in the press release the European Parliament stated that “a clear understanding of the state of play and future prospects for EU-UK transfers of personal data is indispensable”.

The report provides an in-depth analysis of an adequacy decision for the U.K. as a viable long-term solution for data flows between the U.K. and the EU, and also considers possible mechanisms for data transfers in the potential absence of an adequacy decision, such as Standard Contractual Clauses, Binding Corporate Rules, codes of conduct, and certification mechanisms.

In this analysis the EPRS also sheds light on adequacy concerns such as U.K. surveillance laws and practices, shortcomings of the implementation of the GDPR, weak enforcement of data protection laws, and wavering commitment to EU data protection standards.

As part of its conclusion, the EPRS stated that the European Data Protection Board’s (‘EDPB’) opinion on the draft decision, which has just been published (please see our blogpost here), will likely scrutinise the Commission’s approach and provide recommendations on next steps.

Thailand: Another delay of the Personal Data Protection Act

9. April 2021

On May 28th, 2019, the Personal Data Protection Act (“PDPA”) became law in Thailand. It is the country’s very first legislation governing data protection. Originally, a one-year grace period was determined for implementation of the requirements so that companies could prepare for the prospective liabilities in order to become compliant with the PDPA. However, on May 21st, 2020, a Royal Decree extended the implementation of the PDPA’s key provisions for another year, until June 1st, 2021 (we reported). Currently, a further postponement of the PDPA’s enforcement date is being considered.

According to the new Digital Economy and Society (“DES”) Minister, consideration may be given to deferring or amending the PDPA if the public has negative views about it. The aim is to support small and medium-sized businesses affected by the legislation, since most of them are still unprepared for the new obligations and have not yet adjusted their internal processes. In addition, as the deputy permanent secretary at the DES Ministry stated, there is an unfortunate lack of willingness among the companies concerned. These shortcomings are reflected by the fact that some associations, including those of the travel and automotive industries, have already requested a deferral of the PDPA’s enforcement.

Contrary to what was initially planned, the appointment of members to the Personal Data Protection Committee is also expected to be delayed further. The Committee plays a decisive role in the approval of subsidiary legislation; the drafts in question concern consent procedures, complaint reception and expert panels.

As things currently stand, the PDPA needs further adjustments and the necessary subsidiary regulations still need to be drafted, as many issues have been raised for consultation with regard to the PDPA since its enactment. The main priorities on which the government intends to focus are as follows:

  • Supporting people’s access to innovation and technology,
  • Creating an ecosystem conducive to a digital economy,
  • Gearing up for digital infrastructure development, particularly 5G and smart city projects,
  • Legal development and enforcement to create a trusted digital ecosystem, especially for the PDPA and issues related to electronic transactions and cybersecurity,
  • Protecting the public from abuse on social media and the internet.

The DES Ministry expects that full enforcement of the PDPA will likely be delayed until the end of this year.

Ikea France on trial for spying on staff and customers

7. April 2021

Ikea’s French subsidiary and several of its former executives stood trial on Monday, March 22nd, 2021, after being sued by former employees on charges of violating privacy rights by surveilling the plaintiffs, job applicants and customers.

Trade unions reported the furniture and household goods company to French authorities in 2012, accusing it of fraudulently collecting personal data and disclosing it without authorization. The subsequent criminal investigation uncovered an extensive espionage system. According to French prosecutors, the company hired a surveillance company, private investigators and even a former military operative to illegally obtain confidential information about its existing and prospective employees as well as customers. The files obtained contained, inter alia, criminal records and bank statements. The system had been used for years, possibly even over a decade, to identify individuals who were deemed particularly suspicious or thought to be working against the company.

After the case caused outrage in 2012, Ikea’s main parent company fired several executives at the French branch, including the former general manager. But the extent of the activity in France has again raised questions about the company’s handling of personal data.

At Monday’s trial an employee accused the company of abuse since it had wrongly suspected him of being a bank robber because its investigative system had found prior convictions of a bank robber with the same name. Others claimed the retailer had browsed through employees’ criminal records and used unauthorized data to reveal those driving expensive cars despite low incomes or unemployment benefits. Even an assistant director who had taken a year of medical leave to recover from hepatitis C was monitored to investigate whether she had faked the severity of her illness. Illicit background checks on hundreds of job applicants were also conducted. Moreover, the system was used to track down customers seeking refunds for mismanaged orders.

One of the defendants, the former head of Ikea France’s risk management department, has testified at the hearing that EUR 530.000 to 630.000 a year had been earmarked for such investigations. The former CEOs and Chief Financial Officer as well as store managers are also on trial. In addition, four police officers are accused of handing over confidential information from police files.

Ikea France said in a statement that it takes the protection of its employees’ and customers’ data very seriously. The company added that it adopted compliance and training procedures to prevent illegal activity and changed internal policies after the criminal investigation had been initiated. But at Monday’s hearing, Ikea France’s lawyers denied a system-wide surveillance. The case was also called “a fairy tale” invented by trade union activists.

The deputy prosecutor claimed that Ikea France had illegally monitored at least 400 people and used the information to its advantage. She is asking for a fine of EUR 2.000.000 against the company, prison sentences of at least one year for two former CEOs and a private investigator, as well as fines for some store managers and police officers. A total of 15 people have been charged. The company also faces potential claims for damages from civil lawsuits filed by unions and several employees.

The trial ended on April 2nd. A verdict by a panel of judges is scheduled for June 15th.

CNIL plans to start enforcement on Ad Tracker Guideline

Starting from April 1st, 2021, the French supervisory authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), plans to begin enforcing its rules on ad tracker usage across the internet.

Following its Ad Tracker Guideline, the CNIL gave companies a time frame to adjust their ad tracker usage and ensure compliance with the Guideline as well as the GDPR. This adjustment period ended on March 31st, 2021.

The new rules on cookies and ad trackers mainly revolve around the requirement that users be able to give active, free and informed consent. User consent for advertising cookies must be granted by a “clear and positive act”. This encompasses actions such as clicking an “I accept” button; consent can no longer be inferred from the user simply continuing to use the website.

In addition, cookie banners must not only give the option to accept, they also have to give the option to reject. Rejecting cookies has to be as simple and easy as accepting them. Referring users to “Cookie Options” is no longer a valid form of rejection, as it forces them through an extra step that may dissuade them from rejecting cookies. Rejecting cookies by closing the cookie banner remains a valid option, but it has to be ensured that, unless the cookies are indeed accepted, none but the essential cookies are activated.

Lastly, the cookie banner has to provide brief information on how the cookies are used. The CNIL’s Guideline allows more detailed information to be linked from the cookie banner; however, companies should also give a short explanation in the banner itself in order to obtain “informed” consent.
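For illustration, here is a minimal sketch of banner logic consistent with these rules (the markup and tracker URL are hypothetical, and this is not CNIL-endorsed code): non-essential trackers load only after an explicit “Accept”, rejecting takes a single click, and closing the banner counts as a rejection.

```typescript
// Non-essential trackers are loaded only after the user actively accepts.
function loadAdTrackers(): void {
  const script = document.createElement("script");
  script.src = "https://example.com/ad-tracker.js"; // hypothetical tracker URL
  document.head.appendChild(script);
}

function showCookieBanner(): void {
  const banner = document.createElement("div");
  banner.innerHTML = `
    <p>We use cookies for audience measurement and advertising.
       <a href="/cookie-policy">Learn more</a></p>
    <button id="cookies-accept">Accept</button>
    <button id="cookies-reject">Reject</button>
    <button id="cookies-close" aria-label="Close">×</button>`;
  document.body.appendChild(banner);

  const close = () => banner.remove();

  // Only a clear positive act ("Accept") activates non-essential cookies.
  banner.querySelector("#cookies-accept")!.addEventListener("click", () => {
    loadAdTrackers();
    close();
  });

  // Rejecting is exactly as easy as accepting: one click, nothing is loaded.
  banner.querySelector("#cookies-reject")!.addEventListener("click", close);

  // Closing the banner without accepting also counts as a rejection:
  // no non-essential cookies are set.
  banner.querySelector("#cookies-close")!.addEventListener("click", close);
}

showCookieBanner();
```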

At the beginning of March, the CNIL announced that “compliance with the rules applicable to cookies and other trackers” would be one of its three priorities for 2021, along with cybersecurity and the protection of health data. In a first act to follow that goal, the CNIL will now begin to conduct checks to ensure websites are in compliance with advertising tracker guidelines.

It is expected that companies that did not adjust their cookie and ad tracker usage will face fines commensurate with their level of non-compliance.

EDPB releases new Guidance on Virtual Voice Assistants

31. March 2021

In recent years, Virtual Voice Assistants (VVAs) have enjoyed increasing popularity among technophile consumers. VVAs are integrated into modern smartphones, like Siri on Apple or Google Assistant on Android mobile devices, but can also be found in separate terminal devices, like Alexa on the Amazon Echo. With smart homes trending, VVAs are finding their way into many homes.

However, in light of their general mode of operation and their specific usage, VVAs potentially have access to a large amount of personal data. They furthermore use new technologies such as machine learning and artificial intelligence in order to improve their services.

As both private households and corporate businesses are increasingly using VVAs and questions on data protection arise, the European Data Protection Board (EDPB) sought to provide guidance to the relevant data controllers. The EDPB therefore published guidance on Virtual Voice Assistants earlier this month.

In its guidance, the EDPB specifically addresses VVA providers and VVA application developers. It encourages them to take data protection considerations into account when designing their VVA services, as laid down by the principle of data protection by design and by default under Art. 25 GDPR. The EDPB suggests, for example, that controllers could fulfil their information obligations pursuant to Art. 13/14 GDPR using voice-based notifications if the VVA works with a screenless terminal device. VVA designers could also enable users to initiate a data subject request through easy-to-follow voice commands.
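As a purely hypothetical sketch of those two suggestions (a spoken privacy notice for screenless devices and voice-initiated data subject requests), a VVA’s intent routing might look like the following; the intent names and handlers are invented for illustration.

```typescript
// Hypothetical intents a VVA might recognize; only illustrative.
type Intent = "first_time_setup" | "delete_my_data" | "export_my_data" | "play_music";

// A short spoken notice standing in for the Art. 13/14 information duties.
function speakPrivacyNotice(speak: (text: string) => void): void {
  speak(
    "I process your voice recordings to answer your requests. " +
    "Say 'privacy details' to hear more, or 'delete my data' at any time."
  );
}

// Route recognized intents; data subject request intents get their own handlers.
function handleIntent(intent: Intent, speak: (text: string) => void): void {
  switch (intent) {
    case "first_time_setup":
      speakPrivacyNotice(speak);
      break;
    case "delete_my_data":
      speak("Okay, I have registered your request to erase your data.");
      // startErasureWorkflow(); // hypothetical back-end call (Art. 17 GDPR)
      break;
    case "export_my_data":
      speak("I will prepare a copy of your data and notify you when it is ready.");
      // startAccessWorkflow(); // hypothetical back-end call (Art. 15/20 GDPR)
      break;
    default:
      speak("Playing music.");
  }
}

// Usage with a stand-in text-to-speech function.
handleIntent("delete_my_data", (text) => console.log("VVA says:", text));
```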

Moreover, the EDPB states that, in its opinion, providing VVA services will require a Data Protection Impact Assessment according to Art. 35 GDPR. The guidance also gives further advice on complying with general data protection principles and is open for public consultation until April 23rd, 2021.
