Tag: EU

EDPS considers Privacy Shield replacement unlikely for a while

18. December 2020

The data transfer agreements between the EU and the USA, namely Safe Harbor and its successor Privacy Shield, have suffered a hard fate for years. Both were declared invalid by the European Court of Justice (CJEU) in proceedings initiated by the Austrian lawyer and privacy activist Max Schrems against Facebook. In both cases, the court concluded that the agreements did not guarantee data protection standards equivalent to those in the EU and thus violated Europeans’ fundamental rights, because US surveillance laws enable data transfers to US law enforcement agencies.

The judgement marking the end of the EU-US Privacy Shield (“Schrems II”) has a huge impact on EU companies doing business with the USA, which are now expected to rely on Standard Contractual Clauses (SCCs). However, the CJEU tightened the requirements for the SCCs: when using them in the future, companies have to determine whether there is an adequate level of data protection in the third country. In particular cases, they may therefore need to take additional measures to ensure a level of protection that is essentially equivalent to that in the EU.

Despite this, companies were hoping for a new transatlantic data transfer pact. However, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski expressed doubts about an agreement being reached in the near future:

I don’t expect a new solution instead of Privacy Shield in the space of weeks, and probably not even months, and so we have to be ready that the system without a Privacy Shield like solution will last for a while.

He attributed his skepticism to the incoming Biden administration, which may have priorities other than amending American national security laws. An agreement on a new data transfer mechanism would, however, depend on reconciling US national security laws with EU fundamental rights.

With that in mind, the EU is not remaining inactive and is trying to devise different ways to maintain its data transfers with the rest of the world. In this regard, the EDPS welcomed the European Commission’s proposed revisions to the SCCs, which take into account the provisions laid down in the CJEU’s “Schrems II” judgement:

The proposed Standard Contractual Clauses look very promising and they are already introducing many thoughts given by the data protection authorities.

Update: The Council of the European Union publishes recommendations on encryption

8. December 2020

In November, the Austrian broadcasting network “Österreichischer Rundfunk” sparked a controversial discussion by publishing leaked drafts of the Council of the European Union (“EU Council”) on encryption (please see our blog post). After these drafts had been criticized by several politicians, journalists and NGOs, the EU Council published its “Recommendations for a way forward on the topic of encryption” on December 1st, in which it considers it important to carefully balance the protection of fundamental rights with ensuring law enforcement’s investigative powers.

The EU Council sees a dilemma between the need for strong encryption to protect privacy on the one hand, and the misuse of encryption by criminal actors such as terrorists and organized crime groups on the other. It further notes:

“We acknowledge this dilemma and are determined to find ways that will not compromise either one, upholding the principle of security through encryption and security despite encryption.”

The paper lists several intentions that are supposed to help find solutions to this dilemma.

First, it directly addresses EU institutions, agencies, and member states, asking them to coordinate their efforts in developing technical, legal and operational solutions. Part of this cooperation is supposed to be the joint implementation of standardized, high-quality training programs for law enforcement officers, tailored to an increasingly skilled criminal environment. International cooperation, particularly with the initiators of the “International Statement: End-to-End Encryption and Public Safety”, is named as a further intention.

Next the technology industry, civil society and academic world are acknowledged as important partners with whom EU institutions shall establish a permanent dialogue. The recommendations address internet service providers and social media platforms directly, noting that only with their involvement can the full potential of technical expertise be realized. Europol’s EU Innovation Hub and national research and development teams are named key EU institutions for maintaining this dialogue.

The EU Council concludes that the continuous development of encryption requires regular evaluation and review of technical, operational, and legal solutions.

These recommendations can be seen as a direct response to the discussion that arose in November. The EU Council is attempting to appease critics by emphasizing the value of encryption, while still reiterating the importance of law enforcement efficiency. It remains to be seen how willing the private sector will be to cooperate with the EU institutions and what measures exactly the EU Council intends to implement. The list of intentions lacks clear guidelines, recommendations or even a clearly formulated goal; instead, the parties are asked to work together to find solutions that offer the highest level of security while maximizing law enforcement efficiency. In summary, these “recommendations” are more a statement of intent than implementable recommendations on encryption.

The Controversy around the Council of the European Union’s Declaration on End-to-End Encryption

27. November 2020

In the course of November 2020, the Council of the European Union issued several draft versions of a joint declaration with the working title “Security through encryption and security despite encryption”. The drafts were initially intended only for internal purposes, but were leaked and first published by the Austrian broadcasting network “Österreichischer Rundfunk” (“ORF”) in an article by journalist Erich Möchel. Since then, the matter has sparked widespread public interest and media attention.

The controversy around the declaration arose when the ORF commentator Möchel presented further information from unknown sources, according to which “competent authorities” shall be given “exceptional access” to end-to-end encrypted communications. This would mean that communications service providers like WhatsApp, Signal etc. would be obliged to build in a backdoor and create a general key to encrypted communications, which they would deposit with public authorities. Comparing the version of the declaration from 6 November 2020 with the previous version from 21 October 2020, he highlighted that the previous version states that additional practical powers shall be given to “law enforcement and judicial authorities”, whereas in the more recent version the powers shall be given to “competent authorities in the area of security and criminal justice”. He adds that the new, broader wording would also include European intelligence agencies and allow them to undermine end-to-end encryption. Furthermore, he indicated that plans to restrict end-to-end encryption in Western countries are not new, but were originally proposed by the “Five Eyes” intelligence alliance of the United States, Canada, the United Kingdom, Australia and New Zealand.

As a result of the ORF article, the supposed plans to restrict or ban end-to-end encryption have been widely criticised by politicians, journalists and NGOs, who argue that any backdoor to end-to-end encryption would render secure encryption impossible.

However, while it can be verified that the “Five Eyes” propose the creation of general keys to access end-to-end encrypted communications, similar plans for the EU cannot be clearly deduced from the EU Council’s declaration at hand. The declaration itself recognises end-to-end encryption as highly beneficial for protecting governments, critical infrastructures, civil society, citizens and industry by ensuring the privacy, confidentiality and integrity of communications and personal data. Moreover, it mentions that EU data protection authorities have identified it as an important tool in light of the Schrems II decision of the CJEU. At the same time, the Council’s declaration illustrates that end-to-end encryption poses large challenges for criminal investigations when gathering evidence in cases of cyber crime, making it at times “practically impossible”. Lastly, the Council calls for an open, unbiased and active discussion with the tech industry, research and academia in order to achieve a better balance between “security through encryption and security despite encryption”.

Möchel’s sources for EU plans to ban end-to-end encryption through general keys remain unknown and unverifiable. Despite general concerns about far-reaching government surveillance powers, the public can only approach the controversy around the EU Council’s declaration with due objectivity and remain observant as to whether, and how, the EU will regulate end-to-end encryption and find the right balance between the privacy rights of European citizens and the public security and criminal justice interests of governments.

EU Commission proposes “Data Governance Act”

The European Commission (“EC”) aims for an ecosystem of cheap, versatile, and secure EU-internal data transfers, so that data transfers to non-EU regions are needed less. To this end, the EC proposed the “Data Governance Act” on November 25th as part of its “2020 European strategy for data”. The strategy is intended to open up new ways of sharing data that is collected by companies and the public sector, or freely shared by individuals, while increasing public trust in data sharing through several measures, such as establishing “data sharing intermediaries”. Combined with the Gaia-X project and several measures to follow, the Data Governance Act lays the basis for a domestic data market that offers businesses more efficient data transfers while also ensuring that GDPR standards are preserved. Key industries in the focus of this agenda are the agricultural, environmental, energy, finance, healthcare and mobility sectors as well as public administration.

During her speech presenting the Data Governance Act, Margrethe Vestager, Executive Vice President of the European Commission for A Europe Fit for the Digital Age, said that huge amounts of data are produced every day but not put to any productive use. As examples, she named road traffic data from GPS, healthcare data that enables better and faster diagnoses, and data tracking heat usage from house sensors. The amount of data produced is only going to increase exponentially in the years to come. Vestager sees a lot of potential in this unused data and states that the industry has an interest in using it, but lacks the tools to harness it.

EU-based neutral data sharing intermediaries, who serve as organizers of safe data sharing, are a key factor in this project. Their role is supposed to boost the willingness to share personal data whilst preserving the initial owner’s control. Intermediaries are therefore not allowed to use the data for themselves, but function as neutral third parties, transferring data between the data holder and the data user. Furthermore, intermediaries are to organize and combine different data in a neutral way, so that no company secrets can be abused and the data is only used for the agreed purpose. Before they start operating, intermediaries are required to notify the competent authority of their intention to provide data-sharing services.

New laws are going to ensure that sensitive and confidential data – such as data covered by intellectual property rights – can be shared and reused while a legitimate level of protection is maintained. The same applies to data shared by individuals voluntarily, who will be able to share personal data in so-called “personal data spaces”. Once businesses gain access to these, they can benefit from large amounts of data at low cost, with little effort and on short notice. Vestager introduced the example of an individual suffering from a rare illness, who could provide their medical test data to such a personal data space so that businesses can use this data to work on treatments. Further examples are improvements in the management of climate change and the development of more precise farming tools.

To ensure the trust of potential participants, each EU member state is supposed to establish new competent authorities tasked with implementing and enforcing the Data Governance Act. A new EU institution, the “European Data Innovation Board”, will be established and tasked with informing the EC about new data innovations and working out guidelines on how to put these innovations into practice.

A more fluid exchange between different kinds of technical expertise is the hoped-for outcome of these changes, as a means to diminish the influence of big tech companies from the U.S. and China.

The Data Governance Act now needs to go through the regular legislative process. A timetable for when it is supposed to come into effect has not yet been set.

EDPB issues guidance on data transfers following Schrems II

17. November 2020

Following the recent judgment C-311/18 (Schrems II) by the Court of Justice of the European Union (CJEU), the European Data Protection Board (EDPB) published “Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data” on November 11th. These measures are to be considered when assessing the transfer of personal data to countries outside of the European Economic Area (EEA), so-called third countries. The recommendations are subject to public consultation until the end of November. Complementing them, the EDPB published “Recommendations on the European Essential Guarantees for surveillance measures”. Taken together, both recommendations are guidelines for assessing whether measures suffice to meet the standards of the General Data Protection Regulation (GDPR), even if data is transferred to a country lacking protection comparable to that of the GDPR.

The EDPB sets out a six-step plan to follow when checking whether a data transfer to a third country meets the standards set forth by the GDPR.

The first step is to map all transfers of personal data undertaken, especially transfers into a third country. The transferred data must be adequate, relevant and limited to what is necessary in relation to the purpose. A major factor to consider is the storage of data in clouds. Furthermore, onward transfers made by processors should be included. In a second step, the transfer tool used needs to be verified and matched to those listed in Chapter V of the GDPR. The third step is assessing whether anything in the law or practice of the third country can impinge on the effectiveness of the safeguards of the transfer tool. The aforementioned Recommendations on the European Essential Guarantees are supposed to help evaluate a third country’s laws regarding the access of data by public authorities for the purpose of surveillance.

If the conclusion that follows from the previous steps is that the third country’s legislation impinges on the effectiveness of the Article 46 GDPR tool, the fourth step is identifying and adopting supplementary measures necessary to bring the level of protection of the data transfer up to EU standards, or at least an equivalent. Recommendations for such measures are listed in Annex 2 of the EDPB Schrems II Recommendations. They may be of contractual, technical, or organizational nature. In Annex 2 the EDPB describes and evaluates seven technical scenarios. Five were deemed to be scenarios for which effective measures could be found. These are:

1. Data storage in a third country that does not require access to the data in the clear.
2. Transfer of pseudonymized data.
3. Encrypted data merely transiting third countries.
4. Transfer of data to recipients specially protected by law.
5. Split or multi-party processing.
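As a purely illustrative sketch of scenario 2 (not taken from the EDPB text), pseudonymization could be implemented as a keyed hash of direct identifiers, with the secret key remaining with the EU data exporter; the key value and record fields below are hypothetical:

```python
import hmac
import hashlib

# Illustrative only: the secret key must stay with the EU data exporter and
# never be transferred to the third country, otherwise the data importer
# could re-identify the data subjects.
SECRET_KEY = b"example-key-held-only-by-the-EU-exporter"  # hypothetical

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable HMAC-SHA256 pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# A record prepared for transfer: the direct identifier (email) is replaced
# by its pseudonym, while non-identifying attributes are kept as-is.
record = {"email": "alice@example.com", "visits": 12}
transfer_record = {
    "user_pseudonym": pseudonymize(record["email"]),
    "visits": record["visits"],
}
```

Because the keyed hash is deterministic, the exporter can still link records belonging to the same person across transfers, while the importer, lacking the key, cannot reverse the pseudonyms. Whether such a measure is effective in a concrete case still depends on the additional conditions the EDPB attaches to this scenario.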

Maybe even more relevant are the two scenarios for which the EDPB found no effective measures and which it therefore deemed not to be compliant with GDPR standards:

6. Transfer of data in the clear (to cloud services or other processors).
7. Remote access (from third countries) to data in the clear for business purposes, such as human resources.

These two scenarios are frequently used in practice. Still, the EDPB recommends not to carry out these transfers for the time being.

Examples of contractual measures are the obligation to implement necessary technical measures, measures regarding transparency of (requested) access by government authorities, and measures to be taken against such requests; examples of organizational measures are internal policies and responsibilities regarding government interventions. Alongside this, the European Commission published a draft of standard contractual clauses for transferring personal data to non-EU countries.

The last two steps are undertaking the formal procedural steps required to adopt the supplementary measures and re-evaluating the former steps at appropriate intervals.

Even though these recommendations are not (yet) binding, companies should take a closer look at them and check whether their data transfers comply with the new situation.

Irish DPC updates Guidance on Data Processing’s Legal Bases

17. December 2019

The Irish Data Protection Commission (DPC) has updated its guidance on the legal bases for personal data processing. It focuses on data processing under the European General Data Protection Regulation (GDPR) as well as data processing requirements under the European Law Enforcement Directive.

The main aims of the updated guidance are to make companies more aware of their reasons for processing personal data and of the need to choose the right legal basis, as well as to ensure that data subjects are able to determine whether their data is being processed lawfully.

The guidance focuses on the different legal bases in Art. 6 GDPR, namely consent, contract, legal obligation, vital interests, public task and legitimate interests. The Irish DPC states that controllers not only have to choose the right legal basis, but also have to understand the obligations that come with the chosen one, which is why it wanted to go into further detail.

Overall, the guidance is made to aid both controllers and data subjects. It is meant to support a better understanding of the terminology as well as of the legal requirements the GDPR sets out for processing personal data.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition has vast political as well as societal implications and risks. In its report, the CNIL explicitly stated that the software can yield very biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals being recorded.

To minimize the chance of unlawful use of the technology, the CNIL set out three main requirements in its report. It recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability, and if no less intrusive methods are applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency of the use and purpose, etc.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines around the future use of facial recognition, it has also made clear that it will fulfill its role by providing support on issues that may arise, giving counsel on the legal and methodological use of facial recognition in an experimental stage.


European Commission releases third annual Privacy Shield Review report

25. October 2019

The European Commission has released a report on the E.U.-U.S. Privacy Shield, which represents the third annual report on the performance of the supranational Agreement, after it came into effect in July 2016. The discussions on the review were launched on 12 September 2019 by Commissioner for Justice, Consumers and Gender Equality Věra Jourová, with the U.S. Secretary of Commerce Wilbur Ross in Washington, DC.

The Privacy Shield protects the fundamental rights of anyone in the European Union whose personal data is transferred to certified companies in the United States for commercial purposes and brings legal clarity for businesses relying on transatlantic data transfers. The European Commission is committed to reviewing the Agreement on an annual basis to ensure that the level of protection certified under the Privacy Shield remains adequate.

This year’s report confirms the continued adequacy of the protection for personal data transferred from the European Union to certified companies in the U.S. under the Privacy Shield. Since the Framework was implemented, about 5,000 companies have registered with the Privacy Shield. The EU Commissioner for Justice, Consumers and Gender Equality stated that “the Privacy Shield has become a success story. The annual review is an important health check for its functioning“.

The improvements compared to the last annual review in 2018 include the U.S. Department of Commerce’s efforts to ensure the necessary oversight in a systematic manner, carried out through monthly checks with sample companies that are certified under the Privacy Shield. Furthermore, an increasing number of European citizens are making use of their rights under the Framework, and the resulting response mechanisms are functioning well.

The European Commission’s biggest criticism came in the form of a recommendation of firm steps to improve the (re)certification process under the Privacy Shield. The current timeline allows companies to get recertified within three months after their certification has expired, which can lead to a lack of transparency and confusion, since those companies will still be listed in the registry during that period. The European Commission has proposed a shorter time frame to guarantee a higher level of security.

Overall, the third annual review has been seen as a success in the cooperation between the two sides, and both the U.S. and the European officials agree that there is a need for strong and credible enforcement of privacy rules to protect the respective citizens and ensure trust in the digital economy.

German data protection authorities develop fining concept under GDPR

24. October 2019

In a press release, the German Conference of Data Protection Authorities (Datenschutzkonferenz, “DSK”) announced that it is currently developing a concept for the setting of fines in the event of breaches of the GDPR by companies. The goal is to guarantee a systematic, transparent and comprehensible fine calculation.

The DSK clarifies that this concept has not yet been adopted, but is still at the draft stage and will be worked on further. At present, it is being applied alongside current fine proceedings in order to test its practical suitability and accuracy. The concrete decisions are nevertheless based on Art. 83 GDPR.

Art. 70 Para. 1 lit. k GDPR calls for a harmonization of the setting of fines within Europe, for which guidelines are to be drawn up. For this reason, the DSK draft will be brought into line with the concepts of other EU member states.

A European concept is also currently being negotiated at the European level, which should then be laid down in a guideline, at least in principle. The DSK has contributed its considerations on the assessment as well.

The fine concept will be discussed further on 6th and 7th November. After prior examination, a decision will be taken on whether the concept on the setting of fines shall be published.


USA and UK sign Cross Border Data Access Agreement for Criminal Electronic Data

10. October 2019

The United States and the United Kingdom have entered into a first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries’ law enforcement authorities to demand authorized access to electronic data relating to serious crime. The respective authorities are permitted to ask tech companies based in the other country for electronic data directly and without legal barriers.

At the base of this bilateral Agreement stands the U.S. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018. It aims to improve the procedures by which U.S. and foreign investigators obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current process for access may take up to two years. The Data Access Agreement can reduce that time considerably by allowing more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.

The CLOUD Act focuses on updating legal frameworks to respond to the growing technology in electronic communications and service systems. It further enables the U.S. and other countries to enter into mutual executive Agreements in order to use their own legal authorities to access electronic evidence in the respective other country. An Agreement of this form can only be signed by rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws have robust substantive and procedural protections for privacy and civil liberties.

The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with data protection laws in both respective countries.

In addition to the Agreement with the United Kingdom, talks were held between the United States and Australia on Monday, with reports of negotiations for such an Agreement between the two countries. Negotiations have also been held between the U.S. and the European Commission, representing the European Union, with regard to a Data Access Agreement.
