
CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition carries vast political as well as societal risks. In its report, the CNIL explicitly stated that the software can yield very biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals being recorded.

To minimize the chances of an unlawful use of the technology, the CNIL set out three main requirements in its report. It recommends that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. That extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on the way the software is used.

While the CNIL wishes to draw red lines for the future use of facial recognition, it has also made clear that it will fulfill its role by offering support on issues that may arise, giving counsel on the legal and methodological aspects of using facial recognition in an experimental stage.

Category: EU · French DPA · GDPR · General

European Commission releases third annual Privacy Shield Review report

25. October 2019

The European Commission has released a report on the E.U.-U.S. Privacy Shield, which represents the third annual report on the performance of the supranational Agreement, after it came into effect in July 2016. The discussions on the review were launched on 12 September 2019 by Commissioner for Justice, Consumers and Gender Equality Věra Jourová, with the U.S. Secretary of Commerce Wilbur Ross in Washington, DC.

The Privacy Shield protects the fundamental rights of anyone in the European Union whose personal data is transferred to certified companies in the United States for commercial purposes and brings legal clarity for businesses relying on transatlantic data transfers. The European Commission is committed to reviewing the Agreement on an annual basis to ensure that the level of protection certified under the Privacy Shield remains adequate.

This year’s report confirms the continued adequacy of the protection for personal data transferred from the European Union to certified companies in the U.S. under the Privacy Shield. Since the Framework was implemented, about 5,000 companies have registered with the Privacy Shield. The EU Commissioner for Justice, Consumers and Gender Equality stated that “the Privacy Shield has become a success story. The annual review is an important health check for its functioning”.

The improvements compared to the last annual review in 2018 include the U.S. Department of Commerce’s efforts to ensure necessary oversight in a systematic manner. This is done through monthly checks with sample companies that are certified under the Privacy Shield. Furthermore, an increasing number of European citizens are making use of their rights under the Framework, and the corresponding redress mechanisms are functioning well.

The European Commission’s biggest criticism came in the form of a recommendation of firm steps to improve the (re)certification process under the Privacy Shield. The current rules allow companies to get recertified within three months after their certification has run out, which can lead to a lack of transparency and confusion, since those companies will still be listed in the registry. The European Commission has proposed a shorter time frame to guarantee a higher level of security.

Overall, the third annual review has been seen as a success in the cooperation between the two sides, and both the U.S. and the European officials agree that there is a need for strong and credible enforcement of privacy rules to protect the respective citizens and ensure trust in the digital economy.

German data protection authorities develop fining concept under GDPR

24. October 2019

In a press release, the German Conference of Data Protection Authorities (Datenschutzkonferenz, “DSK”) announced that it is currently developing a concept for the setting of fines in the event of breaches of the GDPR by companies. The goal is to guarantee a systematic, transparent and comprehensible fine calculation.

The DSK clarifies that this concept has not yet been adopted, but is still in draft stage and will be worked on further. At present it is being applied alongside current fine proceedings in order to test its practical suitability and accuracy. However, the concrete decisions are nevertheless based on Art. 83 GDPR.

Art. 70 para. 1 lit. k GDPR calls for a harmonization of fine setting within Europe, for which guidelines are to be drawn up. For this reason, the DSK draft will be brought into line with the concepts of other EU member states.

A European concept is also currently being negotiated at the European level, which should then be laid down, at least in principle, in a guideline. The DSK has contributed its own considerations on the assessment.

The fine concept will be discussed further on 6th and 7th November. After prior examination, a decision will be taken on whether the concept on the setting of fines shall be published.

Category: Data breach · EU · GDPR

USA and UK sign Cross Border Data Access Agreement for Criminal Electronic Data

10. October 2019

The United States and the United Kingdom have entered into the first-of-its-kind CLOUD Act Data Access Agreement, which will allow both countries’ law enforcement authorities to demand authorized access to electronic data relating to serious crime. In both cases, the respective authorities are permitted to ask tech companies based in the other country for electronic data directly and without legal barriers.

The bilateral Agreement is based on the U.S.A.’s Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which came into effect in March 2018. It aims to improve procedures for U.S. and foreign investigators to obtain electronic information held by service providers in the other country. In light of the growing number of mutual legal assistance requests for electronic data from U.S. service providers, the current process for access may take up to two years. The Data Access Agreement can reduce that time considerably by allowing for more efficient and effective access to the data needed, while protecting the privacy and civil liberties of the data subjects.

The CLOUD Act focuses on updating legal frameworks to respond to evolving technology in electronic communications and service systems. It further enables the U.S. and other countries to enter into mutual executive Agreements in order to use their own legal authorities to access electronic evidence in the respective other country. An Agreement of this form can only be signed by rights-respecting countries, after the U.S. Attorney General has certified to the U.S. Congress that their laws have robust substantive and procedural protections for privacy and civil liberties.

The Agreement between the U.K. and the U.S.A. further assures providers that the requested disclosures are compatible with data protection laws in both respective countries.

In addition to the Agreement with the United Kingdom, the United States and Australia held talks on Monday, reportedly negotiating such an Agreement between the two countries. Negotiations have also been held between the U.S. and the European Commission, representing the European Union, with regard to a Data Access Agreement.

Category: General · UK · USA

CJEU rules pre-checked Cookie consent invalid

2. October 2019

The Court of Justice of the European Union (CJEU) ruled on Tuesday, October 1st, that storing Cookies on internet users’ devices requires active consent. The decision concerns the widespread implementation of pre-checked boxes, which the Court held insufficient to fulfill the requirements of lawful consent under the General Data Protection Regulation (GDPR).

The case to be decided concerned a lottery for advertising purposes initiated by Planet49 GmbH. During the participation process, internet users were confronted with two information texts and corresponding checkboxes. In the first information text, users were asked to agree to be contacted by other companies for promotional offers by ticking the respective checkbox. The second information text required the user to consent to the installation of Cookies on their devices, while the respective checkbox had already been pre-checked. Users would therefore have needed to uncheck the checkbox if they did not agree to give their consent (opt-out).

The Federal Court of Justice in Germany referred questions to the CJEU regarding whether such a process of obtaining consent could be lawful under the relevant EU legislation, in particular whether valid consent could have been obtained for the storage of information and Cookies on users’ devices when such mechanisms are used.

Answering the questions, the CJEU decided, referring to the relevant provisions of Directive 95/46 and the GDPR that require an active behaviour of the user, that pre-ticked boxes cannot constitute a valid consent. Furthermore, in a statement following the decision, the CJEU clarified that consent must be specific, and that users should be informed about the storage period of the Cookies, as well as about third parties accessing users’ information. The Court also said that the “decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data.”

As a consequence of the decision, it is very likely that at least half of all websites that fall into the scope of the GDPR will need to consider adjusting their Cookie Banners and, if applicable, their procedures for obtaining consent with regard to performance-related, marketing and advertising Cookies, in order to comply with the CJEU’s view on how Cookie usage is to be handled under current data protection law.

Cookies, in general, are small files which are sent to and stored in the browser of a terminal device as part of the website user’s visit on a website. In case of performance-related and marketing and advertising Cookies, the website provider can then access the information that such Cookies collected about the user when visiting the website on a further occasion, in order to, e.g., facilitate navigation on the internet or transactions, or to collect information about user behaviour.

Following the new CJEU decision, there are multiple ways to obtain users’ active consent in a GDPR-compliant manner. In any case it is absolutely necessary to give users the possibility of actively checking the boxes themselves. This means that pre-ticked boxes are no longer an option.
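The opt-in logic described above can be sketched in a few lines of code. This is a minimal illustration only: the category names and function names are assumptions for the example, not taken from the ruling or from any real consent-management tool.

```python
# Cookie categories that require explicit opt-in consent under the ruling
# (the category names here are illustrative, not a legal taxonomy).
NON_ESSENTIAL = {"performance", "marketing", "advertising"}

def default_consent():
    # Pre-ticked boxes are invalid, so every non-essential
    # category must start unchecked (False) by default.
    return {category: False for category in NON_ESSENTIAL}

def may_set_cookie(category, consent):
    # Strictly necessary cookies need no consent;
    # every other category requires an explicit opt-in.
    if category not in NON_ESSENTIAL:
        return True
    return consent.get(category, False)
```

With this default, a marketing Cookie may only be set after the user has actively flipped the corresponding flag to True, mirroring the active-behaviour requirement in the judgment.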

With regard to the website controller’s obligation to provide the user with particular information about the storage period and third-party access, one possible way would be to include a passage with Cookie information in the website’s Privacy Policy. Another would be to include all the necessary information under a separate tab on the website containing a Cookie Policy. In either case, this information needs to be easily accessible to the user prior to giving consent, either by including the information directly within the Cookie Banner or by providing a link therein.

As there are various options depending on the types of Cookies used, and in light of the clarification made by the CJEU, it is recommended to review the Cookie activities on websites and the corresponding procedures for informing users about those activities and obtaining consent via the Cookie Banner.

CJEU rules that Right To Be Forgotten is only applicable in Europe

27. September 2019

In a landmark case on Tuesday, the Court of Justice of the European Union (CJEU) ruled that Google will not have to apply the General Data Protection Regulation’s (GDPR) “Right to be Forgotten” to its search engines outside of the European Union. The ruling is a victory for Google in a case against a fine imposed by the French Commission nationale de l’informatique et des libertés (CNIL) in 2015 in an effort to force the company and other search engines to take down links globally.

Seeing as the internet has grown into a worldwide medium without borders, this case is viewed as a test of whether people can demand a blanket removal of information about themselves from searches without overriding the principles of free speech and public interest. Around the world, it has also been perceived as a trial of whether the European Union can extend its laws beyond its own borders.

“The balance between right to privacy and protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world,” the court stated in its decision. The Court also expressed in the judgement that the protection of personal data is not an absolute right.

While this means companies will not be forced to delete sensitive information from their search engines outside the EU upon request, they must take precautions to seriously discourage internet users from going onto non-EU versions of their pages. Furthermore, companies with search engines within the EU will have to closely weigh freedom of speech against the protection of privacy, keeping the currently common case-by-case basis for deletion requests.

Since the Right to be Forgotten was first established by the CJEU in 2014, Google has received over 3,3 million deletion requests. In 45% of the cases it has complied with the delisting of links from its search engine. As it stands, even when deletion requests are complied with, the links delisted from EU search engines can still be accessed by using a VPN to reach non-EU search engines, circumventing the geoblocking. This is an issue to which a solution has not yet been found.

Greek Parliament passes bill to adopt GDPR into National Law

29. August 2019

On Monday, August 26th, the Greek Parliament passed a bill that incorporates the European Union’s General Data Protection Regulation (GDPR) into national law. Originally, the adaptation of the EU regulation was supposed to take place by May 06, 2018, but Greece failed to comply with the deadline.

The now fast-paced implementation of the regulation may have come as a result of the European Commission (EC) referring Greece and Spain to the European Court of Justice on July 25th. Since they had failed to adopt the GDPR into national law up until then, Greece could have faced a fine of €5,287.50 for every day passed since May 06, in addition to a stiff fine of €1.3 million. In its statement, the EC declared that “the lack of transposition by Spain and Greece creates a different level of protection of peoples’ rights and freedoms, and hampers data exchanges between Greece and Spain on one side and other Member States, who transposed the Directive, on the other side”.
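As a rough illustration of the figures above, the potential penalty can be combined in a short calculation. The choice of end date (the day the bill passed) and the assumption that the daily fine would have run continuously from the missed deadline are purely illustrative, not part of the Commission’s statement.

```python
from datetime import date

DAILY_FINE_EUR = 5_287.50   # daily penalty cited in the referral
LUMP_SUM_EUR = 1_300_000    # additional lump-sum fine cited in the referral

def accrued_fine(deadline, as_of):
    """Potential penalty if the daily fine ran from the missed deadline to a given date."""
    days_late = (as_of - deadline).days
    return days_late * DAILY_FINE_EUR + LUMP_SUM_EUR

# Illustrative only: days from the missed deadline to the bill's passage.
print(accrued_fine(date(2018, 5, 6), date(2019, 8, 26)))
```

Over the 477 days between the deadline and the bill’s passage, the daily penalty alone would have exceeded €2.5 million before the lump sum is added, which helps explain the sudden urgency.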

The EU countries are allowed to adopt certain derogations, exceptions and specifications under the GDPR. Greece has done so in the approved bill, with adjusted provisions regarding the age of consent, the process of appointing a Data Protection Officer, sensitive data processing, data repurposing, data deletion, certifications and criminal sanctions.

The legislation was approved by New Democracy, the main opposition SYRIZA, the center-left Movement for Change and leftist MeRA25, with an overwhelming majority. The GDPR has already been in effect since May 25th, 2018, with its main aim being to offer more control to individuals over their personal data that they provide to companies and services.


Category: EU · EU Commission · GDPR · General

Hackers steal millions of Bulgarians’ financial data

18. July 2019

After a cyberattack on Bulgaria’s tax agency (NRA), millions of taxpayers’ financial data has been stolen. It is estimated that most working adults in the country of 7 million are affected by some of their data being compromised. The stolen data included names, addresses, income and social security information.

The attack happened in June, but an e-mail from the self-proclaimed perpetrator was sent to Bulgarian media on Monday. It stated that more than 110 databases of the agency had been compromised, the hacker calling the NRA’s cybersecurity a parody. The Bulgarian media were further offered access to the stolen data. One stolen file, e-mailed to the newspaper 24 Chasa, contained up to 1,1 million personal identification numbers with income, social security and healthcare figures.

The country’s finance minister Vladislav Goranov has apologized in parliament and to the Bulgarian citizens, adding that about 3% of the tax agency’s database had been affected. He made clear that whoever attempted to exploit the stolen data would be prosecuted under Bulgarian law.

As a result of this hacking attack, the Bulgarian tax agency now faces a fine of up to 20 million euros from the Commission for Personal Data Protection (CPDP). In addition, the issue has reignited an old debate about the lax cybersecurity standards in Bulgaria and the need to bring them up to date.

Google data breach notification sent to IDPC

Google may face further investigations under the General Data Protection Regulation (GDPR), after unauthorized audio recordings were forwarded to subcontractors. The Irish Data Protection Commission (IDPC) has confirmed through a spokesperson that it received a data breach notification concerning the issue last week.

The recordings were exposed by the Belgian broadcaster VRT and are said to comprise 1000 clips of conversations from Belgium and the Netherlands. Logged by Google Assistant, the recordings were then sent to Google’s subcontractors for review. At least 153 of those recordings were not authorized by Google’s wake phrase “Ok/Hey, Google,” and were never meant to be recorded in the first place. They contained personal data ranging from family conversations and bedroom chatter to business calls with confidential information.

Google has addressed this violation of their data security policies in a blog post. It said that the audio recordings were sent to experts, who understand nuances and accents, in order to refine Home’s linguistic abilities, which is a critical part in the process of building speech technology. Google stresses that the storing of recorded data on its services is turned off by default, and only sends audio data to Google once its wake phrase is said. The recordings in question were most likely initiated by the users saying a phrase that sounded similar to “Ok/Hey, Google,” therefore confusing Google Assistant and turning it on.

According to Google’s statement, its Security and Privacy teams are working on the issue and will fully review its safeguards to prevent this sort of misconduct from happening again. If, however, subsequent investigations by the IDPC discover a GDPR violation, it could result in a significant financial penalty for the tech giant.

Record fine by ICO for British Airways data breach

11. July 2019

After a data breach in 2018, which affected 500,000 customers, British Airways (BA) has now been fined a record £183m by the UK’s Information Commissioner’s Office (ICO). According to the BBC, Alex Cruz, chairman and CEO of British Airways, said he was “surprised and disappointed” by the ICO’s initial findings.

The breach was carried out by a hacking attack that managed to place a script on the BA website. Unsuspecting users trying to access the BA website were diverted to a false website, which collected their information, including e-mail addresses, names and credit card details. While BA stated that it would reimburse every customer affected, its owner IAG declared through its chief executive that it would take “all appropriate steps to defend the airline’s position”.

The ICO said that it was the biggest penalty it had ever handed out and made public under the new rules of the GDPR. “When an organization fails to protect personal data from loss, damage or theft, it is more than an inconvenience,” ICO Commissioner Elizabeth Denham told the press.

The GDPR allows companies to be fined up to 4% of their annual turnover for data protection infringements. By comparison, the £183m fine British Airways received equals 1,5% of its worldwide turnover for the year 2017, below the possible maximum of 4%.
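The relationship between the fine and the 4% ceiling can be made concrete with a quick back-of-the-envelope calculation based solely on the figures reported above; the implied turnover is an inference from those figures, not a number stated by the ICO.

```python
FINE_GBP = 183_000_000      # fine reported by the ICO
FINE_SHARE = 0.015          # 1,5% of 2017 worldwide turnover, per the report
GDPR_MAX_SHARE = 0.04       # GDPR ceiling of 4% of annual turnover

# Implied worldwide turnover, and the theoretical maximum fine under the 4% cap.
implied_turnover = FINE_GBP / FINE_SHARE
max_fine = implied_turnover * GDPR_MAX_SHARE
print(round(implied_turnover), round(max_fine))
```

On these figures, BA’s 2017 worldwide turnover works out to roughly £12.2bn, so a maximum fine under the 4% cap could have been in the region of £488m, well above the £183m actually imposed.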

BA can still appeal the findings and the scale of the fine before the ICO’s final decision is made.
