
The Government of India plans one of the largest Facial Recognition Systems in the World

14. February 2020

The Indian Government has released a Request for Proposal to procure a national Automated Facial Recognition System (AFRS). Bidding companies had until the end of January 2020 to submit their proposals. The plans for an AFRS in India are a new political development, coming at a time when Parliament intends to pass the country’s first national Data Protection Bill.

The new system is intended to centrally integrate the image databases of public authorities and to incorporate photographs from newspapers, raids, mugshots and sketches. Recordings from surveillance cameras and from public or private video feeds would then be matched against the centralised databases to help identify criminals, missing persons and dead bodies.
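To illustrate the kind of matching step such a system typically involves, the following is a minimal, hypothetical Python sketch: a probe image from a video feed is reduced to an embedding vector and compared against a centralised gallery of known faces. All names, the embedding size and the similarity threshold are illustrative assumptions; the actual AFRS design and algorithms are not described in the Request for Proposal.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two face-embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_probe(probe, gallery, threshold=0.6):
    # Return gallery identities whose embeddings are similar enough to the probe.
    candidates = [(identity, cosine_similarity(probe, emb)) for identity, emb in gallery.items()]
    # Highest-scoring candidates first; in practice a human review step would follow.
    return sorted([c for c in candidates if c[1] >= threshold], key=lambda c: c[1], reverse=True)

# Hypothetical usage: a probe embedding from a CCTV frame against a small gallery.
gallery = {"person_A": np.random.rand(128), "person_B": np.random.rand(128)}
probe = np.random.rand(128)
print(match_probe(probe, gallery))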

Human rights and privacy groups point to various risks that may come with implementing a nationwide AFRS in India, including violations of privacy, arbitrariness, misidentifications, discriminatory profiling, a lack of technical safeguards, and even the creation of an Orwellian, 1984-style dystopia through mass surveillance.

However, many people in India have received the news of the Government’s plans with acceptance and approval. They hope that the AFRS will lead to better law enforcement and more security in their everyday lives, as India has a comparatively high crime rate and only 144 police officers per 100,000 citizens, compared to 318 per 100,000 in the EU.

Irish Data Protection Authority investigates Google’s processing of location data

6. February 2020

The Irish Data Protection Commission (DPC), in its role as Lead Supervisory Authority, is the data protection authority responsible for Google within the European Union.

The DPC has started a formal investigation into Google’s practices for tracking its users’ location and into the transparency surrounding that processing.

The investigation was initiated following a number of complaints by several national consumer groups across the EU. The consumer organisations argue that consent to “share” users’ location data was not freely given and that consumers were tricked into accepting privacy-intrusive settings. Such practices would not be compliant with the EU’s data protection law, the GDPR.

The Irish data protection authority will now have to establish whether Google has a valid legal basis for processing the location data of its users and whether it meets its transparency obligations as a data controller.

The investigation adds to the pressure on Google, which is already facing several investigations in Europe. The DPC has previously opened an investigation, still ongoing, into how Google handles data for advertising. If Google is found not to comply with the GDPR, the company could be forced to change its business model.

However, there are still a number of steps before the Irish DPC reaches a decision, including the opportunity for Google to reply.

A short review of the Polish DPA’s enforcement of the GDPR

10. January 2020

To date, the Polish Data Protection Authority (DPA) has issued 134 decisions and imposed GDPR fines in five cases. In four of these cases, the Polish DPA fined private companies; in one case, it fined a public institution.

The fines for the companies ranged from 13,000€ to 645,000€. Reasons for the fines included failures to protect personal data on websites that resulted in unauthorised access to personal data, inadequate technical and organisational measures, and insufficient fulfilment of the information obligations under Art. 14 GDPR.

It is also noteworthy that the Polish DPA has imposed a 9,350€ fine on the mayor of a small Polish town. Under Art. 83 (7) GDPR, each EU member state may lay down rules on whether and to what extent administrative fines may be imposed on public authorities. The Polish legislature decided that non-compliant public authorities may receive a GDPR fine of up to 23,475€.

The mayor received the GDPR fine because he failed to conclude a data processing agreement with the entities to which he transferred data, in violation of Art. 28 (3) GDPR. Moreover, the mayor violated the principles of storage limitation, integrity and confidentiality, and accountability, and kept an incomplete record of processing activities.

Recently, the Polish DPA also published the EU Project T4DATA’s Handbook for Data Protection Officers (DPO) in order to help define a DPO’s role, their competencies and main responsibilities.

More US States are pushing on with new Privacy Legislation

3. January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, and marks a first step in the United States towards regulating data privacy on the Internet. Currently, the US does not have a general federal consumer data privacy law comparable to the privacy laws of EU countries or to the supranational European GDPR.

But now, several other US States have taken inspiration from the CCPA and are in the process of bringing forth their own state legislation on consumer privacy protections on the Internet, including

  • The Massachusetts Data Privacy Law “S-120“,
  • The New York Privacy Act “S5642“,
  • The Hawaii Consumer Privacy Protection Act “SB 418“,
  • The Maryland Online Consumer Protection Act “SB 613“, and
  • The North Dakota Bill “HB 1485“.

Like the CCPA, most of these new privacy laws have a broad definition of the term “Personal Information” and are aimed at protecting consumer data by strengthening consumer rights.

However, the proposals differ in the scope of the consumer rights they grant. All of them give consumers a ‘right to access’ their data held by businesses; most also provide a ‘right to delete’, but only some give consumers a private ‘right of action’ for violations.

There are further differences with regard to the businesses covered by the privacy laws. In some states, the proposed laws will apply to all businesses, while in others they will only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states introduce privacy laws, the possibility of a federal US privacy law in the near future increases. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren’s proposal, Senators Cantwell, Schatz, Klobuchar and Markey’s proposal, and Senator Wicker’s proposal).

Austrian Regional Court grants an Austrian man 800€ in GDPR compensation

20. December 2019

The Austrian Regional Court of Feldkirch (Landesgericht Feldkirch) has ruled that the major Austrian postal service Österreichische Post (ÖPAG) must pay an Austrian man 800 euros in compensation for violating the GDPR (LG Feldkirch, Beschl. v. 07.08.2019 – Az.: 57 Cg 30/19b – 15). It is one of the first rulings in Europe in which a civil court has granted a data subject compensation for a GDPR violation. In parallel to this court ruling, ÖPAG is facing an 18 million euro fine from the Austrian Data Protection Authority.

Based on people’s statements in anonymised surveys, ÖPAG had created marketing groups and used algorithms to calculate the probable political affinities of people with certain socioeconomic and regional backgrounds. ÖPAG then assigned customers to these marketing groups and thereby also stored data about their calculated political affinities. Among these customers was the plaintiff in this case.

The court ruled that this combination of data constitutes “personal data revealing political opinions” within the meaning of Art. 9 GDPR. Since ÖPAG neither obtained the plaintiff’s consent to process this sensitive data on his political opinions nor informed him about the processing, ÖPAG violated the plaintiff’s individual rights.

While the plaintiff had demanded 2,500 euros in compensation from ÖPAG, the court, after weighing the circumstances of the individual case, granted him only 800 euros in non-material damages.

The case was appealed and will be tried at the Higher Regional Court Innsbruck.

Dutch DPA issued a statement regarding cookie consent

12. December 2019

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has recently issued a statement regarding compliance with the rules on cookie consent. According to the statement, the DPA reviewed 175 websites and e-commerce platforms to check whether they meet the requirements for the use of cookies, and found that almost half of the websites and nearly all e-commerce platforms do not meet the requirements for cookie consent.

The data protection authority has contacted the companies concerned and requested them to adjust their cookie usage.

In its statement, the Data Protection Authority also refers to the “Planet49” judgment of the Court of Justice of the European Union (“CJEU”) and clarifies that pre-ticked checkboxes do not satisfy the obligation to obtain the user’s consent. In addition, merely scrolling down a website is not equivalent to consenting to the use of cookies. Cookies that enable websites to track their users always require explicit consent.
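As a minimal sketch of what these rules imply in practice, the following Python snippet models a consent check in which pre-ticked boxes and mere scrolling never count, and tracking cookies are only permitted after an explicit, affirmative opt-in. The data structure and field names are hypothetical and are not taken from the DPA’s statement.

from dataclasses import dataclass

@dataclass
class ConsentRecord:
    box_pre_ticked: bool   # was the checkbox already ticked when it was shown?
    user_action: str       # e.g. "clicked_accept", "scrolled", "none"

def tracking_cookies_allowed(consent: ConsentRecord) -> bool:
    # Only an explicit, affirmative act counts as valid consent.
    if consent.box_pre_ticked:
        return False                                 # pre-ticked boxes are invalid (Planet49)
    return consent.user_action == "clicked_accept"   # scrolling is not consent

print(tracking_cookies_allowed(ConsentRecord(False, "scrolled")))        # False
print(tracking_cookies_allowed(ConsentRecord(False, "clicked_accept")))  # True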

Lastly, the DPA recalls that cookie walls, which prevent users who have not consented to the use of cookies from accessing a website, are not permitted.

Category: EU · GDPR · The Netherlands

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Particularly concerned with the risks of using such technology in the public sector, the CNIL made it clear that the use of facial recognition has vast political as well as societal implications and risks. In its report, the CNIL explicitly stated that the software can yield heavily biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals being recorded.

To minimise the chances of unlawful use of the technology, the CNIL set out three main requirements in its report and recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability, and if no less intrusive methods are applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimise the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines around the future use of facial recognition, it has also made clear that it will fulfil its role by providing support and counsel on the legal and methodological questions arising from the experimental use of facial recognition.

Category: EU · French DPA · GDPR · General

The Netherlands passes new law on the use of passenger data

31. October 2019

In June 2019, the Netherlands adopted a new law concerning the processing and sharing of passenger data by airlines. Since 18 June 2019, airlines have been required to share passenger data with a newly established passenger information unit (‘Pi-NL’) for all flights that depart from or arrive in the Netherlands. The passenger data to be passed on include, for example, nationality, full name, date of birth, and the number and type of travel documents used.

The newly established specialised unit will be independent, with its own statutory task and authorisations, and will collect, process and analyse passenger data and, if necessary, share it with competent authorities such as the police and the Public Prosecution Service, with comparable units in other EU Member States, and with Europol. It falls under the responsibility of the Minister of Justice and Security. The purpose of the data processing is to prevent, detect, investigate and prosecute terrorist offences and serious criminal offences.

This law transposes the European PNR (Passenger Name Record) Directive into Dutch law. The aim of the PNR Directive is to ensure internal security within the European Union and to protect the life and safety of persons. It is also intended to promote more effective cooperation between EU Member States.

In drafting this law, the Dutch government weighed the importance of combating terrorism against the privacy interests of passengers. The newly introduced law therefore also contains a number of data protection safeguards and guarantees, such as a limitation on the retention period, a prohibition on processing special categories of personal data, strict conditions for the exchange of such data with other states, and the requirement that Pi-NL appoint a data protection officer.

European Commission releases third annual Privacy Shield Review report

25. October 2019

The European Commission has released a report on the E.U.-U.S. Privacy Shield, the third annual report on the performance of the agreement since it came into effect in July 2016. The review discussions were launched in Washington, DC on 12 September 2019 by the Commissioner for Justice, Consumers and Gender Equality, Věra Jourová, and the U.S. Secretary of Commerce, Wilbur Ross.

The Privacy Shield protects the fundamental rights of anyone in the European Union whose personal data is transferred to certified companies in the United States for commercial purposes, and it provides legal clarity for businesses relying on transatlantic data transfers. The European Commission is committed to reviewing the agreement annually to ensure that the level of protection certified under the Privacy Shield remains adequate.

This year’s report confirms the continued adequacy of the protection for personal data transferred from the European Union to certified companies in the U.S. under the Privacy Shield. Since the framework was implemented, about 5,000 companies have registered with the Privacy Shield. The EU Commissioner for Justice, Consumers and Gender Equality stated that “the Privacy Shield has become a success story. The annual review is an important health check for its functioning“.

Improvements compared to the last annual review in 2018 include the U.S. Department of Commerce’s efforts to ensure the necessary oversight in a systematic manner, carried out through monthly checks of sample companies certified under the Privacy Shield. Furthermore, an increasing number of European citizens are making use of their rights under the framework, and the corresponding response mechanisms are functioning well.

The European Commission’s main criticism concerns the (re)certification process under the Privacy Shield. Companies are currently allowed up to three months after their certification has lapsed to get recertified, during which time they remain listed in the registry, which can lead to a lack of transparency and confusion. The Commission has recommended a shorter time frame to guarantee a higher level of security.

Overall, the third annual review is seen as a success for the cooperation between the two sides, and both U.S. and European officials agree that there is a need for strong and credible enforcement of privacy rules to protect citizens and ensure trust in the digital economy.

German data protection authorities develop fining concept under GDPR

24. October 2019

In a press release, the German Conference of Data Protection Authorities (Datenschutzkonferenz, “DSK”) announced that it is currently developing a concept for the setting of fines in the event of breaches of the GDPR by companies. The goal is to guarantee a systematic, transparent and comprehensible fine calculation.

The DSK clarifies that the concept has not yet been adopted; it is still in draft form and will be developed further. At present, it is being applied alongside ongoing fine proceedings in order to test its practical suitability and accuracy. The concrete decisions, however, continue to be based on Art. 83 GDPR.

Art. 70 (1) lit. k GDPR calls for the harmonisation of fine setting within Europe, for which guidelines are to be drawn up. For this reason, the DSK draft will be brought into line with the concepts of other EU member states.

At the European level, a common concept is also currently being negotiated, which should eventually be laid down, at least in principle, in a guideline. The DSK has contributed its own considerations to this assessment.

The fining concept will be discussed further on 6 and 7 November. After further examination, a decision will be taken on whether the concept for the setting of fines will be published.

Category: Data breach · EU · GDPR