LGPD – Brazil’s upcoming Data Protection Law

28. November 2019

In August 2018, the National Congress of Brazil passed a new General Data Protection Law (“Lei Geral de Proteção de Dados” or “LGPD”). The law is slated to come into effect in August 2020. Prior to the LGPD, data protection in Brazil was enforced primarily through a patchwork of legal frameworks, including the country’s Civil Rights Framework for the Internet (Internet Act) and the Consumer Protection Code.

The new legislation creates a completely new general framework for the use of personal data of individuals in Brazil, regardless of where the data processor is located. Brazil has also established its own Data Protection Authority to enforce the law. Although the Data Protection Authority will initially be tied to the Presidency of the Federative Republic of Brazil, the DPA is expected to become autonomous in the longer term, in about two years.

Like the GDPR, the new framework has extraterritorial application, which means that the law will apply to any individual or organization, private or public, that processes or collects personal data in Brazil, regardless of where the processor is based. The LGPD does not apply to data processing for strictly personal, academic, artistic or journalistic purposes.

Although the LGPD is largely influenced by the GDPR, the two frameworks also differ in several respects. For instance, they define personal data differently: the LGPD’s definition is broad and covers any information relating to an identified or identifiable natural person. Furthermore, the LGPD does not permit cross-border transfers based on the controller’s legitimate interest. And while the GDPR sets a 72-hour deadline for data breach notifications, the LGPD only loosely defines that deadline, to name just a few differences.

Category: General · Personal Data

Austrian data protection authority imposes 18 million euro fine

22. November 2019

The Austrian Data Protection Authority (DPA) has imposed a fine of 18 million euros on Österreichische Post AG (Austrian Postal Service) for violations of the GDPR.

Among other things, the company had collected data on the “political affinity” of 2.2 million customers, thereby violating the GDPR. This information was intended to enable political parties to send targeted election advertising to Austrian residents.

In addition, the company collected data on the frequency of parcel deliveries and on the likelihood of customers relocating, for use in direct marketing.

The penalty is not yet final. Österreichische Post AG, half of which is owned by the Austrian state, can appeal the decision before the Federal Administrative Court. The company has already announced its intention to take legal action.

CNIL publishes report on facial recognition

21. November 2019

The French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), has released guidelines concerning the experimental use of facial recognition software by French public authorities.

Especially concerned with the risks of using such a technology in the public sector, the CNIL made it clear that the use of facial recognition carries vast political as well as societal implications and risks. In its report, the CNIL explicitly stated that the software can yield strongly biased results, since the algorithms are not 100% reliable and the rate of false positives can vary depending on the gender and ethnicity of the individuals recorded.

To minimize the chances of an unlawful use of the technology, the CNIL set out three main requirements in its report and recommended that public authorities using facial recognition in an experimental phase comply with them in order to keep the risks to a minimum.

The three requirements put forth in the report are as follows:

  • Facial recognition should only be put to experimental use if there is an established need to implement an authentication mechanism with a high level of reliability. Further, there should be no less intrusive methods applicable to the situation.
  • The controller must under all circumstances respect the rights of the individuals being recorded. This extends to the necessity of consent for each device used, data subjects’ control over their own data, information obligations, and transparency about the use and its purpose.
  • The experimental use must follow a precise timeline and be based on a rigorous methodology in order to minimize the risks.

The CNIL also states that it is important to evaluate each use of the technology on a case-by-case basis, as the risks can vary between controllers depending on how the software is used.

While the CNIL wishes to draw red lines for the use of facial recognition in the future, it has also made clear that it will fulfil its role by providing support on issues that may arise, offering counsel on the legal and methodological aspects of using facial recognition at an experimental stage.

Category: EU · French DPA · GDPR · General

Health data transferred to Google, Amazon and Facebook

18. November 2019

Websites specialising in health topics transfer information about their users to Google, Amazon and Facebook, as the Financial Times reports.

The transferred information is obtained through cookies and includes users’ medical symptoms and conditions.

According to the Financial Times report, the transfer takes place without the express consent of the data subjects, contrary to data protection law in the UK. Beyond the legal obligations in the UK, the website operators’ practice of using these cookies also contradicts the legal requirements of the GDPR.

Under the GDPR, the processing of health data falls under Art. 9 GDPR and is subject to a prohibition with reservation of permission, meaning that the processing of health data is forbidden unless the data subject has given explicit consent.

The report is also interesting in light of the CJEU’s cookie judgment (we reported). Based on that judgment, consent must be obtained for the use of each cookie.

Accordingly, the practice of the website operators will (hopefully) change in order to comply with the new case law.
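
For illustration only, the minimal sketch below shows how a website operator might gate non-essential cookies behind explicit, per-purpose consent, in line with the consent requirement described above. All names (ConsentPurpose, recordConsent, setCookieIfConsented) are hypothetical and not taken from any particular consent-management tool; a browser environment is assumed.

```typescript
// Minimal sketch of per-purpose cookie consent gating (hypothetical names).
// A non-essential cookie is only set after the user has explicitly opted in
// to the purpose it serves.

type ConsentPurpose = "analytics" | "advertising";

// In a real site this record would be populated from a consent banner and
// persisted; here it simply starts with every purpose declined.
const consentState: Record<ConsentPurpose, boolean> = {
  analytics: false,
  advertising: false,
};

// Called when the user actively ticks (or unticks) a purpose in the banner.
function recordConsent(purpose: ConsentPurpose, granted: boolean): void {
  consentState[purpose] = granted;
}

// Sets the cookie only if the corresponding purpose has been consented to.
function setCookieIfConsented(
  purpose: ConsentPurpose,
  name: string,
  value: string,
  maxAgeSeconds: number
): boolean {
  if (!consentState[purpose]) {
    // No prior opt-in for this purpose: do not set the cookie at all.
    return false;
  }
  document.cookie =
    `${name}=${encodeURIComponent(value)}; Max-Age=${maxAgeSeconds}; Path=/; Secure; SameSite=Lax`;
  return true;
}

// Usage: the advertising cookie is only written once the user has opted in.
recordConsent("advertising", true);
setCookieIfConsented("advertising", "ad_id", "abc123", 60 * 60 * 24 * 30);
```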

 

Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for the storage of tenants’ personal data that did not provide for the possibility of removing data that was no longer required. Personal data of tenants were stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore still be viewed, even though some of it was years old and no longer served the purpose of its original collection. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, tax, social security and health insurance data, and bank statements.

After the commissioner had urgently recommended changing the archive system at the first inspection in 2017, the company was unable, in March 2019, to demonstrate either a cleansing of its database or legal grounds for the continued storage, more than one and a half years after the first inspection and nine months after the GDPR came into force. Although the company had made preparations to remedy the identified deficiencies, these measures did not result in a lawful state of affairs with regard to the storage of personal data. The imposition of a fine for a violation of Art. 25 (1) GDPR as well as Art. 5 GDPR for the period between May 2018 and March 2019 was therefore compelling.

The starting point for the calculation of fines is, among other things, the worldwide annual turnover of the companies concerned in the preceding financial year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the legally prescribed framework for the assessment of fines for the established data protection violation amounted to approximately 28 million euros.
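
Purely as an illustration of how such a framework follows from the GDPR’s turnover-based caps: assuming, hypothetically, that the relevant cap is the 2 % limit of Art. 83 (4) GDPR and that the 2018 group turnover was roughly 1.4 billion euros (a figure not stated in the press release), the upper bound would work out as follows.

```latex
% Hypothetical illustration: Art. 83 (4) GDPR caps fines at the higher of
% EUR 10 million and 2 % of the worldwide annual turnover of the preceding
% financial year. With an assumed turnover of EUR 1.4 billion:
\[
  \text{cap} = \max\bigl(\text{EUR } 10\,\text{m},\; 0.02 \times \text{EUR } 1{,}400\,\text{m}\bigr)
             = \text{EUR } 28\,\text{m}
\]
```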

For the concrete determination of the amount of the fine, the commissioner applied the statutory criteria, taking into account all aggravating and mitigating aspects. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an inadmissible manner over a long period of time weighed particularly heavily against the company. However, the fact that the company had taken initial measures to remedy the illegal situation and had cooperated well with the supervisory authority in formal terms was taken into account as a mitigating factor. Considering also that no abusive access to the stored data could be proven, a fine in the middle range of the prescribed framework was deemed appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.

The Netherlands passes new law on the use of passenger data

31. October 2019

In June 2019, the Netherlands adopted a new law concerning the processing and sharing of passenger data by airlines. Since 18 June 2019, airlines have been required to share passenger data with a newly established passenger information unit (‘Pi-NL’) for all flights that depart from or arrive in the Netherlands. The passenger data to be passed on include, for example, nationality, full name, date of birth, and the number and type of travel documents used.

The newly established specialised unit will be independent, with its own statutory task and powers, and will collect, process and analyse passenger data and share it, where necessary, with competent authorities such as the police and the Public Prosecution Service, with comparable units in other EU Member States, and with Europol. It falls under the responsibility of the Minister of Justice and Security. The purpose of the data processing is to prevent, detect, investigate and prosecute terrorist offences and serious criminal offences.

This law implements the European PNR (Passenger Name Record) directive in Dutch law. The aim of the PNR directive is to ensure internal security within the European Union and to protect the life and safety of persons. It will also promote more effective cooperation between EU Member States.

In drafting this law, the Dutch government weighed the importance of combating terrorism against the privacy interests of passengers. The new law therefore also contains a number of data protection safeguards and guarantees, such as a limit on the retention period, a prohibition on processing special categories of personal data, strict conditions for the exchange of such data with other states, and the requirement that Pi-NL appoint a data protection officer.

Data Incident at H&M in Germany

28. October 2019

According to a report in the ‘Frankfurter Allgemeine Zeitung’ (FAZ), personal data of H&M employees working in the company’s customer center in Nuremberg were leaked to other H&M employees who should not have had access to this kind of data.

The personal data concerned originate from personnel interviews between employees and managers. The managers stored the personal information, including health data and information on employees’ private lives, in files that should have been accessible only to managers. According to the report, however, other H&M employees besides the managers could also access the files and thus the confidential employee data.

Several hundred employees work at the customer center in Nuremberg. They were informed of the data incident by the H&M board on Wednesday of last week, October 23rd, 2019. The following day, the board announced that everything stored in the files had been deleted and that measures had been taken to ensure data security. Additionally, the data protection officer of H&M in Nuremberg as well as the competent data protection authority were notified of the data incident.

Category: Data breach · GDPR

European Commission releases third annual Privacy Shield Review report

25. October 2019

The European Commission has released a report on the EU-U.S. Privacy Shield, the third annual report on the functioning of the framework since it came into effect in July 2016. The discussions on the review were launched on 12 September 2019 by the Commissioner for Justice, Consumers and Gender Equality, Věra Jourová, with the U.S. Secretary of Commerce, Wilbur Ross, in Washington, DC.

The Privacy Shield protects the fundamental rights of anyone in the European Union whose personal data is transferred to certified companies in the United States for commercial purposes, and it brings legal clarity for businesses relying on transatlantic data transfers. The European Commission is committed to reviewing the arrangement on an annual basis to ensure that the level of protection certified under the Privacy Shield remains adequate.

This year’s report confirms the continued adequacy of the protection for personal data transferred from the European Union to certified companies in the U.S. under the Privacy Shield. Since the Framework was implemented, about 5,000 companies have registered with the Privacy Shield. The EU Commissioner for Justice, Consumers and Gender Equality stated that “the Privacy Shield has become a success story. The annual review is an important health check for its functioning”.

The improvements compared to the last annual review in 2018 include the U.S. Department of Commerce’s efforts to ensure the necessary oversight in a systematic manner. This is done through monthly checks of sample companies certified under the Privacy Shield. Furthermore, an increasing number of European citizens are making use of their rights under the Framework, and the corresponding redress mechanisms are functioning well.

The European Commission’s main criticism took the form of a recommendation of firm steps to improve the (re)certification process under the Privacy Shield. The current timing of the (re)certification process allows companies to be recertified within three months after their certification has expired, which can lead to a lack of transparency and to confusion, since those companies remain listed in the registry during that period. The European Commission has proposed a shorter time frame to guarantee a higher level of security.

Overall, the third annual review has been seen as a success for the cooperation between the two sides, and both U.S. and European officials agree that there is a need for strong and credible enforcement of privacy rules to protect their respective citizens and ensure trust in the digital economy.

German data protection authorities develop fining concept under GDPR

24. October 2019

In a press release, the German Conference of Data Protection Authorities (Datenschutzkonferenz, “DSK”) announced that it is currently developing a concept for the setting of fines in the event of breaches of the GDPR by companies. The goal is to guarantee a systematic, transparent and comprehensible fine calculation.

The DSK clarifies that this concept has not yet been adopted; it is still at the draft stage and will be developed further. At present, it is being applied alongside ongoing fine proceedings in order to test its practical suitability and accuracy. The concrete decisions, however, remain based on Art. 83 GDPR.

Art. 70 (1) lit. k GDPR calls for a harmonisation of fine setting within Europe, for which guidelines are to be drawn up. For this reason, the DSK draft will be brought into line with the concepts of other EU member states.

A European concept is also currently being negotiated at European level, which is then to be laid down in a guideline, at least in its main principles. The DSK has contributed its own considerations to this assessment.

The fining concept will be discussed further on 6th and 7th November. After prior examination, a decision will then be taken on whether the concept for the setting of fines will be published.

Category: Data breach · EU · GDPR

Apple wants to evaluate “Siri”-recordings again

14. October 2019

Apple wants to evaluate Siri recordings again in the future. After it became public that Apple automatically saved the audio recordings of Siri requests and had some of them evaluated by employees of external companies, the company stopped this practice. Although Apple stated that less than 0.2 % of queries were actually evaluated, the system received around 10 billion queries per month (as of 2018).

In the future, audio recordings from the Siri voice assistant will be stored and evaluated again, but this time only after the user has consented. This procedure is being tested with the latest beta versions of Apple’s iOS software for iPhone and iPad.

Apple hopes that many users will agree and thus contribute to the improvement of Siri. A later opt-out is possible at any time, but only for each device individually. In addition, only Apple’s own employees, who are, according to Apple, subject to strict confidentiality obligations, will evaluate the recordings. Recordings generated by an unintentional activation of Siri will be deleted completely.

In addition, a delete function for Siri recordings is to be introduced. Users will then be able to choose in their settings to delete all data recorded by Siri. If this deletion is requested within 24 hours of a Siri request, the respective recordings and transcripts will not be released for evaluation.

However, even if a user does not opt in to the evaluation of their Siri recordings, a computer-generated transcript will continue to be created and kept by Apple for a certain period of time. Although these transcripts are to be anonymized and linked to a random ID, they could still be evaluated, according to Apple.

Category: General