Category: Personal Data

NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study evaluating 189 face recognition algorithms submitted by 99 industry and academic developers. The study focused on how well each algorithm performs two tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database. This is used, for example, when unlocking a smartphone or checking a passport. The second task is “one-to-many” matching, i.e. determining whether the person in a photo matches any photo in a database. This is used, for example, to identify a person of interest.
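To make the two tasks concrete, here is a minimal sketch of how a verification and an identification check might be implemented on top of face embeddings. The cosine-similarity measure, the threshold value and all names are illustrative assumptions, not part of NIST’s evaluation protocol.

```python
import numpy as np

THRESHOLD = 0.6  # illustrative decision threshold, not a NIST value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def one_to_one(probe: np.ndarray, reference: np.ndarray) -> bool:
    """Verification: does the probe photo show the same person as the
    reference photo (e.g. passport check, smartphone unlock)?"""
    return cosine_similarity(probe, reference) >= THRESHOLD


def one_to_many(probe: np.ndarray, gallery: dict) -> str | None:
    """Identification: return the best-matching identity in the gallery
    of known faces, or None if no entry clears the threshold."""
    best_id, best_score = None, THRESHOLD
    for identity, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```

A false positive in the one-to-one case means photos of two different people clear the threshold; in the one-to-many case it means an uninvolved person is returned as the best match, which is why demographic differentials are especially consequential for the second task.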

A special focus of this study was the performance of the individual algorithms with respect to demographic factors. Only a few previous studies had examined demographic effects on one-to-one matching; for one-to-many matching, there had been none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, the Department of Homeland Security and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata indicating the age, gender, and either race or country of birth of each person.

The study found that results ultimately depend on the algorithm at the heart of the system, the application that uses it, and the data it is fed. But the majority of face recognition algorithms exhibit demographic differentials. In one-to-one matching, algorithms more often falsely matched photos of two different people when those people were Asian or African American than when they were white. In algorithms developed in the United States, the same error also occurred for Native Americans. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve consistent face recognition results across demographic groups by using a sufficiently diverse range of training data.

Austrian Regional Court grants an Austrian man 800€ in GDPR compensation

20. December 2019

The Austrian Regional Court, Landesgericht Feldkirch, has ruled that the major Austrian postal service Österreichische Post (ÖPAG) has to pay an Austrian man 800 Euros in compensation for violating the GDPR (LG Feldkirch, Beschl. v. 07.08.2019 – Az.: 57 Cg 30/19b – 15). It is one of the first rulings in Europe in which a civil court granted a data subject compensation based on a GDPR violation. In parallel with this court ruling, ÖPAG is facing an 18 million Euro fine from the Austrian Data Protection Authority.

Based on people’s statements in anonymised surveys, ÖPAG had created marketing groups and used algorithms to calculate the probability that people with certain socioeconomic and regional backgrounds would hold particular political affinities. ÖPAG then assigned customers to these marketing groups and thus also stored data about their calculated political affinities. Among these customers was the plaintiff in this case.
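To illustrate the mechanism at issue, the following sketch shows how group-level survey statistics can be ascribed to a named customer. All group definitions, probabilities and field names are invented for illustration; ÖPAG’s actual model is not public.

```python
# Hypothetical marketing-group model; all group definitions and
# probabilities are invented for illustration.
AFFINITY_BY_GROUP = {
    ("urban", "30-45", "high_income"): {"party_a": 0.55, "party_b": 0.20},
    ("rural", "60+", "mid_income"): {"party_a": 0.15, "party_b": 0.60},
}


def ascribe_affinity(customer: dict) -> dict:
    """Attach group-level probabilities to an individual customer record."""
    group = (customer["region_type"], customer["age_band"], customer["income"])
    return {**customer, "political_affinity": AFFINITY_BY_GROUP.get(group, {})}


record = ascribe_affinity(
    {"name": "Jane Doe", "region_type": "urban",
     "age_band": "30-45", "income": "high_income"}
)
# record now contains stored probabilities of this customer's political
# affinities, derived purely from group membership.
```

The legally decisive step is the last one: once the calculated affinity is stored in a named customer’s record, statistical group data becomes personal data about that individual.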

The court ruled that this combination constitutes “personal data revealing political opinions” within the meaning of Art. 9 GDPR. Since ÖPAG neither obtained the plaintiff’s consent to process his sensitive data on political opinions nor informed him about the processing itself, it violated the plaintiff’s individual rights.

While the plaintiff had demanded 2,500 Euros in compensation from ÖPAG, the court granted him only 800 Euros in compensation for non-material damage after weighing the circumstances of the individual case.

The case was appealed and will now be heard by the Higher Regional Court Innsbruck.

Data Leak of South African IT firm exposes over 1 Million Web Browsing Records

18. December 2019

Security researchers at vpnMentor recently discovered an unsecured and unencrypted database owned by the South African information and communications technology (ICT) company Conor. The breached database consisted of daily activity logs of customers of Internet Service Providers (ISPs) that used web filtering software built by Conor.

The leak exposed the affected users’ entire internet traffic and activity, along with personally identifying information and highly sensitive private data. For two months, it revealed activity logs such as website URLs, IP addresses, index names and MSISDN codes, which identify mobile users on a specific network. The details contained in this breach included highly sensitive web browsing activity such as attempts to visit pornography websites, social media accounts, online storage services including iCloud, and messaging apps such as WhatsApp. In total, more than 890 GB of data and over 1 million records were exposed.
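For illustration, a single activity-log record of the kind described might look as follows. Every field name and value here is invented, not taken from the exposed database.

```python
# Hypothetical activity-log record; all field names and values are invented.
leaked_record = {
    "timestamp": "2019-11-03T21:14:08Z",
    "msisdn": "27600000000",       # identifies the mobile subscriber
    "ip_address": "203.0.113.42",  # documentation-range address
    "index_name": "webfilter-daily-2019.11.03",
    "url": "https://example.com/login",
    "filter_action": "blocked",
}

# Because the MSISDN ties every row to a single subscriber, grouping
# such records reconstructs that person's complete browsing session.
```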

“Because the database gave access to a complete record of each user’s activity in a session, our team was able to view every website they visited – or attempted to visit. We could also identify each user,” the vpnMentor team explained in their statement. “For an ICT and software development company not to protect this data is incredibly negligent. Conor’s lapse in data security could create real-world problems for the people exposed.”

Such an incident could cause Conor significant reputational damage and loss of integrity. In addition, the leak exposed how its filtering system works and ways to circumvent it, which could render the product ineffective against bypass attempts. As a result, Conor may lose business, since clients may no longer feel they can trust the company and the values it claims to stand for.

Germany: Telecommunications provider receives a 9.55 Million Euro GDPR fine

16. December 2019

The German Federal Commissioner for Data Protection and Freedom of Information (BfDI) has imposed a fine of 9.55 Million Euro on the major telecommunication services provider 1&1 Telecom GmbH (1&1). This is the second multimillion Euro fine that the Data Protection Authorities in Germany have imposed. The first fine of this magnitude (14.5 Million Euro) was imposed last month on a real estate company.

According to the BfDI, the reason for the fine was an inadequate authentication procedure in the company’s customer service department: any caller to 1&1’s customer service could obtain extensive personal customer data merely by providing a customer’s name and date of birth. The particular case that brought the matter to the Data Protection Authority’s attention involved a caller who requested the new mobile phone number of an ex-partner.

The BfDI found that this authentication procedure violates Art. 32 GDPR, which obliges companies to take appropriate technical and organisational measures to systematically protect the processing of personal data.

After the BfDI had pointed out the deficient procedure, 1&1 cooperated with the authorities. In a first step, the company changed its two-factor authentication procedure in the customer service department to a three-step authentication procedure. Furthermore, it is working on a new, enhanced authentication system in which each customer will receive a personal service PIN.
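The following sketch contrasts the criticised check with a PIN-based one. The data model, function names and PIN scheme are assumptions for illustration and do not describe 1&1’s actual systems.

```python
import hashlib
import hmac


def pin_hash(pin: str) -> str:
    # Illustrative only; a real system would use a salted key-derivation function.
    return hashlib.sha256(pin.encode()).hexdigest()


CUSTOMERS = {  # dummy data
    "A-1001": {"name": "Erika Mustermann", "dob": "1980-04-12",
               "service_pin_hash": pin_hash("4711")},
}


def weak_authenticate(name: str, dob: str) -> bool:
    """The criticised procedure: name and date of birth alone unlock the
    account, although both are easy for third parties to learn."""
    return any(c["name"] == name and c["dob"] == dob
               for c in CUSTOMERS.values())


def pin_authenticate(account_id: str, name: str, dob: str, pin: str) -> bool:
    """Sketch of a strengthened check: a secret service PIN is required
    in addition to the publicly knowable attributes."""
    c = CUSTOMERS.get(account_id)
    return (c is not None
            and c["name"] == name and c["dob"] == dob
            and hmac.compare_digest(c["service_pin_hash"], pin_hash(pin)))
```

The design point is that name and date of birth are quasi-public attributes, while a service PIN is a shared secret that only the legitimate customer should know.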

In his statement, the BfDI explained that the fine was necessary because the violation posed a risk to the personal data of all customers of 1&1. But because of the company’s cooperation with the authorities, the BfDI set the fine at the lower end of the scale.

1&1 has deemed the fine “absolutely disproportionate” and has announced that it will file a suit against the BfDI’s penalty notice.

India updates privacy bill

12. December 2019

The new update of the Indian Personal Data Protection Bill is part of India’s broader efforts to tightly control the flow of personal data.

The bill’s latest version empowers the government to ask companies to provide anonymised personal data, as well as other non-personal data, in order to help deliver government services and formulate policies. The draft defines “personal data” as information that can help to identify a person, including characteristics, traits and any other features of a person’s identity. “Sensitive personal data” additionally includes financial and biometric data. According to the draft, such “sensitive” data can be transferred outside India for processing, but must be stored locally.

Furthermore, social media platforms will be required to offer a mechanism for users to prove their identities and display a verification sign publicly. Such requirements would raise a host of technical issues for companies such as Facebook and WhatsApp.

As a result, the new bill could affect the way companies process, store and transfer Indian consumers’ data. Therefore, it could cause some difficulties for top technology companies.

Advocate General’s opinion on “Schrems II” is delayed

11. December 2019

The Court of Justice of the European Union (CJEU) Advocate General’s opinion in the case C-311/18 (‘Facebook Ireland and Schrems’) will be released on December 19, 2019. Originally, the CJEU announced that the opinion of the Advocate General in this case, Henrik Saugmandsgaard Øe, would be released on December 12, 2019. The CJEU did not provide a reason for this delay.

The prominent case deals with the complaint to the Irish Data Protection Commission (DPC) by privacy activist and lawyer Maximilian Schrems and the transfer of his personal data from Facebook Ireland Ltd. to Facebook Inc. in the U.S. under the European Commission’s controller-to-processor Standard Contractual Clauses (SCCs).

Perhaps the most consequential question that the High Court of Ireland has referred to the CJEU is whether transfers of personal data from the EU to the U.S. under the SCCs violate individuals’ rights under Articles 7 and/or 8 of the Charter of Fundamental Rights of the European Union (Question No. 4). The decision of the CJEU in “Schrems II” will also have ramifications for the parallel case T-738/16 (‘La Quadrature du Net and others’). The latter case poses the question whether the EU-U.S. Privacy Shield for data transfers from the EU to the U.S. sufficiently protects the rights of EU individuals. If it does not, the European Commission would face a “Safe Harbor” déjà vu after having approved the Privacy Shield in its 2016 adequacy decision.

The CJEU is not bound by the opinion of the Advocate General (AG), but in some cases the AG’s opinion may be a weighty indicator of the CJEU’s final ruling. The final decision by the Court is expected in early 2020.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four separate cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the EU-U.S. Privacy Shield framework. According to the FTC, Global Data and TDARX continued to claim participation in the EU-U.S. Privacy Shield after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to complete the annual verification that the statements about their Privacy Shield practices were accurate. Also, according to the complaints, they did not affirm that they would continue to apply Privacy Shield protections to personal information collected during their participation in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by any government or self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protection to personal information collected during participation in the program. Otherwise, they are required to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the framework was established in 2016, the FTC has initiated a total of 21 enforcement measures in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and open to public comment for 30 days. The FTC will then decide whether to make the proposed consent orders final.

LGPD – Brazil’s upcoming Data Protection Law

28. November 2019

In August 2018, the National Congress of Brazil passed a new General Data Protection Law (“Lei Geral de Proteção de Dados” or “LGPD”), which is slated to come into effect in August 2020. Prior to the LGPD, data protection in Brazil was primarily governed by a patchwork of legal frameworks, including the country’s Civil Rights Framework for the Internet (Internet Act) and the Consumer Protection Code.

The new legislation creates a completely new general framework for the processing of personal data of individuals in Brazil, regardless of where the data processor is located. Brazil has also established its own Data Protection Authority to enforce the law. Although the Data Protection Authority will initially be tied to the Presidency of the Federative Republic of Brazil, it is expected to become autonomous in about two years.

Like the GDPR, the new framework applies extraterritorially, which means that the law will apply to any individual or organisation, private or public, that processes or collects personal data in Brazil, regardless of where the processor is based. The LGPD does not apply to data processing for strictly personal, academic, artistic or journalistic purposes.

Although the LGPD is largely influenced by the GDPR, the two frameworks differ in several respects. For instance, they define personal data differently: the LGPD’s definition is broad, covering any information relating to an identified or identifiable natural person. Furthermore, the LGPD does not permit cross-border transfers based on the controller’s legitimate interest. And while the GDPR sets a 72-hour deadline for data breach notification, the LGPD leaves the deadline only loosely defined, to name just a few differences.


Berlin commissioner for data protection imposes fine on real estate company

6. November 2019

On October 30th, 2019, the Berlin Commissioner for Data Protection and Freedom of Information issued a fine of around 14.5 million euros against the real estate company Deutsche Wohnen SE for violations of the General Data Protection Regulation (GDPR).

During on-site inspections in June 2017 and March 2019, the supervisory authority determined that the company used an archive system for the storage of tenants’ personal data that did not provide for the possibility of removing data that was no longer required. Personal data of tenants were stored without checking whether storage was permissible or even necessary. In individual cases, private data of the tenants concerned could therefore be viewed, even though some of it was years old and no longer served the purpose of its original collection. This involved data on the personal and financial circumstances of tenants, such as salary statements, self-disclosure forms, extracts from employment and training contracts, tax, social security and health insurance data, and bank statements.

After the commissioner had urgently recommended changing the archive system at the first inspection in 2017, the company was unable to demonstrate either a cleansing of its database or legal grounds for the continued storage in March 2019, more than one and a half years after the first inspection and nine months after the GDPR came into force. Although the company had made preparations to remedy the identified deficiencies, these measures did not result in lawful storage of personal data. The imposition of a fine was therefore compelling for a violation of Article 25(1) GDPR as well as Article 5 GDPR for the period between May 2018 and March 2019.

The starting point for the calculation of fines is, among other things, the affected company’s worldwide turnover in the previous year. According to its annual report for 2018, the annual turnover of Deutsche Wohnen SE exceeded one billion euros. For this reason, the statutory framework for assessing the fine for the established data protection violation provided for a maximum of approximately 28 million euros.

For the concrete determination of the amount of the fine, the commissioner applied the legal criteria, taking into account all aggravating and mitigating factors. The fact that Deutsche Wohnen SE had deliberately set up the archive structure in question and that the data concerned had been processed in an impermissible manner over a long period of time weighed particularly heavily against the company. The fact that the company had taken initial measures to remedy the illegal situation and had formally cooperated well with the supervisory authority was taken into account as mitigating. Also considering that no abusive access to the stored data could be established, a fine in the middle range of the prescribed framework was deemed appropriate.

In addition to sanctioning this violation, the commissioner imposed further fines of between 6,000 and 17,000 euros on the company for the inadmissible storage of personal data of tenants in 15 specific individual cases.

The decision on the fine has not yet become final. Deutsche Wohnen SE can lodge an appeal against this decision.

The Netherlands passes new law on the use of passenger data

31. October 2019

In June 2019, the Netherlands adopted a new law concerning the processing and sharing of passenger data by airlines. Since 18 June 2019, airlines have been required to share passenger data with a newly established passenger information unit (‘Pi-NL’) for all flights that depart from or arrive in the Netherlands. The passenger data to be passed on include, for example, nationality, full name, date of birth, and the number and type of travel documents used.

The newly established specialised unit will be independent, with its own statutory task and authorisations, and will collect, process and analyse passenger data and share it, if necessary, with competent authorities such as the police, the Public Prosecution Service, comparable units in other EU Member States, and Europol. It falls under the responsibility of the Minister of Justice and Security. The purpose of the data processing is to prevent, detect, investigate and prosecute terrorist offences and serious criminal offences.

This law implements the European PNR (Passenger Name Record) directive in Dutch law. The aim of the PNR directive is to ensure internal security within the European Union and to protect the life and safety of persons. It will also promote more effective cooperation between EU Member States.

In drafting this law, the Dutch government weighed the importance of combating terrorism against the privacy interests of passengers. The newly introduced law therefore also contains a number of data protection safeguards and guarantees, such as a limitation on the retention period, a prohibition on processing special categories of personal data, strict conditions for the exchange of such data with other states, and the requirement that Pi-NL appoint a data protection officer.
