A short review of the Polish DPA’s enforcement of the GDPR

10. January 2020

To date, the Polish Data Protection Authority (DPA) has issued 134 decisions and imposed GDPR fines in 5 cases. In 4 of those cases, the Polish DPA fined private companies, and in one case it fined a public institution.

The fines for the companies ranged from €13,000 to €645,000. Reasons for the fines included failures to protect personal data on websites that resulted in unauthorised access to personal data, inadequate technical and organisational measures, and insufficient fulfilment of the information obligations under Art. 14 GDPR.

It is also noteworthy that the Polish DPA imposed a €9,350 fine on the mayor of a small Polish town. Under Art. 83 (7) GDPR, each EU member state may lay down rules on whether and to what extent administrative fines may be imposed on public authorities. The Polish legislature decided that non-compliant public authorities may receive a GDPR fine of up to €23,475.

The mayor received the GDPR fine because he had failed to conclude data processing agreements with the entities to which he transferred data, in violation of Art. 28 (3) GDPR. Moreover, the mayor violated the principle of storage limitation, the principles of integrity and confidentiality, and the principle of accountability, and furthermore kept an incomplete record of processing activities.

Recently, the Polish DPA also published the EU project T4DATA’s Handbook for Data Protection Officers (DPOs), which is intended to help define the DPO’s role, competencies and main responsibilities.

More US States are pushing on with new Privacy Legislation

3. January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, and marks the United States’ first major step towards regulating data privacy on the Internet. Currently, the US has no general consumer data privacy law at the federal level comparable to the privacy laws of EU countries or the supranational European GDPR.

But now, several other US states have taken inspiration from the CCPA and are in the process of bringing forward their own state legislation on consumer privacy protections on the Internet, including:

  • The Massachusetts Data Privacy Law “S-120”,
  • The New York Privacy Act “S5642”,
  • The Hawaii Consumer Privacy Protection Act “SB 418”,
  • The Maryland Online Consumer Protection Act “SB 613”, and
  • The North Dakota Bill “HB 1485”.

Like the CCPA, most of these new privacy laws define the term “Personal Information” broadly and aim to protect consumer data by strengthening consumer rights.

However, the various proposals differ in the scope of the consumer rights they grant. All of them give consumers a ‘right to access’ their data held by businesses. Most of them also provide a ‘right to delete’, but only some give consumers a private ‘right of action’ for violations.

There are further differences with regard to the businesses covered by the privacy laws. In some states, the proposed laws will apply to all businesses, while in others they will only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, the possibility of a federal US privacy law in the near future is increasing. Proposals from several members of Congress already exist (a proposal by Congresswomen Eshoo and Lofgren, one by Senators Cantwell, Schatz, Klobuchar and Markey, and one by Senator Wicker).

Fine imposed on the City of Oslo

2. January 2020

The Norwegian data protection authority (Datatilsynet) recently imposed a fine of €49,300 on the City of Oslo. The reason for the fine was that the city had kept patient data outside the electronic health record system at the city’s nursing homes/health centres from 2007 until November 2018.

The case became known because the City of Oslo reported a data breach to the Data Protection Authority in November 2018. The report included information that various public and private nursing homes/health centres were using work sheets. These contained information about the residents, such as their daily needs and care routines, but also full names and room numbers. The work sheets were stored on each institution’s intranet, and all employees, including, for example, cleaning staff, had access to this data.

After the practice came to light, the Nursing Home Agency instructed all nursing homes/health centres to delete the work sheets immediately. Due to the way the data was stored, it is not possible to determine who exactly accessed the data and when, or whether unauthorised persons were among them.

In calculating the amount of the fine, the Data Protection Authority took into account that the City of Oslo had reported the incident itself and had taken swift steps to delete the data. It was also taken into account that the incident for the most part occurred before the new Data Protection Act (in force since July 2018) took effect, and that under the old Data Protection Act the maximum fine was €100,000.

Happy New Year!

1. January 2020

Dear readers,

The team of the blog privacy-ticker.com wishes you a happy new year and all the best for 2020.

Once again this year we will keep you up to date on the subject of data protection.

Best regards,

privacy-ticker.com


NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study that evaluated face recognition algorithms submitted by industry and academic developers for their ability to perform various tasks. The study evaluated 189 software algorithms submitted by 99 developers, focusing on how well each algorithm performs one of two tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database. This is used, for example, when unlocking a smartphone or checking a passport. The second task is “one-to-many” matching, i.e. determining whether the person in the photo matches any entry in a database. This is used to identify a person of interest.
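
The difference between the two tasks can be sketched in a few lines of Python. The following is a purely illustrative sketch, not NIST’s evaluation code: it assumes faces have already been converted into numerical embedding vectors by some model, and the similarity threshold of 0.6 and all function names are hypothetical.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Similarity between two face embeddings, in [-1, 1].
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(probe: np.ndarray, reference: np.ndarray, threshold: float = 0.6) -> bool:
        # One-to-one matching: does the probe photo show the reference person?
        return cosine_similarity(probe, reference) >= threshold

    def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
        # One-to-many matching: return the best-matching identity in the
        # gallery, or None if no entry clears the threshold.
        best_id, best_score = None, threshold
        for identity, embedding in gallery.items():
            score = cosine_similarity(probe, embedding)
            if score >= best_score:
                best_id, best_score = identity, score
        return best_id

In a real system the embeddings come from a trained neural network, and the threshold trades off false matches against false non-matches.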

A special focus of this study was that it also examined the performance of the individual algorithms with demographic factors taken into account. For one-to-one matching, only a few previous studies had examined demographic effects; for one-to-many matching, there had been none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, the Department of Homeland Security and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata indicating the age, gender, and either race or country of birth of the person.

The study found that the results ultimately depend on the algorithm at the heart of the system, the application that uses it, and the data it is fed with. The majority of face recognition algorithms, however, exhibit demographic differences. In one-to-one matching, algorithms more often falsely matched photos of two different people when they were Asian or African-American than when they were white. In algorithms developed in the United States, the same error also occurred frequently when the person was a Native American. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve accurate face recognition results across groups by using a sufficiently wide range of training data.
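
The demographic differentials NIST reports come down to comparing error rates, such as the false match rate, across groups. Below is a minimal, hypothetical sketch of that bookkeeping; the group labels and similarity scores are invented for illustration, and a 0.6 threshold is assumed.

    from collections import defaultdict

    def false_match_rate_by_group(impostor_pairs, threshold=0.6):
        # impostor_pairs: (group, similarity) tuples, where each similarity
        # compares photos of two *different* people from that group. A false
        # match occurs when such a pair still clears the threshold.
        trials = defaultdict(int)
        false_matches = defaultdict(int)
        for group, score in impostor_pairs:
            trials[group] += 1
            if score >= threshold:
                false_matches[group] += 1
        return {g: false_matches[g] / trials[g] for g in trials}

    # Toy data: a demographic differential shows up as one group's false
    # match rate being markedly higher than another's.
    pairs = [("A", 0.65), ("A", 0.40), ("A", 0.62),
             ("B", 0.30), ("B", 0.35), ("B", 0.61)]
    print(false_match_rate_by_group(pairs))  # {'A': 0.67, 'B': 0.33} (rounded)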

Austrian Regional Court grants an Austrian man €800 in GDPR compensation

20. December 2019

The Austrian Regional Court, Landesgericht Feldkirch, has ruled that the major Austrian postal service Österreichische Post (ÖPAG) has to pay an Austrian man €800 in compensation for violating the GDPR (LG Feldkirch, Beschl. v. 07.08.2019 – Az.: 57 Cg 30/19b – 15). It is one of the first rulings in Europe in which a civil court granted a data subject compensation based on a GDPR violation. In parallel with this court ruling, ÖPAG is facing an €18 million fine from the Austrian Data Protection Authority.

Based on people’s statements in anonymised surveys, ÖPAG had created marketing groups and used algorithms to calculate the probability that people with certain socioeconomic and regional backgrounds hold particular political affinities. ÖPAG then assigned customers to these marketing groups and thus also stored data about their calculated political affinities. Among these customers was the plaintiff in this case.

The court ruled that this combination constitutes “personal data revealing political opinions” within the meaning of Art. 9 GDPR. Since ÖPAG neither obtained the plaintiff’s consent to process his sensitive data on political opinions nor informed him about the processing itself, ÖPAG violated the plaintiff’s individual rights.

While the plaintiff demanded €2,500 in compensation from ÖPAG, the court granted him only €800 in non-material damages after weighing the circumstances of the individual case.

The case has been appealed and will be heard by the Higher Regional Court of Innsbruck.

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday, 19 December, the European Court of Justice’s (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) for transfers of personal data to processors situated in third countries.

The case on which the opinion builds originates in proceedings initiated by Mr. Maximillian Schrems, who challenged Facebook’s business practice of transferring the personal data of its European subscribers to servers located in the United States. The case (Schrems I) led the CJEU, on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.A.

Following the ruling, Mr. Schrems decided to challenge, on the basis of similar arguments to those raised in Schrems I, the transfers performed under the EU SCCs, the alternative mechanism Facebook had chosen to rely on to legitimise its EU-U.S. data flows. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling, the Schrems II case.

In the newly published opinion, the Advocate General considers the established SCCs valid for commercial transfers, despite the possibility that public authorities in the third country may process the personal data for national security reasons. Furthermore, the Advocate General states that the continuity of a high level of protection is guaranteed not only by an adequacy decision but just as well by the contractual safeguards the exporter has in place, which need to match that level of protection. The SCCs therefore represent a general mechanism applicable to transfers, regardless of the third country and its level of protection. In addition, and in light of the Charter, the controller as well as the supervisory authority are obliged to suspend any third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, which only cover the SCCs under Decision 2010/87, taking the question of the Privacy Shield’s validity off the table.

While the Advocate General’s opinion is not binding, it represents a suggested legal solution for the case before the CJEU. The CJEU’s decision on the matter is not expected until early 2020, however, so the outcome remains eagerly awaited.

Facebook collects location data despite deactivation

Facebook has admitted, in response to an inquiry from several US senators, that it continues to collect location data even if the user has previously deactivated this feature.

Even when this feature is deactivated, location data is still collected, for example through IP address mapping or user activity. Such activity includes, for example, users tagging their own location at a certain restaurant or other place, but also being tagged by friends in a photo that contains a location tag.
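
How a coarse location can be derived from an IP address alone is easy to sketch. The following Python snippet is purely illustrative and is not Facebook’s code; the prefix table is a made-up stand-in for the commercial IP geolocation databases such systems typically use, and the addresses come from reserved documentation ranges, so they identify no real network.

    # Hypothetical lookup table mapping IP prefixes to coarse areas.
    # (Real systems query licensed IP geolocation databases instead.)
    IP_PREFIX_TO_AREA = {
        "203.0.113.": "postcode 10115 (Berlin)",
        "198.51.100.": "postcode 75001 (Paris)",
    }

    def coarse_location_from_ip(ip_address: str) -> str:
        # Returns a coarse area (e.g. a postcode), never exact coordinates.
        for prefix, area in IP_PREFIX_TO_AREA.items():
            if ip_address.startswith(prefix):
                return area
        return "unknown"

    print(coarse_location_from_ip("203.0.113.42"))  # -> postcode 10115 (Berlin)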

In the letter that Senator Josh Hawley published on Twitter, Facebook states that it has only the best intentions in collecting the data. According to the statement, this is the only way, for example, to place personalised ads or to inform users when someone logs in to their account from a completely different location than usual.

While Facebook states that location data based on, for example, the IP address does not indicate an exact location but only, say, the postcode, this also means that there is no way for users to fully opt out of the collection of location data.


Data Leak of South African IT firm exposes over 1 Million Web Browsing Records

18. December 2019

Security researchers at vpnMentor recently discovered an unsecured and unencrypted database owned by the South African information and communications technology (ICT) company Conor. The breached database consisted of daily logs of user activity by customers of Internet Service Providers (ISPs) that used web filtering software built by Conor.

The leak exposed users’ internet traffic and activity, along with personally identifying information and highly sensitive private data. For two months, it revealed activity logs such as website URLs, IP addresses, index names and MSISDN codes, which identify mobile users on a specific network. The details contained in this breach included highly sensitive web browsing activity such as attempts to visit pornography websites, social media accounts, online storage including iCloud, and messaging apps such as WhatsApp. In total, more than 890 GB of data and over 1 million records were exposed.

“Because the database gave access to a complete record of each user’s activity in a session, our team was able to view every website they visited – or attempted to visit. We could also identify each user,” the vpnMentor team explained in their statement. “For an ICT and software development company not to protect this data is incredibly negligent. Conor’s lapse in data security could create real-world problems for the people exposed.”

Such an incident could cause Conor significant reputational damage and loss of integrity. In addition, the leak exposed how their filter system works and ways to circumvent it. This could render the product ineffective against attempts to bypass it, making it redundant. As a result, Conor may lose business, since clients may no longer feel they can trust the company and the values it claims to uphold.

Irish DPC updates Guidance on Data Processing’s Legal Bases

17. December 2019

The Irish Data Protection Commission (DPC) has updated its guidance on the legal bases for processing personal data. It covers data processing under the European General Data Protection Regulation (GDPR) as well as data processing requirements under the European Law Enforcement Directive.

The main aims of the updated guidance are to make companies more aware of their reasons for processing personal data and of the need to choose the right legal basis, as well as to ensure that data subjects are able to determine whether their data is being processed lawfully.

The guidance focuses on the different legal bases in Art. 6 GDPR, namely consent, contract, legal obligation, vital interests, public task and legitimate interests. The Irish DPC states that controllers not only have to choose the right legal basis, but also have to understand the obligations that come with the chosen one, which is why it wanted to go into further detail.

Overall, the guidance is intended to aid both controllers and data subjects. It is meant to support a better understanding of the terminology as well as of the legal requirements the GDPR sets out for processing personal data.
