Washington State Lawmakers Propose New Privacy Bill

23. January 2020

In January 2020, Washington lawmakers introduced a bill that would give state residents new privacy rights: the “Washington Privacy Act” (WPA).

If passed, the WPA would enact a comprehensive data protection framework for Washington, including individual rights that are similar to, and in part go beyond, those in the California Consumer Privacy Act (CCPA), as well as a range of other obligations on businesses that do not yet exist in any U.S. privacy law.

Furthermore, the draft bill contains strong provisions that largely align with the EU’s General Data Protection Regulation (GDPR), as well as commercial facial recognition provisions that start from a legal default of affirmative consent. Nonetheless, legislators must work within a remarkably short time frame to pass a bill that both House and Senate can embrace within the next six weeks of Washington’s legislative session. If passed, the bill would go into effect on July 31, 2021.

The current draft provides data protections to all Washington State residents and would apply to entities that conduct business in Washington or produce products or services targeted to Washington residents, provided they control or process data of at least 100,000 consumers, or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (“consumers” being natural persons who are Washington residents, acting in an individual or household context). The draft bill would not apply to state and local governments or municipal corporations. Among other rights, the bill would give all state residents the ability to opt out of targeted advertising.

The draft bill would regulate companies that process “personal data”, defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person”. Excluded are de-identified data and publicly available information, i.e. “information that is lawfully made available from federal, state, or local government records”; specific provisions apply to pseudonymous data.

Category: Cyber security · GDPR · USA

Italian DPA fined Eni Gas e Luce

22. January 2020

The Italian Data Protection Authority ‘Garante’ fined the gas and electricity company ‘Eni Gas e Luce’ (EGL) for two violations of the GDPR.

The overall fine of €11.5 million was imposed for unsolicited telemarketing (€8.5 million) and the activation of unsolicited contracts (€3 million).

The sanctions were determined taking into account the parameters indicated in the GDPR, including the wide range of data subjects involved (about 7,200 customers), the pervasiveness of the conduct, the duration of the violation, and EGL’s economic conditions.

Besides the fine, the Garante ordered EGL to adopt corrective measures so that personal data is processed in compliance with the GDPR, and prohibited EGL from processing the personal data on its telemarketing lists without explicit consent.

The corrective measures must be implemented and communicated to the Garante within established timescales, while the fines must be paid within thirty days.

Category: General

CNIL publishes recommendations on how to get users’ cookie consent

21. January 2020

On 14 January 2020, the French data protection authority (“CNIL”) published recommendations on practical modalities for obtaining the consent of users to store or read non-essential cookies and similar technologies on their devices. In addition, the CNIL also published a series of questions and answers on the recommendations.

The purpose of the recommendations is to help private and public organisations implement the CNIL guidelines on cookies and similar technologies of 4 July 2019. To this end, the CNIL describes the practical arrangements for obtaining users’ consent, gives concrete examples of user interfaces for obtaining consent, and presents “best practices” that go beyond what the rules require.

In order to find pragmatic and privacy-friendly solutions, the CNIL consulted in advance with organisations representing industries in the ad tech ecosystem as well as civil society organisations and discussed the issue with them. The recommendations are neither binding nor prescriptive, nor are they exhaustive. Organisations may use other methods to obtain user consent, as long as those methods comply with the guidelines.

Among the most important recommendations are:

Information about the purpose of cookies
First, the purposes of the cookies should be listed. The recommendations contain examples of this brief description for the following purposes or types of cookies:
(1) targeted or personalised advertising;
(2) non-personalised advertising;
(3) personalised advertising based on precise geolocation;
(4) customisation of content or products and services provided by the web publisher;
(5) social media sharing;
(6) audience measurement/analysis.
In addition, the list of purposes should be complemented by a more detailed description of these purposes, which should be directly accessible, e.g. via a drop-down button or hyperlink.

Information on the data controllers
An exhaustive list of data controllers should be directly accessible, e.g. via a drop-down button or hyperlink. When users click on this hyperlink or button, they should receive specific information on the data controllers (their names and links to their privacy policies). However, web publishers do not have to list all third parties that use cookies on their website or application, only those that also act as data controllers. The role of the parties (data controller, joint data controller, or data processor) therefore has to be assessed individually for each cookie. The list should be regularly updated and permanently accessible (e.g. through the cookie consent mechanism, which would be available via a static icon or hyperlink at the bottom of each web page). Should a “substantial” addition be made to the list of data controllers, users’ consent should be sought again.

Real choice between accepting or rejecting cookies
Users must be offered a real choice between accepting or rejecting cookies. This can be done by means of two (not pre-ticked) checkboxes or buttons (“accept” / “reject”, “allow” / “deny”, etc.) or equivalent elements such as “on”/”off” sliders, which should be disabled by default. These checkboxes, buttons or sliders should have the same format and be presented at the same level. Users should have such a choice for each type or category of cookie.

The ability for users to delay this selection
A “cross” button should be included so that users can close the consent interface without having to make a choice. If the user closes the interface, no consent cookies should be set. However, users may be asked for consent again until they make a choice and accept or reject cookies.
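
The banner logic the CNIL describes can be made concrete with a short sketch. The following TypeScript is a minimal illustration only, not an implementation prescribed by the CNIL; all type and function names are invented for this example:

```typescript
// Minimal sketch of the per-purpose consent banner logic described above.
// All names (Purpose, ConsentState, ...) are illustrative, not from the CNIL text.

type Purpose =
  | "targeted_ads"
  | "non_personalised_ads"
  | "geolocation_ads"
  | "content_customisation"
  | "social_sharing"
  | "audience_measurement";

type Choice = "accepted" | "rejected";

interface ConsentState {
  // No entry for a purpose means the user has not decided yet; the
  // corresponding checkbox or slider therefore starts unticked ("off").
  choices: Partial<Record<Purpose, Choice>>;
}

// The "cross" button closes the interface without recording anything:
// no consent cookie is written, and the banner may be shown again later.
function onBannerClosed(state: ConsentState): ConsentState {
  return state; // unchanged: nothing accepted, nothing rejected
}

// "Accept" and "reject" are offered symmetrically, per purpose.
function onUserChoice(
  state: ConsentState,
  purpose: Purpose,
  choice: Choice
): ConsentState {
  return { choices: { ...state.choices, [purpose]: choice } };
}

// Non-essential cookies for a purpose may only be set after an explicit "accepted".
function mayUseCookiesFor(state: ConsentState, purpose: Purpose): boolean {
  return state.choices[purpose] === "accepted";
}
```

The point the sketch tries to capture is that closing the banner is neither acceptance nor refusal: the user simply remains in the undecided state, and no non-essential cookies may be set.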

Overall consent for multiple sites
It is acceptable to obtain user consent for a group of sites rather than individually for each site. However, this requires that users are informed of the exact scope of their consent (i.e., by providing them with a list of sites to which their consent applies) and that they have the ability to refuse all cookies on those sites altogether (e.g., if there is a “refuse all” button along with an “accept all” button). To this end, the examples given in the recommendations include three buttons: “Personalize My Choice” (where users can make a more precise selection based on the purpose or type of cookies), “Reject All” and “Accept All”.

Duration of validity of the consent
It is recommended that users be asked to renew their consent at regular intervals. The CNIL considers a period of six months to be appropriate.

Proof of consent
Data controllers should be able to provide individual proof of users’ consent and to demonstrate that their consent mechanism allows a valid consent to be obtained.
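
The last two recommendations, periodic renewal and proof of consent, lend themselves to a small data-model sketch. The following TypeScript is a hedged illustration: the record layout, the field names, and the exact renewal window are assumptions, as the CNIL does not prescribe a data model:

```typescript
// Sketch of a stored consent record supporting renewal and proof of consent.
// The layout is an assumption for illustration, not prescribed by the CNIL.

interface ConsentRecord {
  userId: string; // pseudonymous identifier
  choices: Record<string, "accepted" | "rejected">; // decision per purpose
  collectedAt: Date;
  interfaceVersion: string; // which banner text/layout the user actually saw,
                            // so that valid collection can be demonstrated later
}

// The CNIL considers roughly six months an appropriate validity period.
const SIX_MONTHS_MS = 182 * 24 * 60 * 60 * 1000;

function consentStillValid(record: ConsentRecord, now: Date = new Date()): boolean {
  // After about six months the user should be asked to renew their consent.
  return now.getTime() - record.collectedAt.getTime() < SIX_MONTHS_MS;
}

function proofOfConsent(record: ConsentRecord): string {
  // A serialised copy of the record can serve as individual proof of consent.
  return JSON.stringify(record);
}
```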

The recommendations are open for public consultation until 25 February 2020. A new version will then be submitted to the members of the CNIL for adoption during a plenary session. The CNIL will carry out enforcement inspections six months after the adoption of the recommendations. The final recommendations may also be updated and supplemented over time to take account of new technological developments and of the responses to questions raised by professionals and individuals on this subject.

German Officials warn Travellers to China of Espionage

17. January 2020

The German Federal Office for the Protection of the Constitution (BfV) sees a significant risk to the security of personal data when travellers access local WiFi networks and the mobile network in China. A request by the German newspaper “Handelsblatt” to the BfV revealed that the agency warns travellers to China of an increasing risk of espionage.

For stays in China, the BfV discourages travellers from using laptops and smartphones that contain personal data, especially contact information. Instead, the BfV recommends acquiring a travel laptop and a prepaid mobile phone that can be reset or even disposed of after leaving China.

According to Handelsblatt, the warning stems from cases in which the Chinese border police conducted mobile phone checks at the border of the Xinjiang region and installed a surveillance app on tourists’ smartphones.

In 2016, the BfV had already warned of potential espionage by Chinese secret services targeting students and researchers.

National Retailer fined £500,000 by ICO

10. January 2020

The Information Commissioner’s Office (ICO), the UK’s data protection authority, has fined the national retailer DSG Retail Limited £500,000 for failing to secure the information of at least 14 million people after a computer system was compromised as a result of a cyberattack.

An ICO investigation concluded that between July 2017 and April 2018, malware had been installed on DSG’s systems and collected personal data until the attack was detected. Due to DSG’s failures, the attacker had access to 5.6 million payment card details and further personal data, including full names, postcodes, and email addresses.

The fine was imposed for poor security arrangements and a failure to take adequate steps to protect personal data. It is based on the Data Protection Act 1998.

Steve Eckersley, the ICO’s Director of Investigations, said:

“Our investigation found systemic failures in the way DSG Retail Limited safeguarded personal data. It is very concerning that these failures related to basic, commonplace security measures, showing a complete disregard for the customers whose personal information was stolen. The contraventions in this case were so serious that we imposed the maximum penalty under the previous legislation, but the fine would inevitably have been much higher under the GDPR.”

The ICO considered the individual freedom of DSG’s customers to be at risk, as affected customers could face financial theft and identity fraud.

Category: Cyber security · Data breach · UK

A short review of the Polish DPA’s enforcement of the GDPR

To date, the Polish Data Protection Authority (DPA) has issued 134 decisions and imposed GDPR fines in five cases. In four of those cases, the Polish DPA fined private companies; in one case, it fined a public institution.

The fines for the companies ranged from €13,000 to €645,000. Reasons for the fines included failures to protect personal data on websites that resulted in unauthorised access to personal data, inadequate technical and organisational measures, and insufficient fulfilment of the information obligations under Art. 14 GDPR.

It is also noteworthy that the Polish DPA imposed a €9,350 fine on the mayor of a small Polish town. Under Art. 83 (7) GDPR, each EU member state may lay down rules on whether and to what extent administrative fines may be imposed on public authorities. The Polish legislature decided that non-compliant public authorities may receive a GDPR fine of up to €23,475.

The mayor received the GDPR fine because he failed to conclude a data processing agreement with the entities to which he transferred data, in violation of Art. 28 (3) GDPR. Moreover, he violated the principle of storage limitation, the principles of integrity and confidentiality, and the principle of accountability, and kept an incomplete record of processing activities.

Recently, the Polish DPA also published the EU project T4DATA’s Handbook for Data Protection Officers (DPOs), which helps define a DPO’s role, competencies, and main responsibilities.

More US States are pushing on with new Privacy Legislation

3. January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, a first step in the United States towards regulating data privacy on the Internet. Currently, the US does not have a federal-level general consumer data privacy law comparable to the privacy laws of EU countries or the supranational European GDPR.

But now, five other US states have taken inspiration from the CCPA and are in the process of bringing forward their own state legislation on consumer privacy protection on the Internet:

  • The Massachusetts Data Privacy Law “S-120”,
  • The New York Privacy Act “S5642”,
  • The Hawaii Consumer Privacy Protection Act “SB 418”,
  • The Maryland Online Consumer Protection Act “SB 613”, and
  • The North Dakota Bill “HB 1485”.

Like the CCPA, most of these new privacy bills define the term “Personal Information” broadly and aim to protect consumer data by strengthening consumer rights.

However, the proposals differ in the scope of the consumer rights. All of them grant consumers a ‘right to access’ their data held by businesses. Most of these states would also provide a ‘right to delete’, but only some give consumers a private ‘right of action’ for violations.

There are further differences with regard to the businesses covered by the laws. In some states, the proposed laws would apply to all businesses, while in others they would only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, the possibility of a federal US privacy law in the near future increases. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren’s proposal, Senators Cantwell, Schatz, Klobuchar, and Markey’s proposal, and Senator Wicker’s proposal).

Fine imposed on the City of Oslo

2. January 2020

The Norwegian data protection authority (Datatilsynet) recently imposed a fine of €49,300 on the City of Oslo. The reason for the fine was that the city had kept patient data outside the electronic health record system at the city’s nursing homes/health centres from 2007 to November 2018.

The case became known because the City of Oslo reported a data breach to the Data Protection Authority in November 2018. The report included information that various governmental and private nursing homes/health centres were using worksheets containing information about residents, such as their daily needs and care routines, as well as full names and room numbers. The worksheets were stored on each institution’s intranet, and all employees, including, for example, cleaning staff, had access to this data.

After the practice came to light, the Nursing Home Agency instructed all nursing homes/health centres to delete the worksheets immediately. Due to the way the data was stored, it is not possible to determine exactly who accessed the data and when, or whether unauthorised persons were among them.

In calculating the amount of the fine, the Data Protection Authority took into account that the City of Oslo reported the incident itself and took swift steps to delete the data. It was also taken into account that the incident occurred for the most part before the new Data Protection Act (in force since July 2018) came into effect, and that under the old Data Protection Act the maximum fine was €100,000.

Happy New Year!

1. January 2020

Dear readers,

The team of the blog privacy-ticker.com wishes you a happy new year and all the best for 2020.

Once again this year we will keep you up to date on the subject of data protection.

Best regards,

privacy-ticker.com

Category: General

NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study that evaluated face recognition algorithms submitted by industry and academic developers for their ability to perform various tasks. The study evaluated 189 software algorithms submitted by 99 developers, focusing on how well each algorithm performs one of two tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database. This is used, for example, when unlocking a smartphone or checking a passport. The second task is “one-to-many” matching, i.e. determining whether the person in a photo has any match in a database. This is used to identify a person of interest.
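
To illustrate the difference between the two tasks, here is a simplified TypeScript sketch. It assumes faces have already been converted into numeric embedding vectors; the algorithms NIST tested are far more sophisticated, and the similarity measure and threshold below are arbitrary illustrative choices:

```typescript
// Simplified illustration of one-to-one vs. one-to-many face matching,
// assuming each face photo has been reduced to an embedding vector.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// One-to-one (verification): does the probe match one specific enrolled photo?
// Used e.g. when unlocking a smartphone or checking a passport.
function verify(probe: number[], enrolled: number[], threshold = 0.8): boolean {
  return cosineSimilarity(probe, enrolled) >= threshold;
}

// One-to-many (identification): find the best match in a whole gallery, if any.
// Used e.g. to identify a person of interest.
function identify(
  probe: number[],
  gallery: Map<string, number[]>,
  threshold = 0.8
): string | null {
  let bestId: string | null = null;
  let bestScore = -Infinity;
  for (const [id, embedding] of gallery) {
    const score = cosineSimilarity(probe, embedding);
    if (score > bestScore) { bestScore = score; bestId = id; }
  }
  return bestScore >= threshold ? bestId : null;
}
```

In this picture, the demographic differentials discussed below correspond to such functions matching photos of two different people (a false positive) at different rates for different groups.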

A special focus of this study was that it also looked at the performance of the individual algorithms taking demographic factors into account. For one-to-one matching, only a few previous studies examined demographic effects; for one-to-many matching, there were none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, the Department of Homeland Security, and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata indicating the age, gender, and either race or country of birth of the person.

The study found that the results ultimately depend on the algorithm at the heart of the system, the application that uses it, and the data it is fed. However, the majority of face recognition algorithms exhibit demographic differences. In one-to-one matching, algorithms more often falsely rated photos of two different people as the same person when the subjects were Asian or African-American than when they were white. In algorithms developed in the United States, the same error also occurred when the person was Native American. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve accurate face recognition results by using a wide range of training data.
