Category: Countries

EDPS publishes opinion on future EU-UK partnership

3. March 2020

On 24 February 2020, the European Data Protection Supervisor (EDPS) published an opinion on the opening of negotiations for the future partnership between the EU and the UK with regard to personal data protection.

In his opinion, the EDPS highlights the importance of commitments to fully respect fundamental rights in the envisaged comprehensive partnership. With regard to the protection of personal data in particular, the partnership should uphold the high level of protection afforded by the EU's data protection rules.

With respect to the transfer of personal data, the EDPS further expresses support for the European Commission's recommendation to work towards the adoption of adequacy decisions for the UK, provided the relevant conditions are met. However, the Commission must ensure that the UK does not lower its data protection standards below those of the EU after the Brexit transition period. Lastly, the EDPS recommends that the EU institutions also prepare for a scenario in which no adequacy decisions are in place by the end of the transition period on 31 December 2020.

US Lawmakers to introduce bill that restricts Government Surveillance

3. February 2020

On Thursday, January 23, a bipartisan group of US lawmakers revealed legislation that would reduce the scope of the National Security Agency's (NSA) warrantless internet and telephone surveillance program.

The bill aims to reform Section 215 of the PATRIOT Act, which expires on March 15, and to prevent abuses of the Foreign Intelligence Surveillance Act. Under the PATRIOT Act, the NSA can run a secret mass surveillance program that taps into the internet data and telephone records of American residents. Furthermore, the Foreign Intelligence Surveillance Act allows U.S. intelligence agencies to eavesdrop on and store vast amounts of digital communications from foreign suspects living outside the United States, with American citizens often caught in the crosshairs.

The newly introduced bill contains numerous reforms, such as prohibiting the warrantless collection of cell site location data, GPS information, browsing history and internet search history; ending the authority for the NSA's massive phone record program, which was disclosed by Edward Snowden; establishing a three-year limit on the retention of information that is neither foreign intelligence nor evidence of a crime; and more.

This new legislation is seen favorably by national civil rights groups and Democrats, who hope the bill will stop continued infringements of the Fourth Amendment to the US Constitution in the name of national security.

UK: Betting companies had access to data of millions of children

28. January 2020

In the UK, betting companies gained access to the data of 28 million children under 14 and adolescents. The data was stored in a government database intended for educational purposes, with access to the platform granted by the government. A company that had been given access is said to have illegally passed it on to another company, which in turn allowed the betting companies access. The betting providers used the access, among other things, to verify age information online. The company accused of passing on the access denies the allegations, but has not yet made any more specific statements.

The British Department for Education has called the situation unacceptable. All access points have been closed and the cooperation has been terminated.

Category: Data breach · General · UK

Washington State Lawmakers Propose new Privacy Bill

23. January 2020

In January 2020, Washington lawmakers introduced a bill that would give state residents new privacy rights: the "Washington Privacy Act" (WPA).

If passed, the WPA would enact a comprehensive data protection framework for Washington that includes individual rights similar to, and in some respects going beyond, those in the California Consumer Privacy Act (CCPA), as well as a range of other obligations on businesses that do not yet exist in any U.S. privacy law.

Furthermore, the new draft bill contains strong provisions that largely align with the EU's General Data Protection Regulation (GDPR), as well as commercial facial recognition provisions that start from a legal default of affirmative consent. Nonetheless, legislators must work within a remarkably short timeframe to pass a law that can be embraced by both the House and the Senate within the next six weeks of Washington's legislative session. If passed, the bill would take effect on July 31, 2021.

The current draft provides data protection to all Washington State residents and would apply to entities that conduct business in Washington or produce products or services targeted to Washington residents. Such entities must control or process the data of at least 100,000 consumers, or derive 50% of their gross revenue from the sale of personal data while processing or controlling the personal data of at least 25,000 consumers ("consumers" being defined as natural persons who are Washington residents, acting in an individual or household context). The draft bill would not apply to state and local governments or municipal corporations. It would further grant all state residents, among other rights, the ability to opt out of targeted advertising.
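
The applicability thresholds described above can be sketched as a simple decision rule. This is an illustrative reading of the draft's conditions only, with hypothetical function and parameter names, and of course not legal advice:

```python
def wpa_applies(consumers_processed: int,
                revenue_share_from_data_sales: float,
                does_business_in_wa: bool) -> bool:
    """Sketch of the draft WPA's applicability thresholds.

    An entity conducting business in Washington (or targeting its residents)
    is covered if it controls or processes the data of at least 100,000
    consumers, OR derives at least 50% of gross revenue from the sale of
    personal data while processing the data of at least 25,000 consumers.
    """
    if not does_business_in_wa:
        return False
    return (consumers_processed >= 100_000
            or (revenue_share_from_data_sales >= 0.5
                and consumers_processed >= 25_000))
```

For example, a data broker with 30,000 consumers and 60% of revenue from data sales would be covered, while the same business with only 40% of such revenue would not.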

The draft bill would regulate companies that process "personal data," defined broadly as "any information that is linked or reasonably linkable to an identified or identifiable natural person." Excluded are de-identified data and publicly available information, i.e. "information that is lawfully made available from federal, state, or local government records"; specific provisions apply to pseudonymous data.

Category: Cyber security · GDPR · USA

National Retailer fined £500,000 by ICO

10. January 2020

The Information Commissioner’s Office (ICO) – the UK’s Data Protection Authority – has fined the national retailer DSG Retail Limited £500,000 for failing to secure the information of at least 14 million people after a computer system was compromised as a result of a cyberattack.

An investigation by the ICO concluded that between July 2017 and April 2018, malware had been installed on the system and collected personal data until the attack was detected. Due to DSG's failings, the attacker had access to 5.6 million payment card details and further personal data, including full names, postcodes and email addresses.

The fine was imposed for poor security arrangements and the failure to take adequate steps to protect personal data. It is based on the Data Protection Act 1998.

The ICO's Director of Investigations, Steve Eckersley, said:

“Our investigation found systemic failures in the way DSG Retail Limited safeguarded personal data. It is very concerning that these failures related to basic, commonplace security measures, showing a complete disregard for the customers whose personal information was stolen. The contraventions in this case were so serious that we imposed the maximum penalty under the previous legislation, but the fine would inevitably have been much higher under the GDPR.”

The ICO considered the individual freedoms of DSG's customers to be at risk, as customers would have to fear financial theft and identity fraud.

Category: Cyber security · Data breach · UK

More US States are pushing on with new Privacy Legislation

3. January 2020

The California Consumer Privacy Act (CCPA) came into effect on January 1, 2020, and is the first step in the United States towards regulating data privacy on the Internet. Currently, the US has no general consumer data privacy law at the federal level comparable to the privacy laws of EU countries or the supranational European GDPR.

But now, several other US States have taken inspiration from the CCPA and are in the process of bringing forth their own state legislation on consumer privacy protections on the Internet, including

  • The Massachusetts Data Privacy Law “S-120“,
  • The New York Privacy Act “S5642“,
  • The Hawaii Consumer Privacy Protection Act “SB 418“,
  • The Maryland Online Consumer Protection Act “SB 613“, and
  • The North Dakota Bill “HB 1485“.

Like the CCPA, most of these new privacy laws have a broad definition of the term “Personal Information” and are aimed at protecting consumer data by strengthening consumer rights.

However, the various law proposals differ in the scope of the consumer rights. All of them grant consumers the ‘right to access’ their data held by businesses. There will also be a ‘right to delete’ in most of these states, but only some give consumers a private ‘right of action’ for violations.

There are further differences regarding the businesses that would be covered by these privacy laws. In some states, the proposed laws would apply to all businesses, while in others they would only apply to businesses with annual revenues of over 10 or 25 million US dollars.

As more US states begin to introduce privacy laws, the possibility of a federal US privacy law in the near future increases. Proposals from several members of Congress already exist (Congresswomen Eshoo and Lofgren's proposal, Senators Cantwell/Schatz/Klobuchar/Markey's proposal, and Senator Wicker's proposal).

NIST examines the effect of demographic differences on face recognition

31. December 2019

As part of its Face Recognition Vendor Test (FRVT) program, the U.S. National Institute of Standards and Technology (NIST) conducted a study that evaluated face recognition algorithms submitted by industry and academic developers for their ability to perform various tasks. The study evaluated 189 software algorithms submitted by 99 developers, focusing on how well each algorithm performs one of two tasks that are among the most common applications of face recognition.

The first task is “one-to-one” matching, i.e. confirming that a photo matches another photo of the same person in a database; this is used, for example, when unlocking a smartphone or checking a passport. The second task is “one-to-many” matching, i.e. determining whether the person in a photo matches any photo in a database; this is used to identify a person of interest.
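
The distinction between the two tasks can be sketched in code. This is a generic illustration using embedding similarity (a common approach in modern face recognition), not NIST's methodology or any tested vendor's algorithm; the function names and the 0.6 threshold are hypothetical:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray,
           threshold: float = 0.6) -> bool:
    """One-to-one matching: does the probe photo's embedding match
    a single enrolled template (e.g. passport check, phone unlock)?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: list[np.ndarray],
             threshold: float = 0.6) -> list[int]:
    """One-to-many matching: return indices of all gallery templates
    the probe matches (e.g. searching for a person of interest)."""
    return [i for i, template in enumerate(gallery)
            if cosine_similarity(probe, template) >= threshold]
```

A "false match" in the study's terms would be `verify` returning True for two different people; the demographic differentials NIST measured concern how often that happens across groups.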

A special focus of this study was that it also looked at the performance of the individual algorithms taking demographic factors into account. For one-to-one matching, only a few previous studies examined demographic effects; for one-to-many matching, there were none.

To evaluate the algorithms, the NIST team used four photo collections containing 18.27 million images of 8.49 million people. All were taken from operational databases of the State Department, Department of Homeland Security and the FBI. The team did not use images taken directly from Internet sources such as social media or from video surveillance. The photos in the databases contained metadata information that indicated the age, gender, and either race or country of birth of the person.

The study found that results ultimately depend on the algorithm at the heart of the system, the application that uses it, and the data it is fed. But the majority of face recognition algorithms exhibit demographic differences. In one-to-one matching, algorithms more often falsely rated photos of two different people as the same person if they were Asian or African-American than if they were white. In algorithms developed in the United States, the same error also occurred when the person was a Native American. In contrast, algorithms developed in Asia did not show such a significant difference in one-to-one matching results between Asian and Caucasian faces. These results suggest that algorithms can be trained to achieve correct face recognition results by using a wide range of data.

Advocate General releases opinion on the validity of SCCs in case of Third Country Transfers

19. December 2019

Today, Thursday 19 December, the European Court of Justice's (CJEU) Advocate General Henrik Saugmandsgaard Øe released his opinion on the validity of Standard Contractual Clauses (SCCs) for transfers of personal data to processors situated in third countries.

The case underlying the opinion originates in proceedings initiated by Mr. Maximilian Schrems against Facebook's business practice of transferring the personal data of its European subscribers to servers located in the United States. That case (Schrems I) led the CJEU, on October 6, 2015, to invalidate the Safe Harbor arrangement, which up to that point had governed data transfers between the EU and the U.S.

Following the ruling, Mr. Schrems challenged the transfers performed on the basis of the EU SCCs, the alternative mechanism Facebook chose to legitimize its EU-U.S. data flows, on grounds similar to those raised in Schrems I. The Irish DPA brought proceedings before the Irish High Court, which referred 11 questions to the CJEU for a preliminary ruling, now known as the Schrems II case.

In the newly published opinion, the Advocate General considers the SCCs valid for commercial transfers, despite the possibility of public authorities in the third country processing the personal data for national security reasons. He further states that the continuity of a high level of protection is guaranteed not only by an adequacy decision, but equally by the contractual safeguards the exporter has in place, which need to match that level of protection. The SCCs therefore constitute a general mechanism applicable to transfers, regardless of the third country and its level of protection. In addition, in light of the Charter, both the controller and the supervisory authority are obliged to suspend any third-country transfer if, because of a conflict between the SCCs and the laws of the third country, the SCCs cannot be complied with.

Finally, the Advocate General clarified that the EU-U.S. Privacy Shield decision of 12 July 2016 is not part of the current proceedings, which only concern the SCCs under Decision 2010/87, taking the question of the Privacy Shield's validity off the table.

While the Advocate General's opinion is not binding, it suggests a legal solution for the cases before the CJEU. The CJEU's decision on the matter is not expected until early 2020, and anticipation of the outcome is high.

Advocate General’s opinion on “Schrems II” is delayed

11. December 2019

The Court of Justice of the European Union (CJEU) Advocate General’s opinion in the case C-311/18 (‘Facebook Ireland and Schrems’) will be released on December 19, 2019. Originally, the CJEU announced that the opinion of the Advocate General in this case, Henrik Saugmandsgaard Øe, would be released on December 12, 2019. The CJEU did not provide a reason for this delay.

The prominent case deals with the complaint to the Irish Data Protection Commission (DPC) by privacy activist and lawyer Maximilian Schrems and the transfer of his personal data from Facebook Ireland Ltd. to Facebook Inc. in the U.S. under the European Commission’s controller-to-processor Standard Contractual Clauses (SCCs).

Perhaps the most consequential question that the High Court of Ireland put before the CJEU is whether transfers of personal data from the EU to the U.S. under the SCCs violate individuals' rights under Articles 7 and/or 8 of the Charter of Fundamental Rights of the European Union (Question No. 4). The CJEU's decision in “Schrems II” will also have ramifications for the parallel case T-738/16 (‘La Quadrature du net and others’), which raises the question whether the EU-U.S. Privacy Shield for data transfers from the EU to the U.S. sufficiently protects the rights of EU individuals. If it does not, the European Commission would face a “Safe Harbor” déjà vu, having approved the Privacy Shield in its 2016 adequacy decision.

The CJEU is not bound to the opinion of the Advocate General (AG), but in some cases, the AG’s opinion may be a weighty indicator of the CJEU’s final ruling. The final decision by the Court is expected in early 2020.

FTC reaches settlements with companies regarding Privacy Shield misrepresentations

10. December 2019

On December 3, 2019, the Federal Trade Commission (FTC) announced that it had reached settlements in four cases of Privacy Shield misrepresentation. The FTC alleged that Click Labs, Inc., Incentive Services, Inc., Global Data Vault, LLC, and TDARX, Inc. each falsely claimed to participate in the framework agreements of the EU-U.S. Privacy Shield. According to the FTC, Global Data and TDARX continued to claim participation after their Privacy Shield certifications had expired. Click Labs and Incentive Services also falsely claimed to participate in the Swiss-U.S. Privacy Shield Framework. In addition, Global Data and TDARX violated the Privacy Shield Framework by failing to complete the annual verification that statements about their Privacy Shield practices were accurate. According to the complaints, they also failed to affirm that they would continue to apply Privacy Shield protections to personal information collected during their participation in the program.

As part of the proposed settlements, each of the companies is prohibited from misrepresenting its participation in the EU-U.S. Privacy Shield Framework or any other privacy or data security program sponsored by any government or self-regulatory or standard-setting organization. In addition, Global Data Vault and TDARX are required to continue to apply Privacy Shield protection to personal information collected during participation in the program. Otherwise, they are required to return or delete such information.

The EU-U.S. and Swiss-U.S. Privacy Shield Frameworks allow companies to legally transfer personal data from the EU or Switzerland to the USA. Since the framework was established in 2016, the FTC has initiated a total of 21 enforcement measures in connection with the Privacy Shield.

A description of the consent agreements will be published in the Federal Register and open to public comment for 30 days. The FTC will then decide whether to make the proposed consent orders final.
