The French Constitutional Council ruled in favour of the new data protection law implementing the EU General Data Protection Regulation

20. June 2018

A group of senators referred the recently adopted data protection law to the Constitutional Council (‘Conseil Constitutionnel’), which prevented its promulgation in time for the entry into force of the General Data Protection Regulation (GDPR) on May 25. Now that the law has cleared this constitutional hurdle, it is expected to be promulgated in the coming days.

The Constitutional Council’s decision of June 12 (Décision n° 2018-765 DC) shows that the senators had challenged the constitutionality of a number of articles, among them Articles 1, 4, 5, 7, 13, 16, 20, 21, 30 and 36.

First, the law as a whole was measured against the constitutional objective of accessibility and intelligibility of legislation. The senators argued that the way the law is interwoven with the provisions of the GDPR was not clear and could “seriously mislead” citizens about their rights and obligations with regard to data protection.
The Council did not follow this reasoning, holding that the law was intelligible and that Article 32 of the referred law in fact empowers the Government to take the measures required “in order to make the formal corrections and adaptations necessary to simplify and ensure consistency and simplicity in the implementation by the persons concerned of the provisions bringing national law into compliance” with the General Data Protection Regulation.

The Council upheld the constitutionality of most of the articles referred to it. One exception is Article 13 of the law, which amends Article 9 of the current law and provides that personal data relating to criminal convictions and offences or related security measures may only be processed “under the control of an official authority” or by certain categories of persons listed in the law. According to the Council, this provision merely reproduces Article 10 of the GDPR without specifying the categories of persons authorised to process such data under the control of the official authority, or the purposes of such processing. The words “under the control of the official authority” were therefore found to be insufficiently precise and declared unconstitutional; they will not appear in the promulgated law.

For France this marks a major step towards joining the small circle of European countries that have succeeded in implementing the GDPR at national level.

Update on ePrivacy Regulation

12. June 2018

The Bulgarian Presidency of the Council of the European Union has released a progress report on the draft ePrivacy Regulation ahead of the Council meeting on June 8, 2018.

The ePrivacy Regulation (Regulation on Privacy and Electronic Communications) is intended to replace the current ePrivacy Directive and was originally meant to enter into force together with the General Data Protection Regulation (GDPR) on May 25, 2018.

The report provides updates on several points, including the Regulation’s scope and its relationship to the GDPR, as well as the processing of electronic communications content and metadata. The latter has been one of the main concerns of the Member States: striking a balance between privacy and innovation in the processing of metadata appears to be a key aspect of the ePrivacy Regulation.

Furthermore, the significant changes to the privacy settings under the future Article 10 are important to the Commission. Under the current draft, software providers would only be obliged to inform end-users about the privacy settings and how to use them, at the time of installation or first use and whenever an update changes the privacy settings.
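To illustrate what such an information duty could look like in practice, here is a minimal client-side sketch that surfaces a notice on first use and whenever an update changes the privacy settings. The storage key, the settings shape and the version numbering are assumptions made for illustration only, not anything prescribed by the draft Regulation.

```typescript
// Illustrative sketch: inform end-users about privacy settings on first use and
// whenever an update changes them. Storage key and settings shape are assumptions.

interface PrivacySettings {
  version: number;           // bumped whenever an update changes privacy-relevant defaults
  thirdPartyCookies: boolean;
  telemetry: boolean;
}

const CURRENT_SETTINGS: PrivacySettings = {
  version: 2,
  thirdPartyCookies: false,
  telemetry: false,
};

function maybeShowPrivacyNotice(): void {
  const stored = localStorage.getItem("privacy-settings-version");
  const acknowledgedVersion = stored === null ? null : Number(stored);

  // First usage, or an update has changed the privacy settings since the last notice.
  if (acknowledgedVersion === null || acknowledgedVersion < CURRENT_SETTINGS.version) {
    alert(
      "Our privacy settings have changed. You can review third-party cookies and " +
      "telemetry options in the settings menu."
    );
    localStorage.setItem("privacy-settings-version", String(CURRENT_SETTINGS.version));
  }
}

maybeShowPrivacyNotice();
```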

The report ends with three questions for the policy debate at the TTE Council on June 8. Among other things, the delegations are asked whether the current texts on the permitted processing of metadata and on the protection of terminal equipment and privacy settings are an acceptable basis for moving forward.

Spanish Football League app uses microphones and GPS to detect illegal broadcasting

11. June 2018

The official smartphone app of the Spanish football league (La Liga) can activate the microphone to search for unlicensed public broadcasts of league matches. The league has admitted that the app activates the microphone during league games in order to determine whether a public broadcast is taking place in the vicinity of the smartphone. In addition, the app uses GPS to determine the exact location where the audio clip was recorded. If an unlicensed public broadcast is detected, the operators of the app receive a notification and can take action against the establishments concerned.

As in other countries, Spanish establishments may only show pay-TV broadcasts of football matches in their restaurants with a special licence. According to the league, unlicensed screenings cause losses of around 150 million euros per year, and the data obtained will only be used to fight piracy. With the help of the app, fans are effectively recruited as “informers” in order to track down the offenders. The app is quite popular and has been downloaded at least 10 million times.

The practice came to light because of the General Data Protection Regulation (GDPR), which entered into force on May 25, 2018. That the microphone permission was used for this purpose had not been explained in the terms of use, which merely stated that the microphone was used for audience analysis. Under the GDPR, the new privacy policy now states that the app uses the microphone to find out whether the user is watching football and to search for fraud. Users in Spain can revoke the app’s permission to access the microphone at any time (on both iOS and Android), but must do so in their smartphone’s settings.

European Court of Justice (ECJ): Facebook fanpages will be treated as a case of Joint Control

In its judgment of June 5, 2018, the ECJ decided that the operator of a fan page (e.g. a company) and Facebook are jointly responsible, in terms of the General Data Protection Regulation (GDPR), for the personal data collected via Facebook fan pages.

A fan page is a Facebook profile of a company that can be used to communicate easily with customers.

Until now, information has been collected about customers who contact a company via Facebook. Depending on how the fan page is used, the customer’s name and profile were stored. Facebook has also passed on information collected from users via tracking tools to the respective fan page operators. In the opinion of the ECJ, the users of the fan pages concerned were not sufficiently informed about this, so the following requirements must be observed in future:

  • Anyone who visits a fan page must be informed about which data is collected and for which purposes.
  • In consultation with Facebook, fan page operators must know themselves which data is collected, so that they can inform visitors accordingly. This information obligation follows from Art. 13 and 14 GDPR.
  • Before tracking tools and cookies are used, consent must be obtained (a minimal sketch of such consent gating follows below).
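To make that last point concrete, here is a minimal sketch of one way a fan page operator’s own website could gate a tracking script behind explicit consent. The function names, the storage key and the script URL are purely illustrative assumptions, not part of the judgment or of Facebook’s tooling.

```typescript
// Minimal sketch: load a (hypothetical) tracking script only after explicit consent.
// All names and the script URL are illustrative assumptions, not a real API.

type ConsentDecision = "granted" | "denied";

function askForConsent(): ConsentDecision {
  // A real page would render a proper consent banner; confirm() is a stand-in here.
  return window.confirm("May we set tracking cookies and load analytics?")
    ? "granted"
    : "denied";
}

function loadTrackingScript(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

function initTracking(): void {
  const decision = askForConsent();

  // Record the decision so that consent (or its absence) can be demonstrated later.
  localStorage.setItem(
    "tracking-consent",
    JSON.stringify({ decision, at: new Date().toISOString() })
  );

  if (decision === "granted") {
    loadTrackingScript("https://example.com/hypothetical-tracking-pixel.js");
  }
  // If consent is denied, no tracking cookies are set and no tracking code is loaded.
}

initTracking();
```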

Furthermore, companies and Facebook must address their shared responsibility. It is not yet clear whether this will be done through a joint controllership agreement pursuant to Art. 26 GDPR or a data processing agreement pursuant to Art. 28 GDPR; another solution may also emerge.

However, this judgment will have consequences not only for Facebook but for all social media platforms. It affects not only companies that maintain their own presence on Facebook, but also platforms such as LinkedIn, Twitter, Google+ etc., provided they offer or include similar tracking functions or other forms of data collection.

Category: General

Data protection risks with regard to WhatsApp and Snapchat on business phones

6. June 2018

In future, employees of the automotive supplier Continental will be barred from using the chat services WhatsApp and Snapchat on smartphones used for business purposes: for data protection reasons, the employer prohibits its employees from installing the apps. The ban affects approximately 36,000 mobile phones worldwide.

The ban is based on the fact that these social media services access users’ address books and thus personal (and possibly confidential) data. Since the messenger apps offer no settings to restrict this access to personal data, Continental decided to ban the apps from company mobile phones in order to protect business partners and its own employees.

Under the current terms of use, WhatsApp users agree to provide contact information “in accordance with applicable laws”. WhatsApp thereby shifts its data protection responsibility onto its users, who in effect confirm that they have obtained a corresponding declaration of consent to data processing from every person in their address book. The social media service is presumably aware that this is practically impossible to guarantee.

In order to ensure an adequate level of data protection, WhatsApp would therefore have to design its default settings in line with data protection requirements. Such a change could also benefit the company itself, since it would remove the grounds for bans like this one, and WhatsApp could then be used on countless additional smartphones.

Under the new GDPR: Complaints against Google, Instagram, WhatsApp and Facebook

1. June 2018

On the 25th of May, the day the General Data Protection Regulation (GDPR) came into force, noyb.eu filed four complaints over “forced consent” against Google (Android), Instagram, WhatsApp and Facebook.

The complaints, filed by the organisation noyb (“None Of Your Business”) led by Austrian activist Max Schrems, could result in penalties worth up to 7 billion euros. Schrems has been fighting Facebook over data protection issues for almost ten years; his earlier lawsuit challenged Facebook’s ability to transfer data from the European Union to the United States (“Safe Harbor”).

The activist alleged that people were not given a “free choice” whether to allow companies to use their data. Noyb.eu bases its position on the distinction between necessary and unnecessary data usage: “The GDPR explicitly allows any data processing that is strictly necessary for the service – but using the data additionally for advertisement or to sell it on needs the users’ free opt-in consent.” (See https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf) The organisation also argues that forced consent is prohibited under Art. 7 (4) GDPR.

The broadly similar complaints were filed with authorities in different countries, regardless of where the companies have their headquarters: against Google (Android) in France (data protection authority: CNIL), with a maximum possible penalty of 3.7 billion euros even though its headquarters are in the USA; against Instagram (Facebook) in Belgium (DPA); against WhatsApp in Hamburg (HmbBfDI); and against Facebook in Austria (DSB). The latter three companies all have their headquarters in Ireland and could each face a maximum possible penalty of 1.3 billion euros.

Protection against automated decision making with personal data becomes a human right

30. May 2018

Quite apart from the new data protection legislation in the EU, the worldwide standard of data protection is rising as well. Through the “Amendment of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data” (the kind of processing mostly known as “profiling”), the European Court of Human Rights (ECtHR) will in future apply this expansion of the European Convention on Human Rights (ECHR).

For the last four decades, the Convention has been the only legally binding international instrument for the protection of privacy and personal data that is open to any country in the world. The aim of the amendment is to modernise and improve the Convention, taking into account the new challenges to the protection of individuals with regard to the processing of personal data that have arisen since the Convention’s adoption in 1980. This concerns in particular new information and communication technologies, which require a different type of privacy protection mechanism.

As with any other human right listed in the ECHR, any person may submit an individual application if his or her rights have been violated by one of the contracting parties. This seems especially interesting with regard to investigations by national security authorities across the European continent that rely on profiling.

However, the adoption of the amendments also raises some questions, particularly regarding the relationship between European Union law and the Convention, which contains no explicit provisions on this point, as well as regarding differences in their scope of application. It is therefore to be hoped that the ECtHR will comment on these issues before the first cases are brought.

Category: Personal Data

The US Senate votes in favor of restoring Net Neutrality rules

17. May 2018

On June 11, the repeal of net neutrality is set to take effect in the USA. In a resolution, the Senate has now declared itself in favour of preserving net neutrality: on Wednesday it voted narrowly (52 to 47) to reverse the Federal Communications Commission’s (FCC) December 2017 decision to repeal the net neutrality rules. Three Republicans voted with all 47 Democrats and two Democratic-leaning senators to back the measure.

The resolution was brought under the rarely used Congressional Review Act, a law that allows Congress, with a simple-majority vote in both houses, to repeal new regulations issued by federal agencies within 60 legislative days of their adoption. Despite the Senate’s passage of the resolution, the measure is unlikely to be approved by the House of Representatives, because at least two dozen Republicans would have to vote against the party line.

Net neutrality is the principle that internet service providers (or governments) should treat all data on the internet the same, regardless of content, user, platform, application or device. Net neutrality rules prevent internet service providers from slowing down connections for people attempting to access certain sites, apps and services, and from blocking legal content.

Category: General · USA

China’s National Standard on Personal Information Security (GB/T 35273-2017) Went into Effect

14. May 2018

On May 1, 2018, the Information Security Technology – Personal Information Security Specification (the “Specification”) went into effect in China. The Specification is not mandatory and cannot be enforced directly. Nonetheless, it could become important as a guideline or reference for Chinese administrative and enforcement agencies.
The “Specification” sets out a framework for the collection, retention, use, sharing and transfer of personal information.

The Information Security Technology – Personal Information Security Specification establishes primary rules for personal information security, notice and consent requirements, security measures, rights of data subjects and requirements relating to internal administration and management.
It distinguishes between personal information and sensitive personal information; for the latter, specific obligations apply to its collection and use.
Under the “Specification”, sensitive personal information includes, for example, personal identity information (ID card or passport number), financial information (bank account number or credit information) and biometric identifying information (fingerprint or iris data).

Even though the “Specification” is not binding, it may become significant within China because it sets benchmarks for the processing of personal information by a wide variety of entities and organizations. Companies that collect or process personal information should make sure that their practices in China comply with the “Specification”.

Category: General · Personal Data

How to conduct a Data Protection Impact Assessment (DPIA)?

9. May 2018

Pursuant to Art. 35 of the General Data Protection Regulation (GDPR), the controller shall carry out an assessment of the impact of data processing operations that take place under the controller’s responsibility. In essence, this means anticipating possible data breaches and fulfilling the requirements of the GDPR before the personal data is processed.

Even though the date of application of the GDPR (25 May 2018) is drawing ever closer, only a few of the EU Member States are well prepared. Only Austria, Belgium, Germany, Slovakia and Sweden have enacted laws implementing the new data protection rules. In addition to this legislation, the national data protection authorities have to publish guidance on how to conduct a DPIA. Pursuant to Art. 35 (4) sent. 2 GDPR, these lists relating to DPIAs should be communicated to the European Data Protection Board to ensure a consistent level of data protection across Europe. The Board, however, does not appear to be operational yet, as the Article 29 Working Party (WP29) is still the official authority.

At least Belgium and Germany have already published their DPIA recommendations, listing the processing operations for which a DPIA is required pursuant to Art. 35 (4) GDPR and the cases in which a DPIA is not required pursuant to Art. 35 (5) GDPR.

For example, the Belgian authority requires a DPIA in the following cases (a minimal screening sketch follows after the list):

  • Processing of biometric data for the unique identification of individuals in a publicly accessible space,
  • Processing of personal data obtained from a third party in order to decide whether an applicant is hired or dismissed,
  • Processing of personal data collected without the data subject’s consent (e.g. via electronic devices such as smartphones, audio and/or video devices),
  • Processing of data from medical implants, where a breach could infringe the rights and freedoms of the data subject,
  • Processing of personal data concerning vulnerable members of society (e.g. children or people with mental or physical disabilities),
  • Processing of highly personal data, such as data on financial circumstances, employability, social services involvement, private activities or the domestic situation.
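As a purely illustrative sketch, such a screening step could be recorded in a simple structure like the one below. The criteria flags mirror the Belgian examples above, while the field names and the example activity are assumptions, not part of the authority’s guidance.

```typescript
// Illustrative sketch only: screening a processing activity against DPIA triggers.
// The criteria mirror the Belgian examples above; the structure is an assumption.

interface ProcessingActivity {
  name: string;
  biometricIdentificationInPublicSpace: boolean;
  thirdPartyDataForHiringDecisions: boolean;
  dataCollectedWithoutConsent: boolean;
  medicalImplantData: boolean;
  affectsVulnerableDataSubjects: boolean;
  highlyPersonalData: boolean;
}

// A DPIA is required as soon as any one of the listed triggers applies.
function dpiaRequired(activity: ProcessingActivity): boolean {
  return (
    activity.biometricIdentificationInPublicSpace ||
    activity.thirdPartyDataForHiringDecisions ||
    activity.dataCollectedWithoutConsent ||
    activity.medicalImplantData ||
    activity.affectsVulnerableDataSubjects ||
    activity.highlyPersonalData
  );
}

const example: ProcessingActivity = {
  name: "Applicant screening via an external background check",
  biometricIdentificationInPublicSpace: false,
  thirdPartyDataForHiringDecisions: true,
  dataCollectedWithoutConsent: false,
  medicalImplantData: false,
  affectsVulnerableDataSubjects: false,
  highlyPersonalData: false,
};

console.log(`${example.name}: DPIA required? ${dpiaRequired(example)}`);
```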
Category: Article 29 WP · Belgium · Data breach · EU · GDPR