Category: General Data Protection Regulation

Update on ePrivacy Regulation

12. June 2018

The Council of the European Union’s Bulgarian presidency has released a progress report on the draft ePrivacy Regulation ahead of a Council meeting on June 8th, 2018.

The ePrivacy Regulation (Regulation on Privacy and Electronic Communications) is intended to replace the current ePrivacy Directive and was originally meant to enter into force together with the General Data Protection Regulation (GDPR) on May 25th, 2018.

The report offers several updates, including on the Regulation’s scope and its relationship to the GDPR, and on the processing of electronic communications content and metadata. The latter has been one of the main concerns of the Member States: the balance between privacy and innovation in the processing of metadata appears to be a key aspect of the ePrivacy Regulation.

Furthermore, significant changes to the privacy settings under the future Art. 10 are important to the Commission. Providers of software are only obliged to inform end-users about the settings and how to use them at the time of installation or first use, and whenever updates change the privacy settings.

The report ends with three questions for the policy debate at the TTE Council on June 8th. Among other things, the Member States are asked whether the current drafts on the permitted processing of metadata and on the protection of terminal equipment and privacy settings are an acceptable basis for moving forward.

Spanish Football League app uses microphones and GPS to detect illegal broadcasting

11. June 2018

The official smartphone app of the Spanish football league (La Liga) can activate the microphone to search for unlicensed public broadcasts of league matches. Those responsible have admitted that the app activates the microphone during league games in order to find out whether a public broadcast is taking place in the vicinity of the smartphone. In addition, the app uses GPS to determine the exact location where the audio clip was recorded. If an unlicensed public transmission is detected, the operators of the app receive a notification and can take action against those establishments.

As in other countries, Spanish establishments may only show pay-TV broadcasts of football matches on their premises with a special licence. According to the league, unlicensed screenings cause losses of around 150 million euros per year, and the data obtained will only be used to fight piracy. With the help of the app, fans are effectively recruited as “informers” to track down the offenders. The app is quite popular and has been downloaded at least 10 million times.

The practice was revealed because of the General Data Protection Regulation (GDPR), which entered into force on May 25th, 2018. The fact that the microphone permission is used for this purpose had not been explained in the terms of use, which merely said that the microphone was used for audience analysis. Due to the GDPR, the newly revised data protection declaration now states that the app uses the microphone to find out whether the user is watching football and to search for fraud. Users in Spain can revoke the permission to access the microphone at any time (on iOS and Android), but must do so in the settings of their smartphone.

Data protection risks with regard to WhatsApp and Snapchat on business phones

6. June 2018

The use of the chat services WhatsApp and Snapchat on smartphones used for business purposes will in future be forbidden for employees of the automotive supplier Continental: For data protection reasons, the employer prohibits its employees from downloading the apps. This ban affects approximately 36,000 mobile phones worldwide.

The ban is based on the fact that social media services access users’ address books and thus personal (and possibly confidential) data. Since the messenger apps do not allow this access to be restricted in their settings, Continental decided to ban the apps from company mobile phones in order to protect business partners and its own employees.

Under the current terms of use, users of WhatsApp agree to provide contact information “in accordance with applicable laws”. WhatsApp hereby shifts its data protection responsibility to its users, who in fact confirm that they have obtained a corresponding declaration of consent for data processing from every person in their address book. The social media service will be aware that this is practically impossible to guarantee.

In order to ensure an adequate level of data protection, WhatsApp would therefore need to design its default settings to comply with data protection requirements. Such a change could also benefit the company itself, as it would remove the grounds for bans of this kind, and WhatsApp could then be used on countless business smartphones.

Under the new GDPR: Complaints against Google, Instagram, WhatsApp and Facebook

1. June 2018

On the 25th of May, the day the General Data Protection Regulation (GDPR) came into force, noyb.eu filed four complaints over “forced consent” against Google (Android), Instagram, WhatsApp and Facebook.

The complaints filed by the organisation (noyb stands for “None Of Your Business”), led by Austrian activist Max Schrems, could result in penalties worth up to 7 billion euros. Schrems has been fighting Facebook over data protection issues for almost ten years; his earlier lawsuit challenged Facebook’s ability to transfer data from the European Union to the United States (“Safe Harbor”).

The activist alleged that people were not given a “free choice” whether to allow companies to use their data. Noyb.eu bases its opinion on the distinction between necessary and unnecessary data usage. “The GDPR explicitly allows any data processing that is strictly necessary for the service – but using the data additionally for advertisement or to sell it on needs the users’ free opt-in consent.” (See https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf) The organisation also claims that under Art. 7 (4) of the GDPR forced consent is prohibited.

The broadly similar complaints have been filed with authorities in various countries, regardless of where the companies have their headquarters: against Google (Android) in France (data protection authority: CNIL), with a maximum possible penalty of 3.7 billion euros although its headquarters are in the USA; against Instagram (Facebook) in Belgium (DPA); against WhatsApp in Hamburg (HmbBfDI); and against Facebook in Austria (DSB). The last three have their European headquarters in Ireland and each could face a maximum possible penalty of 1.3 billion euros.

New Austrian Data Protection Law – undermining the GDPR

8. May 2018

Austria’s governing parties passed a new law on data protection last month. This new law, which was intended to implement the requirements of the General Data Protection Regulation (GDPR), complicates the enforcement of the new EU-wide data protection rules. This development is the result of a change in policy: three years ago, Austria’s justice minister complained that the EU’s forthcoming data protection rules were too weak; now, the new government in Vienna says they are too strong.

It has been suggested that the governing parties in Vienna are trying to turn the country into a sort of ‘safe haven’ by complicating enforcement of the GDPR.

One purpose of the GDPR is, inter alia, to hand control of personal data back to the data subjects. This aim could be undermined by the new provisions regarding sanctions.

The GDPR stipulates that sanctions are imposed by DPAs without preconditions, leaving no room for specification or changes in Member States’ law. In contrast, the new Austrian data protection law contains a provision that requires warnings before sanctions are imposed on violating firms. It is to be feared that most infringements will go unpunished.

Officials of the Austrian Data Protection Authority have tried to allay these concerns: the authority will still decide on a case-by-case basis whether or not to impose administrative fines, even if it is the company’s first violation.

It remains to be seen how the new law will be applied in the future.

Application of the GDPR outside the EU

10. April 2018

When the General Data Protection Regulation (GDPR) becomes applicable on May 25th this year, the handling of personal data will have to change not only in Europe. Companies worldwide that process customer data of EU citizens also have to observe the GDPR. But which non-European legal entities have to comply with European data protection law?

In accordance with Article 3 (1) GDPR, the GDPR applies to the processing of personal data of natural persons in so far as it takes place in the context of the activities of an establishment of a controller (see Article 4 (7) GDPR) or a processor (see Article 4 (8) GDPR) in the Union. This applies irrespective of whether the processing itself takes place on EU territory or in a third country.

If the data subject is in the EU but the controller or processor is located outside the EU, the GDPR is applicable under Article 3 (2) GDPR if the processing relates to the offering of goods or services to data subjects within the EU (see Art. 3 (2) lit. a)). The GDPR likewise applies if the controller or processor monitors the behaviour of data subjects within the EU, for example through profiling (see Art. 3 (2) lit. b)).

Furthermore, the GDPR also applies outside EU territory to a controller or processor not established in the EU where the law of a Member State applies by virtue of public international law (e.g. in consular or diplomatic matters).
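The territorial-scope tests described above can be summarised as a simple decision rule: the GDPR applies if at least one of the Art. 3 triggers is met. The following is a minimal, purely illustrative sketch (not legal advice); the function and parameter names are hypothetical and each real-world assessment naturally requires legal analysis rather than boolean flags.

```python
# Illustrative sketch of the territorial-scope tests of Art. 3 GDPR.
# All names are hypothetical; this is not legal advice.

def gdpr_applies(established_in_eu: bool,
                 offers_goods_or_services_in_eu: bool,
                 monitors_behaviour_in_eu: bool,
                 member_state_law_by_public_intl_law: bool = False) -> bool:
    """Return True if at least one Art. 3 GDPR trigger is met."""
    if established_in_eu:                       # Art. 3 (1): establishment in the Union
        return True
    if offers_goods_or_services_in_eu:          # Art. 3 (2) lit. a): offering goods/services
        return True
    if monitors_behaviour_in_eu:                # Art. 3 (2) lit. b): monitoring behaviour
        return True
    if member_state_law_by_public_intl_law:     # Art. 3 (3): public international law
        return True
    return False

# Example: a US-based web shop with no EU establishment that targets EU customers.
print(gdpr_applies(established_in_eu=False,
                   offers_goods_or_services_in_eu=True,
                   monitors_behaviour_in_eu=False))  # True
```

The point of the sketch is that the four triggers are alternatives, not cumulative conditions: a single one suffices to bring a non-European entity within the scope of the Regulation.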

WP29 Guidelines on the notion of consent according to the GDPR – Part 1

26. January 2018

According to the GDPR, consent is one of the six lawful bases mentioned in Art. 6. In order for consent to be valid and compliant with the GDPR, it needs to reflect the data subject’s real choice and control.

The Working Party 29 (WP 29) clarifies and specifies the “requirements for obtaining and demonstrating” such a valid consent in its Guidelines released in December 2017.

The guidelines start off with an analysis of Article 4 (11) of the GDPR and then discuss the elements of valid consent. Referring to Opinion 15/2011 on the definition of consent, “obtaining consent also does not negate or in any way diminish the controller’s obligations to observe the principles of processing enshrined in the GDPR, especially Article 5 of the GDPR with regard to fairness, necessity and proportionality, as well as data quality.”

The WP29 illustrates the elements of valid consent: it must be freely given, specific, informed and unambiguous. For example, consent is not considered freely given if a mobile app for photo editing requires users to activate their GPS location, which is not needed for the service, simply in order to collect behavioural data. The WP29 emphasizes that consent to the processing of unnecessary personal data “cannot be seen as a mandatory consideration in exchange for performance.”

Another important aspect taken into consideration is the imbalance of power, e.g. vis-à-vis public authorities or in the context of employment. “Consent can only be valid if the data subject is able to exercise a real choice, and there is no risk of deception, intimidation, coercion or significant negative consequences (e.g. substantial extra costs) if he/she does not consent. Consent will not be free in cases where there is any element of compulsion, pressure or inability to exercise free will.”

Art. 7 (4) GDPR emphasizes that the performance of a contract may not be made conditional on consent to the processing of personal data that is not necessary for the performance of that contract. The WP29 states that “compulsion to agree with the use of personal data additional to what is strictly necessary limits data subject’s choices and stands in the way of free consent.” Depending on the scope of the contract or service, the term “necessary for the performance of a contract… …needs to be interpreted strictly”. The WP29 also gives examples of situations in which such bundling is acceptable.

If a service involves multiple processing operations or multiple purposes, the data subject should be free to choose which purposes they accept. This concept of granularity requires the purposes to be separated and consent to be obtained for each purpose.
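Granularity in this sense can be pictured as one separate, initially unchecked opt-in per purpose, each of which can be withdrawn independently. The following is a minimal sketch under that assumption; the class, purpose names and methods are hypothetical and only illustrate the structure, not any real consent-management product.

```python
# Illustrative sketch of granular, per-purpose consent as described by the WP29.
# All names are hypothetical.

from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    # Every purpose starts as False: there is no consent until the user
    # actively opts in (no pre-ticked boxes).
    purposes: dict = field(default_factory=lambda: {
        "service_delivery": False,
        "advertising": False,
        "analytics": False,
    })

    def give(self, purpose: str) -> None:
        """Record an active opt-in for a single purpose."""
        self.purposes[purpose] = True

    def withdraw(self, purpose: str) -> None:
        """Withdrawal must be possible per purpose, without detriment."""
        self.purposes[purpose] = False

    def allowed(self, purpose: str) -> bool:
        """Processing for a purpose is allowed only after explicit consent."""
        return self.purposes.get(purpose, False)


record = ConsentRecord()
record.give("advertising")
print(record.allowed("advertising"))  # True
print(record.allowed("analytics"))    # False: consent to one purpose does not
                                      # extend to the others
```

The design point is that consenting to one purpose never implies consent to another, and withdrawing one purpose leaves the remaining opt-ins untouched.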

Withdrawal of consent has to be possible without any detriment, e.g. in terms of additional costs or a downgrade of the service. Any other negative consequence, such as deception, intimidation or coercion, is also considered invalidating. The WP29 therefore advises controllers to ensure they can prove that consent has been given accordingly.

(to be continued soon in Part 2)

French Data Protection Commission threatens WhatsApp with sanctions

21. December 2017

The French National Data Protection Commission (CNIL) has found violations of the French Data Protection Act in the course of an investigation conducted to verify whether WhatsApp’s data transfer to Facebook complies with legal requirements.

In 2016, WhatsApp had announced that it would transfer data to Facebook for the purposes of targeted advertising, security and business intelligence (a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions).

Immediately after the announcement, the Working Party 29 (an independent European advisory body on data protection and privacy, set up under Article 29 of Directive 95/46/EC; hereinafter referred to as “WP29”) asked the company to stop the data transfer for targeted advertising, as French law does not provide an adequate legal basis.

“While the security purpose seems to be essential to the efficient functioning of the application, it is not the case for the ‘business intelligence’ purpose which aims at improving performances and optimizing the use of the application through the analysis of its users’ behavior.”

In the wake of the request, WhatsApp had assured the CNIL that it does not process the data of French users for such purposes.

However, the CNIL has now found that the users’ consent was not validly obtained, as it lacked two essential elements of data protection law: specificity and free choice. It also denies the existence of a legitimate interest that would preserve the fundamental rights of users, given that the application cannot be used if data subjects refuse to allow the processing.

WhatsApp has been asked to provide a sample of the French users’ data transferred to Facebook, but refused to do so because, being located in the United States, “it considers that it is only subject to the legislation of this country.”

The CNIL has therefore issued a formal notice to WhatsApp, again requesting compliance with the requirements within one month, and states:

“Should WhatsApp fail to comply with the formal notice within the specified timescale, the Chair may appoint an internal investigator, who may draw up a report proposing that the CNIL’s restricted committee responsible for examining breaches of the Data Protection Act issue a sanction against the company.”

 

WP29: Guideline for profiling and automated decision-making

19. October 2017

The Article 29 Data Protection Working Party (WP29) adopted a guideline on automated individual decision-making and profiling, both of which are addressed by the General Data Protection Regulation (GDPR). The GDPR will be applicable from 25th May 2018. WP29 acknowledges that “profiling and automated decision-making can be useful for individuals and organisations as well as for the economy and society as a whole”; “increased efficiencies” and “resource savings” are two of the examples named.

However, it was also stated that “profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards”. One risk could be that profiling may “perpetuate existing stereotypes and social segregation”.

The guideline covers, inter alia, definitions of profiling and automated decision-making as well as the GDPR’s general approach to them. It notes that the GDPR introduces provisions to ensure that the use of profiling and automated decision-making does not have an “unjustified impact on individuals’ rights”, and names examples such as “specific transparency and fairness requirements” and “greater accountability obligations”.

UK government introduced Data Protection Bill

13. October 2017

The UK government introduced the Data Protection Bill to implement the General Data Protection Regulation (GDPR – 2016/679).

The GDPR enters into force on 25th May 2018 in the European Union. After the Brexit vote, it had been unclear whether the UK would implement the GDPR into domestic law. The Data Protection Bill implements not only the legal requirements of the GDPR: the Law Enforcement Directive (2016/680) and the standards of the Council of Europe’s draft modernised Convention 108 on the processing of personal data carried out by the intelligence services will also be adopted into the new UK data protection law.

The new Law will replace the existing UK Data Protection Act 1998.

Currently the bill is at the beginning of the parliamentary process. The first reading in the House of Lords was held on 13th September, the second on 10th October. The bill consists of seven parts and 18 schedules.

Data flows between European countries and the UK should therefore not face the problems that caused concern after the Brexit vote, because the level of data protection in Europe and the UK will be equivalent.
