Tag: GDPR
26. April 2019
Recently, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) published six recommendations for companies to take into account when outlining their privacy policies for the purpose of Art. 24 para 2 of the General Data Protection Regulation (the “GDPR”).
The authority’s recommendations are the result of its investigation of companies’ privacy policies, which focused on companies that mainly process special categories of personal data, e.g. health data or data relating to individuals’ political beliefs.
The Dutch DPA reviewed the privacy policies of several organisations, such as blood banks and local political parties, and focused on three main points: 1) the description of the categories of personal data, 2) the description of the purposes of the processing and 3) the information about data subjects’ rights. It found that the descriptions of the data categories and purposes were incomplete or too superficial and therefore released six recommendations that companies should take into consideration when outlining privacy policies.
Those are the six recommendations:
- Companies should evaluate whether they have to implement privacy policies (taking into account the nature, scope, context and purposes of the processing, as well as the risks for the rights and freedoms of natural persons)
- Companies should consult internal and/or external expertise such as data protection officers when implementing privacy policies
- The policy should be outlined in a single document to avoid fragmentation of information
- The policy should be concrete and specific and should therefore not merely repeat the provisions of the GDPR
- The DPA recommends publishing the privacy policy so that data subjects are aware of how the company handles personal data
- The DPA also suggests drafting a privacy policy even where it is not mandatory, in order to demonstrate that the company is committed to protecting personal data
9. April 2019
The French data protection authority CNIL has published a model regulation setting out the conditions under which devices for access control through biometric authentication may be introduced at the workplace.
Pursuant to Article 4 paragraph 14 of the General Data Protection Regulation (GDPR), biometric data are personal data relating to the physical, physiological or behavioural characteristics of a natural person, obtained by means of specific technical processes, which enable or confirm the unambiguous identification of that natural person. According to Article 9 paragraph 4 GDPR, the member states of the European Union may introduce or maintain additional conditions, including restrictions, as far as the processing of biometric data is concerned.
The basic requirement under the model regulation is that the controller proves that biometric data processing is necessary. To this end, the controller must explain why the use of other means of identification or organisational and technical safeguards is not appropriate to achieve the required level of security.
Moreover, the choice of biometric types must be specifically explained and documented by the employer. This also includes the justification for the choice of one biometric feature over another. Processing must be carried out for the purpose of controlling access to premises classified by the company as restricted or of controlling access to computer devices and applications.
Furthermore, the model regulation of the CNIL describes which types of personal data may be collected, which storage periods and conditions apply and which specific technical and organisational measures must be taken to guarantee the security of personal data. In addition, CNIL states that before implementing the data processing, the controller must always carry out an impact assessment of the risks to the rights and freedoms of the individuals concerned. This risk assessment must be repeated every three years for updating purposes.
The data protection authority also points out that the model regulation does not exempt controllers from compliance with the GDPR, since it is not intended to replace its provisions, but to supplement and specify them.
25. March 2019
On 21 March 2019, Maciej Szpunar, Advocate General of the European Court of Justice, delivered his Opinion in the case of Planet49 GmbH against Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband e.V. (Federal Association of Consumer Organisations). In the Opinion, Szpunar explains how valid consent for the use of cookies must be obtained.
In the case in question, Planet49 GmbH had organised a promotional lottery on the internet. When users registered to participate in the lottery, two checkboxes appeared. The first checkbox, which did not contain a pre-selected tick, concerned permission for sponsors and cooperation partners to contact the participant in order to inform him of their offers. The second checkbox, which was pre-ticked, concerned consent to the setting of cookies that evaluate the user’s surfing and usage behaviour.
The Federal Association held that the clauses used infringed German law, in particular Article 307 of the BGB, Article 7(2), point 2, of the UWG and Article 12 et seq. of the TMG, and filed a lawsuit in 2014 after an unsuccessful warning letter.
After passing through the lower instances, the case reached the German Federal Court of Justice in 2017. The Federal Court of Justice considers that the outcome of the case depends on the interpretation of Articles 5(3) and 2(f) of Directive 2002/58, read in conjunction with Article 2(h) of Directive 95/46, and of Article 6(1)(a) of Regulation 2016/679. For that reason, it referred the following questions to the European Court of Justice for a preliminary ruling:
(1) Does consent given on the basis of a pre-ticked box meet the requirements for valid consent under the ePrivacy Directive, the EU Data Protection Directive and the EU General Data Protection Regulation (the GDPR)?
(2) What information does the service provider have to provide to the user and does this include the duration of the use of cookies and whether third parties have access to the cookies?
According to the Advocate General, there is no valid consent if the checkbox is already ticked. In that case, the user would have to remove the tick, i.e. become active, in order to refuse the use of cookies, which contradicts the requirement of an active act of consent by the user. It is necessary for the user to explicitly consent to the use of cookies. It is therefore also not sufficient if a single checkbox covers both the use of cookies and participation in the lottery; consent must be given separately, as otherwise the user is not in a position to give consent freely.
In addition, Szpunar explains that the user must be provided with clear and comprehensive information that enables him or her to easily assess the consequences of the consent. This requires that the information provided is unambiguous and leaves no room for interpretation. For this purpose, the information must include details such as the duration of the operation of the cookies and whether third parties have access to them.
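The practical upshot for website operators can be illustrated with a small sketch. The following TypeScript fragment is not taken from the Opinion; it is a minimal, hypothetical example (assuming a browser environment, with invented element IDs, cookie name and lifetime) of a sign-up form that keeps lottery participation and cookie consent separate, leaves the cookie checkbox unticked by default and only sets a tracking cookie after an active opt-in.

```typescript
// Illustrative sketch only – element IDs, cookie name and lifetime are assumptions.
interface ConsentChoices {
  lotteryParticipation: boolean; // separate from cookie consent
  trackingCookies: boolean;      // must default to false (no pre-ticked box)
}

// Read the two separate, initially unticked checkboxes from the form.
function readConsentForm(): ConsentChoices {
  const lottery = document.querySelector<HTMLInputElement>("#consent-lottery");
  const tracking = document.querySelector<HTMLInputElement>("#consent-tracking");
  return {
    lotteryParticipation: lottery?.checked ?? false,
    trackingCookies: tracking?.checked ?? false,
  };
}

function applyConsent(choices: ConsentChoices): void {
  // A tracking cookie is only set after an explicit, active opt-in; the notice
  // shown next to the checkbox would state its duration and any third-party access.
  if (choices.trackingCookies) {
    document.cookie = "tracking_id=example; max-age=" + 60 * 60 * 24 * 30 + "; path=/";
  }
  // Lottery participation is processed independently of the cookie decision.
}
```

Keeping the two purposes in separate, unticked checkboxes mirrors the two requirements highlighted in the Opinion: consent must be active and it must be given separately for each purpose.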
11. March 2019
The Dutch data protection authority, Autoriteit Persoonsgegevens, clarified on 7 March 2019 that websites must remain accessible even when tracking cookies are not accepted. Websites that grant users access only if they agree to the use of tracking cookies or other similar means of tracking and recording their behaviour do not comply with the General Data Protection Regulation (GDPR).
The Dutch DPA’s decision was prompted by numerous complaints from website users who no longer had access to the websites after refusing the usage of tracking cookies.
The Dutch DPA noted that the use of tracking software is generally allowed. Tracking the behaviour of website users, however, must be based on valid consent. In order to be compliant with the GDPR, permission must be given freely. In the case of so-called cookie walls, the user has no access to the website unless he agrees to the setting of cookies; in this way, pressure is exerted on the user to disclose his personal data. Under the GDPR, however, consent has not been given freely if no free or genuine choice exists.
With the publication of this clarification, the Dutch DPA calls on organisations to make their practices compliant with the GDPR. The DPA has already written to the organisations about which users complained the most. In addition, it announced that it would intensify its monitoring in the near future in order to examine whether the standard is applied correctly in the interest of data protection.
25. February 2019
The European Data Protection Board (EDPB) has published an information note addressed to organisations to explain data transfers to the UK and to facilitate preparation in the event that no agreement is reached between the EEA and the UK. In the case of a no-deal Brexit, the UK becomes a third country for which – as things stand at present – no adequacy decision exists.
EDPB recommends that organisations transferring data to the UK carry out the following five preparation steps:
• Identify what processing activities will imply a personal data transfer to the UK
• Determine the appropriate data transfer instrument for your situation
• Implement the chosen data transfer instrument to be ready for 30 March 2019
• Indicate in your internal documentation that transfers will be made to the UK
• Update your privacy notice accordingly to inform individuals
In addition, EDPB explains which instruments can be used to transfer data to the UK:
– Standard or ad hoc Data Protection Clauses approved by the European Commission can be used.
– Binding Corporate Rules for data processing can be defined.
– A code of conduct or certification mechanism can be established.
Derogations are possible in the cases mentioned by article 49 GDPR. However, they are interpreted very restrictively and mainly relate to processing activities that are occasional and non-repetitive. Further explanations on available derogations and how to apply them can be found in the EDPB Guidelines on Article 49 of GDPR.
The French data protection authority CNIL has published an FAQ based on the information note of the EDPB, explaining the consequences of a no-deal Brexit for the data transfer to the UK and which preparations should be made.
6. February 2019
The European Commission recently published an infographic on the impact of the General Data Protection Regulation (GDPR) since its entry into force on May 25, 2018. The graphic looks at compliance, enforcement and awareness of the GDPR. It illustrates, inter alia, that:
- In total, Data Protection Authorities received 95.180 complaints from individuals who believe their rights under the GDPR have been violated. Most of the complaints were related to CCTV, telemarketing or promotional e-mails.
- By January, the number of data breach notifications had risen to 41.502. Data controllers have to notify data breaches to their national supervisory authority within 72 hours.
- Data Protection Authorities have initiated 225 investigations in cross border cases.
- In Europe, 23 countries have adopted their national data protection law since the GDPR came into force. Bulgaria, Greece, Slovenia, Portugal and the Czech Republic are still in the process of doing so.
- So far, three fines have been issued under the GDPR. In Germany, a social network operator was fined € 20.000 for not securing its users’ data. In France, Google was fined € 50 million for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization (we reported), and in Austria, a sports betting café was fined € 5.280 for unlawful video surveillance.
28. January 2019
The European Commission adopted an adequacy decision for Japan on the 23rd of January 2019, enabling data flows to take place freely and safely. The exchange of personal data is based on strong safeguards that Japan has put in place in advance of the adequacy decision to ensure that the transfer of data complies with EU standards.
The additional safeguards include:
– A set of rules (Supplementary Rules) that bridge the differences between the two data protection systems. These should strengthen the protection of sensitive data, the exercise of individual rights and the conditions under which EU data can be further transferred to another third country. The additional rules are binding in particular on Japanese companies importing data from the EU. They can also be enforced by the independent Japanese data protection authority (PPC) as well as by courts.
– Also, safeguards have been established concerning access by Japanese authorities for law enforcement and national security purposes. In this regard, the Japanese Government has given assurances to the Commission and has ensured that the use of personal data is limited to what is necessary and proportionate and is subject to independent supervision and redress.
– A complaint handling mechanism to investigate and resolve complaints from Europeans regarding Japanese authorities’ access to their data. This new mechanism will be managed and monitored by Japan’s independent data protection authority.
The adequacy decision has been in force since 23rd of January 2019. After two years, the functioning of the framework will be reviewed for the first time. The subsequent reviews will take place at least every four years.
The adequacy decision also complements the EU-Japan Economic Partnership Agreement, which will enter into force in February 2019. European companies will benefit from free data flows as well as privileged access to a market of 127 million Japanese consumers.
25. January 2019
On 21st of January 2019, the French Data Protection Authority CNIL imposed a fine of € 50 million on Google for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.
On 25th and 28th of May 2018, CNIL received complaints from the associations None of Your Business (“NOYB”) and La Quadrature du Net (“LQDN”). The associations accused Google of not having a valid legal basis to process the personal data of the users of its services.
CNIL carried out online inspections in September 2018, analysing a user’s browsing pattern and the documents he could access.
The restricted committee of the CNIL first noted that the information provided by Google is not easily accessible to a user. Essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, is spread across multiple documents. The user only receives the relevant information after carrying out several steps, sometimes up to six. The scheme chosen by Google is therefore not compatible with the General Data Protection Regulation (GDPR). In addition, the committee noted that some of the information was unclear and not comprehensive: it does not allow the user to fully understand the extent of the processing carried out by Google. Moreover, the purposes of the processing are described too generally and vaguely, as are the categories of data processed for these purposes. Finally, the user is not informed about the storage periods of some data.
Google has stated that it always seeks the consent of users, in particular for the processing of data to personalise advertisements. However, CNIL found that the consent was not valid. On the one hand, the consent was based on insufficient information. On the other hand, it was neither specific nor unambiguous, as the user gives his or her consent for all processing purposes at once, although the GDPR provides that consent has to be given specifically for each purpose.
This is the first time CNIL has imposed a penalty under the GDPR. The authority justified the amount of the fine with the gravity of the violations of essential principles of the GDPR: transparency, information and consent. Furthermore, the infringement was not a one-off, time-limited incident, but a continuous breach of the Regulation. In this regard, according to CNIL, the application of the new GDPR sanction limits is appropriate.
Update: Google has since appealed the decision, so a court will have to rule on the fine in the near future.
22. January 2019
On Wednesday, 16th January 2019, negotiators from the EU Parliament and the member states agreed that European political parties or foundations can be sanctioned for data protection breaches during election campaigns. The regulation is intended to prevent undue influence on the forthcoming European elections in May. It was decided that in such cases the institutions concerned will in future have to pay up to five percent of their annual budget.
One of the reasons for the new regulation was the data scandal surrounding Facebook and Cambridge Analytica. During the US election campaign, Cambridge Analytica gained unauthorized access to the data of millions of Facebook users. With this data, Cambridge Analytica is said to have tried to deter potential Clinton supporters from voting and to mobilise Trump voters by means of advertising and posts (we reported).
In future, data protection violations that are deliberately accepted in order to influence the outcome of the European elections will be severely sanctioned. National supervisory authorities are to decide whether a party has committed such a violation. The Authority for European Political Parties and European Political Foundations must then review the decision and, if necessary, impose the appropriate sanction. Moreover, those found to be in breach will not be able to apply for funds from the general budget of the European Union in the year in which the fine is imposed.
The text agreed on Wednesday still has to be formally adopted by the Parliament and the Council of Member States.
18. January 2019
At the end of last year, the French Data Protection Authority (“Commission Nationale de l’Informatique et des Libertés”, the “CNIL”) published guidance on sharing data with business partners or third parties. The CNIL stated that many companies that collect data from individuals transfer this data to “business partners” or other organisations, in particular to send prospecting e-mails. In the case of such a transmission, the data subjects must maintain control over their personal data.
The published guidance sets out the following five requirements (a minimal sketch of the consent records they imply follows the list):
• Prior consent: Before sharing data with business partners or third parties such as data brokers, organisations must request the individual’s consent.
• Identification of the partners: The individuals must be informed of the specific partner(s) who may receive the data. According to the CNIL’s guidance, the organisation can either publish a complete and updated list of its partners directly on the data collection form or, if such a list would be too long, include a link to the list in the collection form, together with links to the partners’ respective privacy policies.
• Information about changes to the list of partners: The organisations have to notify the individuals of any changes to the list of partners, especially if they may share the data with new partners. To this end, they may provide an updated list of their partners within each marketing message sent to the individual, and each new partner that receives the individual’s data must inform him or her of such processing in its first communication to the data subject.
• No “transfer” of the consent: Companies may not share the information they receive with their own partners without obtaining the consent of individuals, in particular with regard to the identity of new companies that would become recipients of the subject’s data.
• Information to be provided by the partner(s): A partner who received the individual’s data for its own marketing purposes must inform the data subject of the origin of the data (the name of the organisation who shared it) and of his or her data subject rights, in particular the right to object to the processing.
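To make the record-keeping these requirements imply more tangible, here is a minimal, hypothetical TypeScript sketch. It is not part of the CNIL guidance; the type, field and partner names are invented. The idea is simply that a consent record is tied to the partners named at the time of collection, so data may only be shared with a partner that was actually listed, and a new partner triggers a fresh consent request.

```typescript
// Hypothetical consent record for partner data sharing – illustrative only,
// the structure and field names are not prescribed by the CNIL guidance.
interface PartnerConsentRecord {
  dataSubjectId: string;                // internal identifier of the individual
  consentGivenAt: Date;                 // when the consent was collected
  partnersNamedAtCollection: string[];  // the partners listed on the collection form
  privacyNoticeVersion: string;         // version of the notice shown to the individual
}

// Sharing is only permitted with a partner that was named when consent was obtained;
// any new partner requires fresh consent from the individual.
function mayShareWith(record: PartnerConsentRecord, partner: string): boolean {
  return record.partnersNamedAtCollection.includes(partner);
}

// Example: the consent on record covers "Partner A" and "Partner B" only.
const record: PartnerConsentRecord = {
  dataSubjectId: "subject-001",
  consentGivenAt: new Date("2019-01-10"),
  partnersNamedAtCollection: ["Partner A", "Partner B"],
  privacyNoticeVersion: "2019-01",
};

console.log(mayShareWith(record, "Partner A")); // true
console.log(mayShareWith(record, "Partner C")); // false – new consent would be needed
```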