Category: General Data Protection Regulation

Clubhouse data protection issues

28. January 2021

Clubhouse is a new social networking app by the US company Alpha Exploration Co., available for iOS devices. Registered users can open rooms in which others can talk about various topics; participation is possible both as a speaker and as a mere listener. Rooms can be public or set up as closed groups. The moderators speak live in a room, and listeners can join the virtual room. Participants are initially muted and can be unmuted by the moderators to talk; the moderators can also mute participants or exclude them from the respective room. As of now, new users need to be invited by existing users, and these invitations became increasingly sought-after in autumn 2020 when US celebrities started to use the app. With its popularity also increasing in the EU, Clubhouse has come under criticism from a data protection perspective.

As mentioned, Clubhouse can currently only be used upon invitation. To be able to invite friends, users must share their address book with Clubhouse. In this way, Alpha Exploration can collect personal data of contacts who have not consented to the processing of their data and who do not use the app. Not only Alpha Exploration but also users may be acting unlawfully when they give the app access to their contacts, since users may themselves be responsible for the data processing associated with sharing their address books. It is therefore not only Alpha Exploration's responsibility but also the user's to ensure that consent has been obtained from the contacts whose personal data is being processed. From a data protection perspective, it is advisable not to grant the Clubhouse app access to this data unless the consent of the respective data subjects has been obtained and, ideally, documented. Currently, this data is transferred to US servers without the consent of the data subjects in the address books concerned. Furthermore, it is not apparent in what form and for what purposes the collected contact and account information of third parties is processed in the USA.

Under Clubhouse’s Terms of Service, and in many cases under national laws, users are prohibited from recording or otherwise storing conversations without the consent of all parties involved. Nevertheless, the same Terms of Service include the sentence “By using the service, you consent to having your audio temporarily recorded when you speak in a room.” According to Clubhouse’s Privacy Policy, these recordings are used to sanction violations of the Terms of Service, the Community Guidelines and legal regulations. The data is said to be deleted when the room in question is closed without any violations having been reported. Again, consent to data processing should be treated as the general rule, and such consent must be so-called informed consent. Since the scope and purpose of the storage are not apparent and are vaguely formulated, there are doubts about this. Checking one’s own platform for legal violations is in principle, if not a legal obligation in individual cases, at least a so-called legitimate interest (Art. 6 (1) (f) GDPR) of the platform operator. As long as recordings are limited to this purpose, they are compliant with the GDPR. The platform operator who records the conversations is primarily responsible for this data processing. However, users who use Clubhouse for conversations with third parties may be jointly responsible, even though they do not make recordings themselves. This is unlikely to play a major role in the private sphere, but all the more so if the app is used in a business context.

It is suspected that Clubhouse creates shadow profiles in its own network. These are profiles of people who appear in the address books of Clubhouse users but are not themselves registered with Clubhouse. This leads to oddities: numbers such as a provider’s “Mobile-Box” (voicemail) number, which appears in many users’ address books, are treated by Clubhouse as well-connected potential users. So far, there is no easy way to object to Clubhouse’s creation of shadow profiles that include name, number and potential contacts.

Clubhouse’s Terms of Service and Privacy Policy do not mention the GDPR, and there is no contact address in the EU for data protection information requests. However, this is mandatory, as personal data of EU citizens is also processed. In addition, according to Art. 14 GDPR, EU data subjects must be informed about how their data is processed, and this information must be provided before their personal data is processed, i.e. before the data subject is invited via Clubhouse and their personal data is thereby stored on Alpha Exploration’s servers. No such information is provided. There must also be a simple opt-out option; it is questionable whether one exists. Furthermore, under the GDPR, companies that process data of European citizens must designate a representative in Europe. So far, it is not apparent that Clubhouse has appointed such a representative.

The German “Verbraucherzentrale Bundesverband” (“VZBV”), the Federation of German Consumer Organisations, has issued a written warning (in German) to Alpha Exploration, complaining that Clubhouse is operated without the required imprint and that the terms of use and privacy policy are only available in English, not in German as required. The warning includes a penalty-based cease-and-desist declaration relating to Alpha Exploration’s claim to a right of extensive use of the uploaded contact information. Official responses from European data protection authorities regarding Clubhouse are currently not available. The main data protection authority in this case is the Irish Data Protection Commission.

So far, it appears that Clubhouse’s approach to data protection is based solely on the CCPA and not the GDPR. Business use of Clubhouse within the scope of the GDPR should therefore be approached with extreme caution, if at all.

Norwegian DPA intends to fine Grindr

26. January 2021

The Norwegian Data Protection Authority “Datatilsynet” (in the following “DPA”) recently announced that it intends to impose an administrative fine of € 9.6 million (NOK 100 million) on the online dating provider “Grindr LLC” (in the following “Grindr”) for violations of the GDPR.

Grindr is a popular and widely used dating app for gay, bi, trans and queer people that uses location-based technology to connect its users. Grindr therefore processes not only personal data but also sensitive data such as the users’ sexual orientation, which is subject to a particularly high level of protection under the GDPR.

The DPA came to the conclusion that Grindr transferred personal data of its users to third parties for marketing purposes without a legal basis for doing so. In particular, Grindr neither informed the data subjects in accordance with the GDPR nor obtained consent from the data subjects concerned. Datatilsynet considers this a serious case, because the users were not able to exercise real and effective control over the sharing of their data.

Datatilsynet has set a deadline of February 15th, 2021 for Grindr to submit its comments on the case and will afterwards make its final decision.

CJEU Advocate General’s opinion on GDPR’s One-Stop-Shop mechanism

On January 13, 2021, the Advocate General (“AG”) of the Court of Justice of the European Union (“CJEU”) published an opinion in the case of Facebook Ireland Limited, Facebook Inc. and Facebook Belgium BVBA v the Belgian Data Protection Authority “Gegevensbeschermingsautoriteit” (“Belgian DPA”), addressing the General Data Protection Regulation’s (“GDPR”) One-Stop-Shop mechanism.

In 2015, the Belgian DPA initiated several legal proceedings against Facebook group companies in local courts. The allegation was that Facebook placed cookies on devices of Belgian users without their consent, thereby collecting data in an excessive manner. Facebook argued that when the GDPR became applicable in 2018, the Belgian DPA lost its competence to continue the legal proceedings, as Facebook’s lead supervisory authority under the GDPR is the Irish Data Protection Commission. The Belgian Court of Appeal referred several questions to the CJEU, including whether the GDPR’s One-Stop-Shop regime prevents national DPAs from initiating proceedings in their national courts when they are not the lead DPA.

The AG responded that, in his opinion, the lead DPA has general jurisdiction over cross-border data processing, while a national DPA may exceptionally bring proceedings before its own national courts. The national DPA’s right is subject to the One-Stop-Shop regime and the cooperation and consistency mechanisms of the GDPR. Thus, each national DPA has the competence to initiate proceedings against possible infringements affecting its territory, but the significant regulatory role of the lead DPA limits this competence with respect to cross-border data processing.

One of the concerns expressed by the Belgian DPA was the risk of insufficient enforcement if only lead DPAs may act against organizations that do not comply with the GDPR. In this regard, the AG emphasizes that Art. 61 GDPR specifically provides for appropriate mechanisms to address such concerns: national DPAs can ask the lead DPA for assistance in investigations, and if such assistance is not provided, the national DPA concerned may take action itself.

In certain circumstances, the AG sees the possibility for national DPAs not acting as lead DPA to initiate proceedings before their national courts, if:

  • the DPA is acting outside of the material scope of the GDPR; e.g., because the processing does not involve personal data;
  • cross-border data processing is carried out by public authorities, in the public interest, or to comply with legal obligations;
  • the processor is not established in the EU;
  • there is an urgent need to act to protect the rights and freedoms of data subjects (Art. 66 GDPR);
  • the lead DPA has decided not to process a case.

With regards to data subjects, the AG notes that data subjects can bring action against any controller or processor before the court of their Member State and may file a complaint with their Member State’s DPA, regardless of which Member State’s DPA is the lead DPA.

The AG’s opinion is not legally binding on the CJEU, although the CJEU will take it into account. A final judgment of the CJEU is expected in the coming months. Thereafter, the Belgian Court of Appeal will have to decide its case in accordance with the CJEU’s judgment. The CJEU’s decision will most likely have a lasting impact on the division of roles between lead DPAs and other national DPAs, as well as on the ability of national DPAs to take enforcement actions into their own hands.

Swedish court confirms Google’s violations of the GDPR

16. December 2020

The Administrative Court of Stockholm announced on November 23rd, 2020, that it had rejected Google LLC’s appeal against the decision of the Swedish Data Protection Authority (Datainspektionen) determining Google’s violations of the GDPR. Google as a search engine operator had not fulfilled its obligations regarding the right to be forgotten (RTBF). However, the court reduced the fine from a total of SEK 75 million (approx. € 7,344,000) to SEK 52 million (approx. € 5,091,000).

The background to the case was the Swedish DPA’s 2017 audit of Google’s handling of delisting requests, i.e. requests to remove certain results from a search engine. The DPA concluded the inspection by ordering Google to delist certain individuals’ names due to inaccurate, irrelevant and superfluous information. In 2018, the DPA initiated a follow-up audit because of indications that Google had not fully complied with the previously issued order. This resulted in an administrative fine of SEK 75 million in March 2020.

The DPA pointed out that the GDPR increases the obligations of data controllers and data processors and strengthens the rights of individuals, which include the right to have search results delisted. However, Google had not fully complied with its obligations, as it had not properly removed two of the search result listings that the DPA had ordered it to delist. In one case, Google interpreted too narrowly which web addresses had to be removed; in the other, it failed to remove the listing without undue delay.

Moreover, the DPA criticized Google’s procedure for managing delisting requests and found it to undermine data subjects’ rights. Following the removal of a search result listing, Google notifies the website to which the link is directed. The delisting request form, directed at the data subject raising the request, states that information on the removed web addresses may be provided to the webmaster. This information has to be seen as misleading, since the data subject is led to believe that their consent to the notification is required in order for the request to be processed. Such a practice might result in individuals refraining from exercising their right to request delisting, which violates Art. 5 (1) lit. a) GDPR. Furthermore, in the DPA’s opinion the delisting notifications to webmasters are covered neither by a legal obligation according to Art. 6 (1) lit. c), 17 (2) GDPR nor by legitimate interests pursuant to Art. 6 (1) lit. f) GDPR. In addition, Google’s routine of regularly sending information to webmasters constitutes processing of personal data that is incompatible with the purpose for which the data was originally collected. This practice infringes Art. 5 (1) lit. b), 6 (4) GDPR.

Google appealed the DPA’s decision. However, the Administrative Court of Stockholm upheld the DPA’s position and confirmed Google’s violations of the GDPR.

The court stated that the process for handling delisting requests must make it easier for individuals to exercise their rights; any process that restricts those rights may violate Art. 15 through 22 GDPR. The court also specified why the personal data had been processed beyond its original purpose: since the notifications are only sent after Google has removed a search result, the purpose of the processing has already lapsed by the time the notification is sent. The notification therefore cannot be considered effective in achieving the purpose specified by Google.

Google must now delist the specific search results and cease informing webmasters of delisting requests. It must also adapt its data subject rights procedure within eight weeks after the court’s judgment has gained legal force.

Admonition for revealing a list of people quarantined in Poland

27. November 2020

The President of the Personal Data Protection Office in Poland (UODO) imposed an admonition on a waste management company liable for a data breach and ordered it to notify the data subjects concerned. The admonition is based on a breach of personal data pertaining to data subjects under medical quarantine. The company provided the city name, street name, building/flat number and the fact that the affected data subjects were under quarantine to unauthorized recipients. These recipients were required to verify whether, in a given period, waste was to be collected from the places listed in the above-mentioned list.

The incident occurred back in April 2020. At that time, a list of data subjects was made public, containing information on who had been quarantined by administrative decision of the District Sanitary-Epidemiological Station (PPIS) in Gniezno, as well as information on data subjects quarantined in connection with crossing the country border and on data subjects undergoing home isolation due to a confirmed SARS-CoV-2 infection. After becoming aware of the disclosure, the Director of PPIS notified the relevant authorities – the District Prosecutor’s Office and the President of UODO – about the incident.

PPIS informed them that it had carried out an investigation showing that it was not the source of the disclosure. The data had been provided to the District Police Headquarters, the Head of the Polish Post Office, Social Welfare Centres and the Headquarters of the State Fire Service. Given that the data had been processed by these various parties, it was necessary to establish at which of them the breach may have occurred.

UODO took steps to clarify the situation. In the course of the proceedings, it requested information from the waste management company, one of the recipients of the personal data. The company, acting as the data controller, had to explain whether, when establishing its procedures for processing personal data, it had carried out an assessment of the impact of the envisaged processing operations on the protection of personal data according to Art. 35 GDPR. Such an assessment consists of an analysis of the distribution method, in electronic and paper form, with regard to the risk of loss of confidentiality. Furthermore, the data controller had to inform UODO of the result of this analysis.

The data controller stated that it had conducted an analysis considering circumstances related to non-compliance with the applicable procedures by data processors and circumstances related to theft or removal of data. Moreover, the data controller expressed the view that the list received from the District Police Headquarters only included administrative (police) addresses and did not contain names, surnames or other data allowing the identification of a natural person. In its view, the GDPR therefore did not apply, because the data had to be regarded as anonymized. However, the list also revealed that the residents of these buildings/flats had been placed in quarantine, which made it possible to identify them. It turned out that the confidentiality of the processed data had been violated in the course of the performance of employee duties at the data processor: an employee had left the printed list on a desk without proper supervision, and during this time another employee photographed the list and shared it with another person.

Following a review of all the material collected in this case, UODO considered that the information regarding the city name, street name, building/flat number and the placing of a data subject in medical quarantine constitutes personal data within the meaning of Art. 4 (1) GDPR, while the latter constitutes a special category of personal data concerning health according to Art. 9 (1) GDPR. Based on this data it is possible to identify the data subjects, and the data controller is therefore bound by the obligations arising from the GDPR.

In the opinion of UODO, the protective measures indicated in the risk analysis are general formulations that do not refer to specific activities undertaken by authorized employees. The measures are insufficient and inadequate in view of the risks of processing special categories of data. In addition, the data controller should have considered factors such as recklessness and carelessness of employees and a lack of due diligence.

According to Art. 33 (1) GDPR, the data controller shall notify a data breach to the competent supervisory authority without undue delay and, where feasible, not later than 72 hours after having become aware of it. Moreover, where the breach results in a high risk to the rights and freedoms of the data subjects (which undoubtedly arose from the disclosure), the data controller is obliged to inform the data subjects without undue delay in accordance with Art. 34 (1) GDPR. Despite this, the company reported the infringement neither to the President of UODO nor to the data subjects concerned.

Microsoft reacts to EDPB’s data transfer recommendations

24. November 2020

Microsoft (“MS”) is among the first companies to react to the European Data Protection Board’s data transfer recommendations (please see our article), as the tech giant announced in a blog post on November 19th. MS calls these additional safeguards “Defending Your Data” and will immediately start implementing them in contracts with public sector and enterprise customers.

In light of the Schrems II ruling by the Court of Justice of the European Union (“CJEU”) of July 16th, the EDPB issued recommendations on November 17th on how to transfer data to non-EEA countries in accordance with the GDPR (please see our article). The recommendations lay out a six-step plan for assessing whether a data transfer meets GDPR standards. These steps include mapping all data transfers, assessing the third country’s legislation, assessing the tool used for transferring the data and adding supplementary measures to that tool. The latter comprise a list of technical, organizational and contractual measures to be implemented to ensure the effectiveness of the tool.

Julie Brill, Corporate Vice President for Global Privacy and Regulatory Affairs and Chief Privacy Officer at Microsoft, issued the statement, in which she declares MS to be the first company to respond to the EDPB’s guidance. The safeguards include an obligation for MS to challenge all government requests for public sector or enterprise customer data where it has a lawful basis for doing so; to try to redirect data requests; and, where legally allowed, to promptly notify the customer about any data request by an authority concerning that customer. This was one of the main EDPB recommendations and is also included in the draft for new Standard Contractual Clauses published by the European Commission on November 12th. MS further announces that it will compensate customers monetarily if their personal data has to be disclosed in response to government requests. These changes are additions to the SCCs MS has been using since Schrems II, which (as MS states) already provide for data encrypted to a high standard in transit and at rest and for transparency regarding government access requests to data (“U.S. National Security Orders Report”, dating back to 2011; “Law Enforcement Requests Report”).

Recently, European authorities have criticized MS, and especially its Microsoft 365 (“MS 365”, formerly Office 365) tools, for not being GDPR compliant. In July 2019, the Ministry of Justice in the Netherlands issued a Data Protection Impact Assessment (DPIA) warning authorities not to use Office 365 ProPlus, Windows 10 Enterprise, Office Online or Office Mobile, since they do not comply with GDPR standards. The European Data Protection Supervisor issued a warning in July 2020 stating that the use of MS 365 by EU authorities and the contracts between EU institutions and MS do not comply with the GDPR. Also, the German Data Security Congress (“GDSC”) issued a statement in October in which it declared MS 365 not to be compliant with the GDPR. The GDSC is a board made up of the regional data protection authorities of all 16 German states and the federal data protection authority. The declaration was reached by a narrow vote of 9 to 8, and some of the 8 dissenting regional authorities later even issued a press release explaining why they voted against it. They criticized the lack of involvement and hearing of MS during the process, the GDSC’s reliance on MS’ Online Service Terms and Data Processing Addendum dating back to January 2020, and the declaration as being too undifferentiated.

Some of the German data protection authorities that opposed the GDSC’s statement were quick to welcome the new developments in a joint press release. However, they stress that the main issues in data transfers from the EU to the U.S. are still not solved. In particular, the CJEU’s main reservations regarding the mass monitoring of data streams by U.S. intelligence agencies (such as the NSA) are hard to prevent or compensate for. Still, they announced that the GDSC would resume its talks with MS before the end of 2020.

This quick reaction to the EDPB recommendations should bring some calm into the discussion surrounding MS’ GDPR compliance. It will most likely help MS’ case, especially with the German authorities, and might even lead to a prompt resolution of a conflict regarding tools that are omnipresent in workplaces all over the globe.

China issued new Draft for Personal Information Protection Law

23. November 2020

At the end of October 2020, China issued a draft for a new “Personal Information Protection Law” (PIPL). The draft introduces a comprehensive data protection regime that appears to have taken inspiration from the European General Data Protection Regulation (GDPR).

With the new draft, China’s data protection framework will consist of the Cybersecurity Law, the Data Security Law (draft) and the draft PIPL. The draft legislation addresses issues presented by new technologies and applications in around 70 articles. The fines provided for non-compliance are substantial and will have a significant impact on companies with operations in China or targeting China as a market.

The data protection principles set out in the draft PIPL include transparency, fairness, purpose limitation, data minimization, limited retention, data accuracy and accountability. The topics covered include personal information processing, the cross-border transfer of personal information, the rights of data subjects in relation to data processing, the obligations of data processors, the authority in charge of personal information and legal liabilities.

Unlike China’s Cybersecurity Law, which provides limited extraterritorial application, the draft PIPL proposes clear and specific extraterritorial application to overseas entities and individuals that process the personal data of data subjects in China.

Further, the definitions of “personal data” and “processing” under the draft PIPL are very similar to their equivalents under the GDPR. Organizations or individuals outside China that fall within the scope of the draft PIPL are also required to set up a dedicated organization or appoint a representative in China, and to report the relevant information about that organization or representative to the Chinese regulators.

In comparison to the GDPR, the draft PIPL extends the notion of “sensitive data” to also include nationality, financial accounts and personal whereabouts. However, sensitive personal information is defined as information that, once leaked or abused, may cause damage to personal reputation or seriously endanger personal and property safety, which leaves room for further interpretation.

The draft legislation also regulates cross-border transfers of personal information, which shall be possible if the transfer is certified by recognized institutions or the data processor executes a cross-border transfer agreement with the recipient located outside of China, ensuring that the processing meets the protection standard provided under the draft PIPL. Where the data processor is categorized as a critical information infrastructure operator or the volume of data processed exceeds the level stipulated by the Cyberspace Administration of China (CAC), the cross-border transfer of personal information must pass a security assessment conducted by the CAC.

It should further be kept in mind that the draft PIPL enlarges the range of penalties beyond those provided in the Cybersecurity Law, which significantly increases the liability exposure of controllers operating in China.

The period for public comments on the draft legislation has now ended, but the next steps have not yet been announced, and it is not yet clear when the legislation will come into effect.

EDPB adopts first decision under Art. 65 GDPR

20. November 2020

During its 41st plenary session, the European Data Protection Board (EDPB) adopted, by a two-thirds majority of its members, its first dispute resolution decision under Art. 65 GDPR, concerning Twitter International Company. The binding decision aims to resolve a dispute arising from a draft decision by the Irish supervisory authority, the lead supervisory authority in this case, and the subsequent relevant and reasoned objections raised by several concerned authorities.

The Irish supervisory authority prepared the draft decision following an own-initiative investigation into Twitter International Company, after the company had notified it of a personal data breach on January 8th, 2019. In accordance with Art. 60 (3) GDPR, the Irish supervisory authority submitted its draft decision to the other concerned authorities in May 2020, which then had four weeks to raise their objections. The objections referred to, inter alia, the violations of the GDPR identified by the lead supervisory authority, the role of Twitter International Company as the sole data controller, and the quantification of the proposed fine.

Because the lead supervisory authority rejected the objections and/or considered them not to be “relevant and reasoned”, it submitted the matter to the EDPB pursuant to Art. 60 (4) GDPR, thus initiating the dispute resolution procedure.

Thereupon, the completeness of the file was assessed, which led to the initiation of the procedure under Art. 65 GDPR on September 8th, 2020. In accordance with Art. 65 (3) GDPR in conjunction with Art. 11.4 of the EDPB Rules of Procedure, the default time period of one month was extended by a further month on account of the complexity of the subject-matter.

On November 9th, 2020, the EDPB adopted its binding decision and will shortly notify it to the Irish supervisory authority, which will in turn issue a final decision. That decision will be addressed to the data controller without undue delay and at the latest one month after the EDPB has notified its decision. In compliance with Art. 65 (6) GDPR, the lead supervisory authority shall inform the EDPB of the date on which its final decision is notified to the controller. After that, the EDPB decision will be published on the EDPB’s website.

European Commission issues draft on Standard Contractual Clauses

18. November 2020

A day after the European Data Protection Board (EDPB) issued its recommendations on supplementary measures, the European Commission issued, on November 12th, a draft for new Standard Contractual Clauses (SCCs) for data transfers to non-EU countries (third countries). The draft is open for feedback until December 10th, 2020, and provides for a 12-month transition period during which companies are to implement the new SCCs. These SCCs are supposed to assist controllers and processors in transferring personal data from an EU country to a third country by implementing measures that guarantee GDPR standards and take into account the Court of Justice of the European Union’s (CJEU) “Schrems II” ruling.

The Annex includes modular clauses suitable for four different data transfer scenarios: (1) controller-to-controller transfers; (2) controller-to-processor transfers; (3) processor-to-processor transfers; (4) processor-to-controller transfers. The latter two scenarios are newly introduced in these SCCs. Since the clauses in the Annex are modular, they can be mixed and matched into a contract fitting the situation at hand. Furthermore, more than two parties can adhere to the SCCs, and the modular approach even allows additional parties to accede later on.

The potential for government access to personal data is explicitly addressed, since this was a main issue following the “Schrems II” ruling. These concerns are met by clauses that set out how the data importer must react when laws of the third country impinge on its ability to comply with the contract, especially the SCCs, and how it must react in case of government interference. These measures include notifying the data exporter and the data subject of any government interference, such as legally binding requests for access to personal data, and, where possible, sharing further information on these requests on a regular basis, documenting them and challenging them legally. Termination clauses have been added for cases where the data importer can no longer comply, e.g. because of changes in the third country’s law.

Further clauses address matters such as data security, transparency, accuracy and onward transfers of personal data, all issues that were already covered in the older SCCs but are now being updated.

Poland: Addresses of judges, politicians and pro-life activists published on Twitter

12. November 2020

In recent days, social networks in Poland have teemed with posts containing the private addresses and telephone numbers of judges of the Constitutional Tribunal, politicians and activists openly supporting the Tribunal’s abortion judgment. In response to the publication of this data on Twitter, the President of the Personal Data Protection Office (UODO) took immediate steps to protect the personal data and privacy of these persons.

The background to this was the judgment of the Constitutional Tribunal repealing the provisions allowing abortion in cases of, for example, serious genetic defects or severe impairment of the fetus. The judgment provoked resistance from part of Polish society and led to street protests by “liberal” men and women. Unfortunately, the agitation turned into invective, destruction of property, public disorder and personal attacks. As a result, personal data of people supporting the prohibition of abortion was shared thousands of times on social media. Numerous protesters subsequently appeared at the published addresses, covered the walls of surrounding buildings with vulgar inscriptions, and the addressees began to receive packages, e.g. containing sets of hangers.

On October 29th, 2020 the President of the UODO responded to the case:

Publishing private addresses and contact details of pro-life activists, politicians and judges by users of the Twitter social network is an action leading to the disclosure of a wide sphere of privacy, and thus posing threats to health and life, such as possible acts of violence and aggression directed against these people and their family members.

The announcement stated that the President of the UODO requested immediate action by the Irish supervisory authority, which is responsible for supervising the processing of personal data by Twitter. Pointing out the enormous scale of the threats, he indicated the need to verify the response time to reported irregularities and the possibility of introducing automated solutions to prevent the rapid spread of such content by other users of the platform. He also notified the law enforcement authorities that Twitter users had committed a crime consisting in the processing of personal data without a legal basis. The lawfulness of the processing was guaranteed neither by consent according to Art. 6 (1) lit. a GDPR nor by legitimate interests pursuant to Art. 6 (1) lit. f GDPR or any other legal basis; the processing must therefore be regarded as unlawful, as the President of the UODO also stated. The law enforcement authorities will be obliged both to examine and document the scope of the personal data disclosed in violation of the principles of personal data protection and to determine the group of entities responsible for the unlawful processing. The President of the UODO also applied to the Minister of Justice – Public Prosecutor General to place this case under special supervision due to the escalation of conflict and aggression, which poses a high risk to the vital interests of both the people whose data was published on social media and their family members.

In conclusion, the President of the UODO added:

The intensification of actions of all competent authorities in this matter is necessary due to the unprecedented nature of the violations and the alarming announcements of disclosing the data of more people, as well as the deepening wave of aggression.
