Category: Personal Data

Privacy Activist Schrems unleashes 101 Complaints

21. September 2020

Lawyer and privacy activist Maximilian Schrems has become known for his legal actions leading to the invalidation of “Safe Harbor” in 2015 and of the “EU-U.S. Privacy Shield” this year (we reported). Following the landmark court decision on the “EU-U.S. Privacy Shield”, Schrems recently announced on the website of his NGO “noyb” (none of your business) that he has filed 101 complaints against 101 European companies in 30 different EU and EEA countries with the responsible Data Protection Authorities. Schrems thereby exercised the right under Art. 77 GDPR, which allows every data subject to lodge a complaint with a supervisory authority if he or she considers that the processing of personal data relating to him or her infringes the Regulation.

The complaints concern the companies’ continued use of Google Analytics and Facebook Connect, which transfer personal data about each website visitor (at least the IP address and cookie data) to Google and Facebook, which reside in the United States and fall under U.S. surveillance laws such as FISA 702. Schrems also published a list of the 101 companies, which include Sky Deutschland, the University of Luxembourg and the Cyprus Football Association. With his symbolic action against 101 companies, Schrems wanted to draw attention to the widespread inactivity among companies that still do not take the data protection rights of individuals seriously despite the recent ruling by the Court of Justice of the European Union.
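
To illustrate the mechanism at issue: when a website embeds a third-party analytics or social login script, the visitor’s browser itself contacts the provider’s servers, so the provider necessarily receives the visitor’s IP address from the connection, along with any identifying cookies. The following TypeScript sketch is purely illustrative; the endpoint, parameter names and cookie name are assumptions of ours and not Google’s or Facebook’s actual interfaces.

  // Illustrative sketch only: the endpoint, parameter names and cookie name
  // are assumptions, not the actual Google Analytics or Facebook Connect API.
  // It shows why embedding a third-party tracker discloses visitor data:
  // the visitor's browser itself contacts the provider's (U.S.-hosted) servers.

  function readCookie(name: string): string | undefined {
    // The provider's long-lived client ID cookie, if already set.
    return document.cookie
      .split("; ")
      .find((c) => c.startsWith(name + "="))
      ?.split("=")[1];
  }

  function sendPageViewBeacon(): void {
    const params = new URLSearchParams({
      page: location.href,                        // which page was visited
      clientId: readCookie("_example_cid") ?? "", // pseudonymous cookie ID
      // Note: the visitor's IP address is not set here at all -- the
      // provider sees it automatically from the HTTP connection itself.
    });
    // Hypothetical U.S.-hosted collection endpoint.
    navigator.sendBeacon("https://collector.example-analytics.com/collect", params);
  }

  sendPageViewBeacon();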

In response, the European Data Protection Board (“EDPB”) has set up a “task force” to handle complaints against European companies using Google Analytics and Facebook services. The task force shall analyse the matter and ensure close cooperation among the members of the Board, which consists of all European supervisory authorities as well as the European Data Protection Supervisor.

Brazil Update: Rapid Developments regarding Brazil’s LGPD come with legal Uncertainty

28. August 2020

Earlier this year, in April, the President of Brazil issued Provisional Measure #959/2020, which dealt with emergency measures in the face of the ongoing corona crisis. The Provisional Measure (“PM”) not only set rules for the federal banks’ payments of benefits to workers affected by the reduction in salary and working hours and the temporary suspension of employment due to the pandemic, but also postponed the effective date of Brazil’s first Data Protection Law (“LGPD”) from 14 August 2020 to 3 May 2021 (we reported).

In Brazil, PMs serve as temporary law and are valid for a maximum period of 120 days, during which both chambers of the National Congress must approve the PM for it to become permanent law.

As the 120-day period was coming to an end, the House of Representatives approved the PM on 25 August 2020, but included an amendment delaying the effective date only to 31 December 2020. One day later, on 26 August 2020, the Senate approved the PM, but provided yet another amendment that does not delay the LGPD’s effective date at all. Instead, the Senate’s amendment postulates that violations of the LGPD shall not be sanctioned by the Data Protection Authority until 1 August 2021. Thus, neither the House of Representatives’ postponement to 31 December 2020 nor the President’s initial postponement to 3 May 2021 was approved. This development came as a great surprise, because in April Brazil’s Senate itself had introduced Law Bill “PL 1179/2020”, which aimed at postponing the effective date of the LGPD to 1 January 2021.

As a result, the LGPD will become effective very soon. Given the rapid developments regarding the LGPD, legal commentators from Brazil are still uncertain as to when exactly the law will take effect. They report that the law will become effective either when the President signs it into law or retroactively on 14 August 2020. In any case, many Brazilian businesses are reportedly not ready for the LGPD while also facing a very difficult economic environment, as Brazil is suffering from the consequences of the pandemic.

Moreover, Brazilian businesses are also facing legal uncertainty because Brazil’s national Data Protection Authority (“ANPD”) is still not fully functional. It was only on 26 August 2020 that Brazil’s President passed Decree 10.474 to establish the ANPD. However, the new Data Protection Law assigns the ANPD many vital responsibilities that it has not yet been able to fulfil, because it had not been established. These responsibilities include

  • Recognising good practices and best-in-class examples of accountable privacy programs,
  • Establishing rules, procedures and guidance for organisations as required by the LGPD,
  • Clarifying LGPD provisions,
  • Providing technical standards to organisations, and
  • Enabling international transfers of personal data.

While the recent developments and the current state of the national Data Protection Authority suggest a rocky road ahead for Brazil’s privacy landscape, the fundamental milestones of making the LGPD effective and establishing the ANPD have now been reached. At the same time, Brazilian businesses can draw hope from the fact that they have until 1 August 2021 to become compliant before sanctions apply.

Irish DPC to assess TikTok’s plans for opening Data Centre in Ireland

13. August 2020

The short video app TikTok is planning to establish a data centre in Ireland under the One Stop Shop (OSS) data processing mechanism, the Irish Data Protection Commission has stated.

However, the company first needs to be assessed to determine whether it meets the requirements of the OSS.

The OSS rules, introduced under the General Data Protection Regulation (GDPR), mean that companies meeting the criteria can make the Irish Data Protection Commission their lead supervisory authority and would not have to deal with regulators in each EU member state, but could instead be monitored by a lead regulator in one state. This would benefit the company in that, if something happens, there would be one investigation, one decision and one appeal, rather than one for each country affected.

These plans come at a time when the popular app is facing considerable criticism, however. Not only is TikTok on the verge of being banned in the United States, but numerous doubts regarding its handling of user data have also surfaced in the past few months.

Last week, the Beijing Internet Court ruled against both Tencent Holdings and TikTok’s owner ByteDance in cases alleging the misuse of user data. In Tencent’s case, data was shared without consent between the WeRead and WeChat apps, violating the users’ privacy.

The move to establish a data centre in Ireland “will create hundreds of new jobs and play a key role in further strengthening the safeguarding and protection of TikTok user data with a state of the art physical and network security defense system planned around this new operation,” stated the company’s Global Chief Information Security Officer, Roland Cloutier.

Following the example of other big tech giants in recent years, TikTok plans to open the data centre by 2022. The Irish Data Protection Commissioner stated that the examination for the OSS mechanism is currently underway.

South Africa’s Data Protection Act comes into force

9. July 2020

On July 1, 2020, South Africa’s Protection of Personal Information Act 2013 finally came into effect. The Act had been in planning for the last seven years, with parts of it already published in 2014, and will fully come into effect, including its oversight provisions, in June 2021, allowing for a 12-month period to enable companies to become compliant with the new regulations.

Due to the long planning period, most companies have already organised their compliance. On the other hand, a lot of businesses have not taken the necessary steps yet, as they have been waiting to see whether the Act would even come into effect. Full enforcement will begin on July 1, 2021, giving those companies a countdown to become compliant.

The initial draft from 2013 was mainly based on the EU Data Protection Directive 95/46/EC, with some changes towards stricter provisions. The partial enforcement in 2014 allowed for the establishment of an Information Regulator in 2016, which has released guidance in light of the future enforcement of the Act.

The right to privacy has been a fundamental right in South Africa since 1996, and the Act aims to promote the protection of personal data for any business processing personal information in South Africa. However, unlike many other data protection regulations around the world, the South African Protection of Personal Information Act also protects juristic persons, such as companies, banks and trusts.

One of the bigger changes compared with South Africa’s previous handling of personal data protection is the obligation to notify the authorities and, in some cases, the data subjects of a data breach. The Act also includes further requirements for international data transfers, as well as finally detailing data subjects’ rights.

Regional Court of Vienna rules in Schrems against Facebook case

6. July 2020

On June 30th, 2020, the Vienna Regional Court passed judgement in the case of Max Schrems against Facebook Ireland Limited, case number 3 Cg 52/14k-91 (in German). In the following, we present the case and the court’s judgement.

Facts of the case

In the years 2011, 2012, 2013, 2015 and 2019, the plaintiff submitted requests for information in accordance with Art. 15 GDPR. The defendant initially responded to these requests with an 18-page PDF file dated 9 June 2011 and a CD containing further PDF files totalling 1,222 A4 pages. Despite the information provided, the plaintiff felt that his rights under the GDPR had been violated, as none of the subsequent requests had been answered. In his view, the information provided was insufficient in terms of content, and the number of responses was inadequate in relation to the number of requests made.

Furthermore, the plaintiff was concerned about data processing by third parties, about which he received no clear information. He also argued that he was a “Controller” within the meaning of the GDPR, and that the defendant, as Data Processor, had not fulfilled the resulting requirement of concluding a Data Processing Agreement with the plaintiff. Finally, the defendant had allegedly violated Art. 9 GDPR by failing to obtain consent in respect of his interests and other sensitive data, for which the plaintiff demanded an injunction against future data processing.

Guiding principles of the judgement

The Regional Court ruled on the following guiding principles in the case:

  • the defendant must provide the plaintiff with complete information in writing and free of charge within fourteen days about all personal data of the plaintiff processed by it, stating the exact origin and, if applicable, the exact recipients of the data,
  • and pay the applicant the sum of EUR 500 in damages within fourteen days.

Reason for decision

The guiding principles above were the only points of the plaintiff’s claim on which the court ruled in his favour. The court stated that the tools used and the information provided by the defendant to inform the plaintiff about the processed personal data are not sufficient to meet the requirements of the right of access under Art. 15 GDPR. This results in a lack of control by the plaintiff over his own personal data, which goes against his fundamental right to data privacy. The court therefore considered damages in the sum of EUR 500 to be adequate compensation for the infringement of Mr. Schrems’ privacy.

Regarding Mr. Schrems’ other points, the court ruled that because the plaintiff uses the Facebook platform in the course of private and family activities, he cannot be a Controller of the processed personal data, since pursuant to Art. 2 II lit. c GDPR the Regulation does not apply to such activities. This also applies to social media and online networks, as mentioned in Recital 18. Therefore, Facebook is not a Data Processor with regard to those private activities and purposes, which negates the requirement of a Data Processing Agreement under Art. 28 GDPR.

Further, the court saw no sensitive data within the meaning of Art. 9 GDPR to be at risk. With regard to the personalisation of the platform, such as personalised ads and suggestions, the court stated that this belongs to the core of the defendant’s business activities. As such, no consent is needed, since the defendant maintains that the data is processed for the performance of a contract. According to the court, the plaintiff entered into such a contract of his own accord and with knowledge of the terms of service in order to use the platform’s services. An injunction against the future processing of such personal data was therefore not granted.

Assessment

Overall, the Regional Court’s judgement has only minimal practical relevance, as it is difficult to fully assess its consequences. It is neither clear how the ruling will affect the company’s future conduct, nor certain whether the judgement will become final in the first place. The plaintiff has already announced on NOYB’s homepage that he will lodge an appeal, so it remains to be seen what practical relevance can be drawn from the case in the future.

EDPB releases new official register of Art. 60 GDPR decisions

29. June 2020

On 25 June 2020, the European Data Protection Board (“EDPB”) released a new register of final decisions by national European Data Protection Authorities (Supervisory Authorities) cooperating with one another pursuant to Art. 60 GDPR. The register provides access to the decisions themselves, summaries of the decisions in English, and information on the identity of the cooperating Lead Supervisory Authority and Concerned Supervisory Authorities.

The GDPR requires Supervisory Authorities to cooperate in potential cases of GDPR violations that involve cross-border data processing activities. During this cooperation, the Lead Supervisory Authority is in charge of preparing the draft decision and involving the Concerned Supervisory Authorities, and acts as the sole interlocutor of the Controller or Processor (“One-Stop-Shop” principle), Art. 56 and Art. 60 GDPR.

To date, the new EDPB register contains 110 final decisions. The EDPB states in its announcement that ‘the register will be valuable to data protection practitioners who will gain access to information showcasing how SAs work together to enforce the GDPR in practice.’

Thailand postpones Enforcement of new Personal Data Protection Act

22. June 2020

In response to the European General Data Protection Regulation (“GDPR”) becoming applicable in 2018, Thailand adopted its first-ever Personal Data Protection Act (“PDPA”) into law on 28 May 2019. As it is fashioned after the GDPR, the PDPA is built around principles that largely align with the GDPR, especially in the areas of data protection principles, legal bases, and data subject rights. Originally, the PDPA was to become applicable one year after its adoption, on 27 May 2020.

Now, the Thai Government has approved a draft decree by the Ministry of Digital Economy and Society (“MDES”) to postpone the enforcement of most sections of the PDPA to 31 May 2021. The MDES explained that the reasons for the delay are the current corona pandemic and its strain on businesses, as well as the fact that many businesses are not prepared for PDPA compliance. Notably, Brazil also postponed the enforcement of its new Data Protection Law (“LGPD”) for similar reasons (we reported).

The only sections of the PDPA that will be enforced as originally planned concern the appointment of the Personal Data Protection Committee members and the establishment of the Office of the Personal Data Protection Committee. While the delay allows companies more time to become PDPA compliant, the lack of enforcement of data subject rights in the meantime is a big concern of critics, especially in light of the recent adoption of Thailand’s controversial new cybersecurity law.

Hungary Update: EDPB publishes Statement on Art. 23 GDPR

17. June 2020

Since March 2020, Hungary has been in a “state of emergency” in response to the COVID-19 pandemic. The country’s COVID-19 related emergency laws and state of emergency received worldwide criticism from constitutional experts, politicians and civil rights groups, because they allow the Prime Minister to rule by decree during the state of emergency and do not provide a predefined end date. During the state of emergency, Prime Minister Viktor Orbán made extensive use of his newly gained powers by passing more than a hundred decrees, including Decree No. 179/2020, which suspended the data subject rights in Art. 15-22 GDPR with respect to personal data processing for the purpose of preventing, understanding and detecting the coronavirus disease and impeding its further spread (we reported).

In response to this suspension of GDPR rights, the European Data Protection Board (“EDPB”) has recently published a Statement on restrictions on data subject rights pursuant to Art. 23 GDPR, which is the provision that Hungary’s measure was based on. This article allows the member states to restrict, by way of a legislative measure, the scope of the obligations and rights provided for in Articles 12 to 22 and Article 34, when such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard, inter alia, important objectives of general public interest of the Union or of a Member State such as public health.

In its Statement, the EDPB points out that any restriction must respect the essence of the right that is being restricted. If the essence of the right is compromised, the restriction must be considered unlawful. Since the data subject’s right of access and the right to rectification are fundamental rights according to Art. 8 para. 2 of the Charter of Fundamental Rights of the European Union, any restriction of those rights must be carefully weighed up by the member states in order to respect the essence of the rights. The EDPB considers that restrictions adopted in the context of a state of emergency, suspending or postponing the application of data subject rights without any clear limitation in time, equate to a de facto blanket suspension and denial of those rights and are not compatible with the essence of the fundamental rights and freedoms.

The EDPB also recalls that restrictions under Art. 23 GDPR must be necessary and proportionate. It argues that restrictions imposed for a duration not precisely limited in time, applying retroactively or subject to undefined conditions are not foreseeable for data subjects and are thus disproportionate.

Furthermore, the EDPB takes the view that in order to safeguard important objectives of general public interest such as public health (Art. 23 para. 1 lit. e GDPR), there must be a clearly established and demonstrated link between the foreseen restrictions and the objective pursued. The mere existence of a pandemic or any other emergency situation alone does not justify a restriction of data subject rights, especially if it is not clearly established how the restrictions can help to deal with the emergency.

Following the international public backlash, the Parliament of Hungary passed legislation on 16 June 2020 to revoke the emergency laws as soon as the current state of emergency is terminated by the Government. Hungary’s Government announced in May that it intends to lift the state of emergency on 20 June 2020. After that, the restrictions on the GDPR rights shall be lifted as well, so that data subjects may exercise their rights under Art. 15-22 GDPR again.

Germany’s Constitutional Court curbs Federal Intelligence Service’s competence

16. June 2020

In a ruling of 19 May 2020 regarding the German Federal Intelligence Service (BND) and its manner of operation, the German Constitutional Court held that the BND is bound by fundamental rights when conducting surveillance of foreigners, even outside of Germany’s federal territory.

Background

The case, brought to the court as a constitutional complaint by a collective of foreign journalists, has its origins in the disclosures made by Edward Snowden in 2013, when some of the BND’s practices in relation to strategic foreign surveillance came to light. In 2016, German legislators passed a new law with the purpose of regulating surveillance conducted by the BND. However, that new law mainly restricted the surveillance of German citizens and of foreigners living in Germany. It was criticised for doing nothing to restrict or regulate the BND’s actions abroad, which did not have to abide by any legal provisions. The constitutional complaint brought to the German Constitutional Court concerns the BND’s strategic surveillance of foreign reporters and journalists, and of the highly confidential data they need to perform their work, which risks being exchanged with the intelligence agencies of their own countries and, in the process, exposing them to state measures taken against them.

The key points

Territorial Scope. One of the biggest points of the ruling is the definition of the territorial scope of the fundamental rights at stake in this case. Since the complainants are journalists outside German territory, the Constitutional Court had to determine whether the constitutional rights that would shield them from surveillance by the BND apply in this matter. The court ruled that the fundamental rights are not limited to German territory, but rather apply wherever German state authority is acting. This is derived from Art. 1 III of the German Constitution (GG), which binds German state authority to conformity with the Constitution. Accordingly, as the fundamental rights under Art. 10 I and Art. 5 I GG are not applicable only to Germans, the Constitutional Court extended their scope of application to foreigners in foreign countries and gave them international significance.

Current legislation is unconstitutional. The Constitutional Court further analysed the new intelligence law from 2016 and ruled it unconstitutional in its current state. The main reason is that, because the legislators assumed that the fundamental rights did not apply, they did not conform with the requirements the Constitution sets out for such a law. As a result, the new law violates the privacy of telecommunications and its requirements under Art. 10 I GG, and in addition does not meet key requirements deriving from other fundamental rights, such as Art. 19 I GG. However, the Constitutional Court stated that the law can be amended to respect fundamental rights and comply with the Constitution. The court set out several points that must be implemented in the amended law, some of which we present below.

Independent oversight. The Constitutional Court stated that in order to ensure conformity with the Constitution and to regulate the BND in a way that protects the fundamental rights of the people under surveillance, it is necessary to establish a new, independent oversight regime to assess and regulate strategic surveillance. Its main purposes would be the legal oversight of the BND and the protection of surveillance subjects, as well as control of the surveillance process, from the analysis of data to the transfer of information between agencies.

Legislative suggestions. In its ruling, the Constitutional Court also made a few suggestions regarding potential statutory regulation in order to regulate the BND and its area of action better than in the past. Among those suggestions were the necessity of defining the purpose of surveillance measures with precision and clarity in order to ensure transparency, as well as the necessity for the legislator to set out an essential framework for the analysis of the collected data, such as ceasing analysis as soon as it becomes clear that the surveillance has touched the core area of private life. The court also suggested that special requirements must apply to the protection of professional groups whose communications require increased confidentiality, and that surveillance in these cases must be tied to qualified thresholds. The court further addressed the storage and deletion of surveillance data, stating that the traffic data obtained should not be stored for longer than six months and that a systematic deletion policy needs to be established. With regard to the transfer of information to other (foreign) intelligence agencies, the Constitutional Court made it clear that such transfers will need an official statutory basis in order to be lawful.

The court has given the German government until the end of 2021 to amend the law and make the statutory changes needed to comply with the ruling and the decision on the international scope of the fundamental rights. While this may seem like a big setback for the BND, it is a chance to show that intelligence agencies can work to a high constitutional standard while also being successful in their purpose.

Series on COVID-19 Contact Tracing Apps Part 3: Data Protection Issues

28. May 2020

In today’s blogpost, we will finish the miniseries on COVID-19 contact tracing apps with a final part on the data protection and privacy issues they create for their users. As we presented in the first part of this series, different approaches to contact tracing apps are in use or being developed in different countries. These operating approaches raise different data protection issues, some of which can, in the European Union, be mitigated by following data protection regulations and the guidelines the European Data Protection Board has published, which we presented in the second part of this series.

The data protection issues that arise with COVID-19 contact tracing apps, and their impact, depend heavily on the design of the apps and the APIs they use. However, there are common points that may cause privacy problems in all contact tracing apps due to the sensitivity of the data processed.

The biggest risks of contact tracing apps

While contact tracing apps have the potential to pose risks to data protection and their users’ privacy in every respect, the following are the risks politicians, scientists and users are most worried about:

  • The risk of loss of trust
  • The risk of unauthorized access
  • The risk of processing too much data
  • The risk of abuse of the personal data collected

The risk of loss of trust: In order to work properly and reach the effectiveness necessary to contain the spread of the virus and break the chain of transmission, scientists and researchers have pinpointed that at least 60% of a country’s population has to use the contact tracing apps properly. For this to happen, user satisfaction and trust in the app and its use of their personal data have to remain high. A lot of the research done on the issue shares the concern that a lack of transparency in the development of the apps, as well as in the data they collect and process, might cause the population to be sceptical and distrustful of the technologies being developed. The European Data Protection Board (EDPB) as well as the European Parliament have stated that in order for contact tracing apps to be data protection compliant, their development as well as their processing of data need to be transparent throughout the entire use of the apps.

The risk of unauthorized access: While the risk that the apps and the data they process can be hacked is relatively low, there is the concern that in some cases unauthorized access may result in a big privacy issue. Especially in contact tracing apps that use GPS location data, as well as apps that use a centralized approach to the storage of the data processed, the risk of unauthorized access is higher because the information is readily available. In the case of GPS data, it is easily possible to track users’ movements, allowing for a very detailed analysis of their behaviour. Centralized storage keeps all the collected data in one cloud space, which in the case of a hacking incident may give easy access not only to information about social behaviour and health details but also, if combined with GPS tracking data, to an easily identifiable analysis of user behaviour. It has therefore been recommended to conduct a Data Protection Impact Assessment before launching the apps and to ensure that the encryption standards are high. The Bluetooth method, in which phones exchange anonymized IDs that change every 15 minutes whenever they come closer than 10 feet, has been recommended as the ideal technology to minimize the collection of location data. Furthermore, most scientists and researchers recommend a decentralized storage method as better suited to protect users’ data, as it stores the information only on the users’ devices instead of in a central cloud.
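
As a rough illustration of this decentralized, rotating-identifier approach, the following TypeScript sketch shows how a device could rotate a random ephemeral ID every 15 minutes and keep the log of observed IDs only locally. It is a minimal sketch under our own assumptions (identifier length, storage format), not the actual Apple/Google exposure notification API or any specific national app.

  // Minimal sketch of the decentralized Bluetooth approach described above.
  // Assumptions: 16-byte random ephemeral IDs, rotated every 15 minutes,
  // observed IDs kept only on the device. Not the actual Apple/Google API.

  import { randomBytes } from "crypto";

  const ROTATION_MS = 15 * 60 * 1000; // new ephemeral ID every 15 minutes

  interface ContactRecord {
    observedId: string; // the other device's current ephemeral ID
    seenAt: number;     // coarse timestamp of the encounter
  }

  let currentEphemeralId = randomBytes(16).toString("hex");
  const localContactLog: ContactRecord[] = []; // stays on the device

  // Rotate the broadcast identifier so observers cannot link encounters.
  setInterval(() => {
    currentEphemeralId = randomBytes(16).toString("hex");
  }, ROTATION_MS);

  // Called whenever the Bluetooth layer reports a nearby device's ID.
  function onNearbyIdObserved(observedId: string): void {
    localContactLog.push({ observedId, seenAt: Date.now() });
  }

  // Only if the user reports an infection are their *own* recent ephemeral
  // IDs uploaded; the contact log itself never leaves the device.
  function idsToPublishOnPositiveTest(): string[] {
    return [currentEphemeralId]; // simplified: a real app would keep a history
  }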

The risk of processing too much data: One of the big risks with contact tracing apps is the processing of too much data. This issue can arise with apps that use GPS location tracking or that collect sensitive health data beyond the COVID-19 infection status, transactional information, contacts, and so on. In general, contact tracing apps should not require much additional information beyond the user’s contact information, since it is only necessary to log the other devices their device has come into contact with. However, some countries use contact tracing apps based on GPS location tracking instead of a Bluetooth exchange of IDs, in which case the location data and movements of the user are automatically recorded. Other countries, for example India, have launched an app in which additional health data is processed, as well as other information unnecessary for following up on contacts. Contact tracing apps should follow the principle of data minimisation in order to ensure that only personal data necessary for the purpose of the contact tracing apps are processed; a simplified sketch of what such a minimal record could be limited to follows below. This is also one of the important ground rules the EDPB has set out in its guidelines on the subject. However, different countries have different data protection laws, which makes a unified approach and handling of personal data difficult in cases like these.
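
As referenced above, the following sketch illustrates what a data-minimised contact record could be limited to; the field names are illustrative assumptions and not taken from any specific app.

  // Sketch of data minimisation for a contact tracing record (illustrative
  // field names only). Everything beyond the rotating ID and a coarse
  // timestamp is unnecessary for the tracing purpose and should be omitted.

  interface MinimalContactRecord {
    observedEphemeralId: string; // rotating pseudonymous ID of the other device
    dayOfContact: string;        // coarse date, e.g. "2020-05-28"
  }

  const example: MinimalContactRecord = {
    observedEphemeralId: "exampleRotatingId",
    dayOfContact: "2020-05-28",
  };

  // Fields like these would exceed the purpose and violate minimisation:
  // gpsLatitude: number; gpsLongitude: number;
  // healthConditions: string[]; phoneContacts: string[];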

The risk of abuse of the personal data collected: One of the biggest fears of scientists and users regarding contact tracing apps is the potential abuse of the personal data collected once the pandemic is over. Especially with centralized storage, there are already apps that give the government access to the data, for example in India, Hong Kong and Singapore. A majority of critics are demanding regulation to ensure that the data cannot be used after the pandemic is over and the need for the apps has ceased. This risk is especially high for tracing apps that locate the user through GPS location tracking rather than Bluetooth technology, since the movements of the devices allow a very detailed and easy-to-analyse tracking of the users. This potential risk is one of the most prominent ones regarding the Apple and Google project for a joint contact tracing API, as both companies have been known to face severe data protection issues in the past. However, both companies have stated that they plan to completely discontinue the developed API once the pandemic is over, which would disable the apps working with that API. Since the Bluetooth approach they are using stores the data on users’ devices, the data will be locked and inaccessible once the API can no longer read it. But there are still a lot of other countries with their own APIs and apps, which may lead to a risk of government surveillance and even abuse by foreign powers. For Europe, the EDPB and the European Parliament have clearly stated that the data must be deleted and the apps dismantled once they are no longer necessary, as the purpose and legal basis for processing will no longer apply once the pandemic is under control.

The bottom line

Needless to say, the pandemic has driven the need for new technologies and approaches to handle the spread of viruses. However, in a modern world this brings risks to the personal data used to contain the pandemic and break the chain of transmission, especially because it is not only a national but also an international effort. It is important for users to keep in mind that their right to privacy is not entirely overridden by the public interest in containing the virus. In order to keep this balance, it is important for contact tracing apps to face criticism and be developed in a way that is compliant with data protection regulations, so as to minimize the potential risks that come with the new technology. This is the only way to ensure that people’s personal freedom and private life can continue without taking a heavy toll from the potential attacks that could result from these risks. Transparency is the bottom line in these projects: it can ensure that regulations are met and that the people’s trust is kept, so that the tracing apps can reach the effectiveness needed to be successful in their purpose.
