Privacy harms and the importance of preserving the fundamental rights of individuals
One of the main preoccupations of the GDPR is to safeguard the interests and the fundamental rights and freedoms of data subjects. Understanding and being able to recognize what stands behind these concepts is paramount for any data privacy professional advising a company on the path to compliance with legal requirements. The GDPR cannot be understood and applied without going back to its roots: its foundation was laid by a host of precursor international human rights instruments. Those who have had the chance to take the IAPP CIPP/E training course will already know that it opens with a study of the history and evolution of the right to privacy.
Brief recent history of privacy in the E.U.
From 1995 until May 2018 the main legal instrument on data protection was Directive 95/46/EC (the Data Protection Directive). The Directive reflected data protection principles already contained in national laws and in Convention 108, while often expanding them. By the time Directive 95/46/EC was adopted, several Member States already had national data protection laws. For example, Sweden adopted the Datalagen in 1973, Germany adopted the federal Bundesdatenschutzgesetz in 1977, and France adopted the Loi relative à l’informatique, aux fichiers et aux libertés in 1978. In the U.K. the Data Protection Act was adopted in 1984. Finally, the Netherlands adopted the Wet persoonsregistraties in 1989.
The Data Protection Directive of 1995 came about from the need to harmonize these laws. Nevertheless, its definitions and rules were interpreted and transposed differently into national laws, creating a complicated business environment. At the same time, given their initial focus on economic integration and the establishment of a common market, the original treaties of the European Communities contained no reference to human rights and their protection. The free flow of data, capital, services and resources in the internal market could not be hampered, yet at the same time Community policies could not be allowed to impinge on citizens’ human rights; a reform was therefore required.
In 2000 the European Union proclaimed the Charter of Fundamental Rights of the European Union. The Charter recognizes the full range of civil, political, economic and social rights of European citizens, divided into six titles: dignity, freedoms, equality, solidarity, citizens’ rights and justice. “Respect for private and family life” and “Protection of personal data” are regulated in Articles 7 and 8 respectively, under Title II, “Freedoms”. The Charter entered into force in 2009 together with the Treaty of Lisbon and constitutes the foundation for the adoption of the GDPR.
The right to data privacy as an umbrella for other fundamental rights
The right to data protection and data privacy is heavily interlinked with other liberties and fundamental rights, which in fact depend on the protection afforded to the right to privacy. These include freedom of thought, freedom of expression, freedom of assembly and association, the right to education, the freedom to work, non-discrimination, the right of workers to organize in trade unions, protection from unjustified dismissal, the right to fair and just working conditions, access to healthcare, and access to social security and social assistance. GDPR requirements such as transparency, the balancing test, the obligation to undertake a data protection impact assessment in specific contexts and circumstances, and the privacy by design and by default principles all stem from the need to protect data subjects against violations of, and harms to, the fundamental rights recognized by the Charter.
Privacy taxonomies and their importance
While professors Michael Friedewald, Rachel L. Finn and David Wright suggest seven types of privacy that can be harmed, Professor Daniel Solove’s taxonomy identifies four main categories of potentially harmful activities, each subdivided into sub-categories. Briefly, Friedewald, Finn and Wright suggest that violations can affect the privacy of the person, privacy of behavior and action, privacy of personal communications, privacy of data and image, privacy of thoughts and feelings, privacy of location and space, and privacy of association. Solove, on the other hand, identifies the following potentially harmful activities: information collection (surveillance and interrogation), information processing (aggregation, secondary use, exclusion, insecurity, identification), information dissemination (disclosure, exposure, breach of confidentiality, increased accessibility, appropriation, distortion) and invasion (intrusion, decisional interference).
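For privacy teams that track incidents or DPIA findings in tooling, Solove’s taxonomy lends itself naturally to a simple lookup structure. The sketch below is purely illustrative: the category and sub-category names come from the taxonomy described above, while the function name and overall structure are hypothetical, not part of any standard library or framework.

```python
# Illustrative sketch: Solove's taxonomy encoded as a lookup table,
# e.g. for tagging reported harms in an internal DPIA tracker.
SOLOVE_TAXONOMY = {
    "information collection": ["surveillance", "interrogation"],
    "information processing": ["aggregation", "secondary use",
                               "exclusion", "insecurity", "identification"],
    "information dissemination": ["disclosure", "exposure",
                                  "breach of confidentiality",
                                  "increased accessibility",
                                  "appropriation", "distortion"],
    "invasion": ["intrusion", "decisional interference"],
}

def categorize(harm):
    """Return the top-level Solove category for a sub-category, or None."""
    for category, harms in SOLOVE_TAXONOMY.items():
        if harm in harms:
            return category
    return None

print(categorize("aggregation"))  # → information processing
```

Structuring the taxonomy as data rather than prose makes it easy, for instance, to count how many logged incidents fall under each of the four top-level categories.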
According to the Handbook on European Data Protection Law, the right to data protection often interacts with other rights, and the interaction is often ambivalent. While there are situations where the right to personal data protection is in tension with a specific right (e.g., freedom of expression), there are also situations where it effectively ensures respect for that same right.
Nowadays the global economy is built on data, and in particular big data. Personal data are an inherent part of this vast picture, and business models are built around data processing. Shoshana Zuboff describes in her well-known book “The Age of Surveillance Capitalism” how the structure created under the auspices of private capital is based on unprecedented asymmetries of knowledge, which give rise to unprecedented asymmetries of power. “A power that is able to shape our behavior but that operates outside of our awareness and is designed to operate, always keeping us ignorant. Engineered to keep us ignorant. We’re entering the 21st century in this institutional setting that introduces a whole new axis of social inequality. Not just economic inequality now, but these profound inequalities of knowledge and the power that accrues to behavioral knowledge that can actually influence our behavior, influence the behavior of our group, of our city, of our region, of our country, of our society. That’s at the institutional level and that is disfiguring 21st-century society, right from the beginning.”
While COVID-19 is considered unlikely to cause a significant demographic catastrophe, it may have a major impact on privacy protections in the name of pandemic control. The largest technology companies are leading the charge to harness data and the newest industry tools to fight the spread of the disease, at the cost of reducing personal privacy in the process. Many processing activities in the context of COVID-19 data collection target individuals, pulling them out of anonymity, isolating their information in databases and bringing them to public attention. COVID-19 has had repercussions on everyone’s daily life. Employees are being asked to provide their employers with information about their personal travel and medical condition, including testing habits and results. Consumers are being denied access to services, buildings or places unless they share a host of data about themselves. Travelers are being denied the right to travel to or access certain destinations unless they present medical certificates showing a negative test or proof of vaccination.
Violations of data privacy can open the door to harming individuals in many different contexts. Aggregation of data can destroy a person’s ability to land a job when a potential employer combines as much data as possible, looking for negative and out-of-context information on candidates. Facebook aggregates different pieces of personal information about its users to better target ads. Detention centers in different parts of the world combine information about an individual’s family life and social and religious associations in order to decide whether the detainee is allowed to leave the camp. Besides the privacy harms produced by data aggregation, there are various other sources of concern, such as exclusion, secondary use, breach of confidentiality, exposure and distortion.
In terms of data protection, the main concerns are the volume and variety of personal data processed, as well as the processing itself and its results. The introduction of complex algorithms and AI software that transform big data into a resource for decision-making affects individuals and groups in cases of profiling or labelling; it ultimately raises many data protection issues and constitutes a potential violation of fundamental rights and freedoms going well beyond the right to privacy.