| Literature DB >> 35127339 |
Oskar J Gstrein, Anne Beaulieu.
Abstract
The United Nations confirmed that privacy remains a human right in the digital age, but our daily digital experiences and seemingly ever-increasing amounts of data suggest that privacy is a mundane, distributed and technologically mediated concept. This article explores privacy by mapping out different legal and conceptual approaches to privacy protection in the context of datafication. It provides an essential starting point to explore the entwinement of technological, ethical and regulatory dynamics. It clarifies why each of the presented approaches emphasises particular aspects and analyses the tensions that arise. The resulting overview provides insight into the main strengths and limitations of the different approaches arising from specific traditions. This analytic overview therefore serves as a key resource to analyse the usefulness of the approaches in the context of the increasing datafication of both private and public spheres. Specifically, we contrast the approach focusing on data subjects whose data are being 'protected' with others, including Fair Information Practice Principles, the German right to 'informational self-determination', and the South American 'habeas data' doctrine. We also present and contrast emerging approaches to privacy (differential privacy, contextual integrity, group privacy) and discuss their intersection with datafication. In conclusion, we put forth that rather than aiming for one single solution that works worldwide and across all situations, it is essential to identify synergies and stumbling blocks between the various regulatory settings and newly emerging approaches.
Keywords: Data Protection; Datafication; Human Dignity; Privacy; Technology
Year: 2022 PMID: 35127339 PMCID: PMC8800549 DOI: 10.1007/s13347-022-00497-4
Source DB: PubMed Journal: Philos Technol ISSN: 2210-5433
Fig. 1 The layers of datafication. This graphic is adapted from the ecosystem of Big Data (Letouze, 2015) through the addition of the care sphere and a conceptualisation of the meaning of each layer. Reproduced from Beaulieu and Leonelli (2021)
Summary of concepts and instances relating to privacy and datafication. This table supplements the detailed discussion in this article and provides a concise overview that highlights the contrast and overlap between different approaches
| Legal approaches (Sect. 4.1) | Conceptualisation of Privacy | Instance of datafication |
|---|---|---|
| Privacy (US), 4th Amendment focused | Right to be left alone: protection from warrantless search; establishes a realm of the personal by excluding it from public scrutiny (effectively scrutiny by governmental powers). The realm of the private is defined by a ‘reasonable expectation of privacy’ afforded only to US residents/‘persons’; all other humans are without protection (for example, immigrants at the border). The 4th Amendment only regulates interactions between government and citizens. First established through Semayne’s case (1604) 5 Coke Rep. 91; protection of the home against abuse of power by government through jurisprudence; ‘knock-and-announce’ rule; ‘my home is my castle’ | GPS tracker placed on a car by law enforcement agencies violates the reasonable expectation of privacy (unauthorised GPS monitoring, U.S. v. Jones). Call Data Records (Verizon) used to prove involvement in armed robberies (Carpenter v. United States). 2015 San Bernardino terrorist attacks: in the aftermath, US authorities asked Apple to provide information to crack a suspect’s phone; Apple refused, but the phone was still cracked with the support of an Israeli firm, thereby showing the need for protection of users from government intervention |
| Fair Information Practice Principles, FIPPs (US) | A set of 8 principles developed in the USA in the 1960s to regulate relations between customers and companies. Two principles are especially important: notice and choice. Regulation traditionally applies a ‘sectoral approach’. Legal/regulatory frameworks can be very detailed and practical, but are sometimes non-existent or difficult to keep up to date due to the speed of technological innovation. Generally, data flow freely unless restricted in a specific instance for a specific sector. Regulation of business activities only applies if there is a commercial agreement between business and customer. In the USA, regulation is divided among federal and state contract law, general and sector-specific consumer protection laws (e.g. the Health Insurance Portability and Accountability Act, the Children’s Online Privacy Protection Act), as well as tort law | Notice has taken the form of very extensive texts written in complex legal terms presented as a pop-up window, while choice has been reduced to ‘clicking’ a box labelled ‘accept’. To what extent ‘notice’ in this form is acceptable and whether it is actually possible to choose are both debated, since the cost of exclusion from digital services can be so high as to make it nearly impossible for an individual to choose to refuse, if it means exclusion from employment, education or health care |
| Data protection (EU, Council of Europe) | Comprehensive regulatory frameworks (‘omnibus approach’) based on principles such as data minimisation and purpose limitation. No use (flow) of data without a specific legal basis. Individual consent is frequently used, but such a basis might also originate from the public interest (e.g. research, keeping statistical records) or the prevailing interest of other individuals. Fundamental right (next to the traditional privacy right) in the EU; its existence as an additional right is potentially relevant in areas where data are combined to generate inferences about individual behaviour/characteristics and to make a case about the impact of data on groups. Three key parties: data subject (= identifiable natural person, depending on interpretation of an abstract/concrete criterion); controller (main responsibility); data processor (technical/organisational support). Includes many individual rights, such as the right to know about collection, the right to request one’s data and to be forgotten, the right to data portability, the right to review of automated decisions, etc. First laws date from the early 1970s and have since been developed continuously on national, international (e.g. Council of Europe Convention 108) and supranational levels (e.g. EU GDPR) | The ‘right to be forgotten’ can be implemented as the right of a person to have a link relating to their name which is inaccurate, misleading, or distressing removed from search engine results. This would potentially also provide recourse in the face of malicious activities like ‘revenge porn’. When a European citizen is the subject of a decision made by an algorithmically driven system, they are entitled to a review of such a decision. For example, Uber’s use of an automated fraud detection system led to the firing of drivers. An Amsterdam court ruled that this practice contravenes Article 22 of the EU General Data Protection Regulation (GDPR), which seeks to protect individuals from automated decision-making. Uber was ordered to reinstate the drivers, with compensation |
| Habeas data (South America, Argentina and Brazil) | Literally means ‘the data belongs to the body’; a reaction to the political situation under dictatorships in South America, but not always clearly defined conceptually. Narrowly understood, it entitles a citizen to obtain all available information about oneself (or close family members only), specifically in the context of a court/administrative procedure. Broadly understood, it enables anyone to access information from public archives, and potentially also from private sources in cases where this information might be of public interest. Established as a legal right in the constitutions of several Latin American countries, including Brazil and Argentina | Getting access to data on ‘disappeared persons’ (desaparecidos) in the aftermath of the Argentinian military dictatorship (1976–1983). The relevant data are typically held in public archives, but there is no clear possibility or institutional mechanism that provides access for family members, researchers, or the public |
| Informational self-determination (Germany) | Based on the ‘right to personal development’, derived from the German constitution through a combination of the protection of ‘human dignity’ and the general right to personality (‘allgemeines Persönlichkeitsrecht’). Established by the German Federal Constitutional Court in the ‘census judgment’ of 1983 (BVerfG, ‘Volkszählungsurteil’, 15.12.1983 – 1 BvR 209/83, 269/83, 362/83, 420/83, 440/83, 484/83, BVerfGE 65, 1). Right to decide which information about oneself is to be communicated; protection from unlimited collection, use and storage of data. Not being afraid of what the state knows or might know; room for self-development, strongly connected to a view of life as divided into public and private spheres. The person is an actor who decides and can stop data collection; the state-citizen relationship is the focus | Privacy notices on a website are meant to enable voluntary, specific, informed, and unambiguous consent of individuals to particular uses of their data, thereby enabling informational self-determination. But what does it mean to accept such complex and lengthy notices? Are such practices of expressing consent functional and sufficiently transparent? This raises the issue of whether someone is making an informed decision and whether one has a choice in the matter. A regional academic hospital carries out a large-scale longitudinal study through which it collects genetic material and keeps health records of families. The original intention is to understand genetic and hereditary diseases better in order to improve public health. Once the data are collected and included in studies (publications and research), can the individuals from whom the data were collected still determine what happens to them? How can it be guaranteed that they will not have to be concerned about how public and private institutions use insights derived from their data? |
| Emerging conceptual approaches (Sect. 4.2) | | |
| Differential privacy (in different jurisdictions, in relation to the privacy-preserving creation of statistical insights) | A statistical/mathematical method to make it more difficult to look up or infer personally identifiable information from large datasets. Artificial ‘noise’ is added to data collected on individuals. This makes it difficult to analyse individual data points, while significant overall trends (‘population level insights’) across the entire sample remain visible. Three important parameters for implementation: the accuracy of the dataset; the artificial adjustments making the set less accurate on a granular level (noise); and the number of queries available to probe the dataset (‘privacy budget’). A minimal code sketch of this noise-addition idea follows the table. Proponents claim it adds privacy protection ‘by default’ while enabling the responsible use of large datasets. However, the method is difficult to implement throughout an entire process when many actors are involved, and it requires a high level of data literacy to interpret datasets correctly. Additionally, it is not useful in cases where accurate data on individuals are required for analysis and processing | A globally active technology corporation wants to understand how a new app is being used in a certain market and whether available features are being picked up by users or not; the insights on an individual level are not as important as the overall picture. Policymakers want to understand whether a curfew leads to less mobility across the entire population of a country, using search queries or mobile phone data, to determine whether citizens/residents are actually staying at home (e.g. COVID-19 movement restrictions) |
| Contextual integrity (USA in origin, not applied as a formally binding concept) | Starting point is expectations of privacy, distributed across actants (persons and technologies). Actants will have different privacy expectations based on context and flow; data flows and systems are designed accordingly, with appropriate levels of safeguards depending on context. Parameters of informational norms are actors (subject, sender, recipient), attributes (types of information), and transmission principles (constraints under which information flows); positioned against the ‘transparency and choice’ approach because that approach is not context-sensitive | Apparently harmless data (how often you log on and at what time of day) get associated with your profile and are used to target you (you must be lonely and insecure, here is an advertisement for a new leather jacket). In this example, there is a shift in context from social platform to marketing opportunity |
| Group privacy (in different jurisdictions, in reaction to the individualisation of privacy) | Looking at how the use of data impacts groups and their collective autonomy for future development. ‘Big Data’-driven applications might not directly impact an individual, or might do so based on individual consent, but the insights created by them reshape society. Difficult to put into practice, since it is largely unclear who should take agency on behalf of whom | Predictive policing: predicting the likelihood of crime based on statistical methods and different sources of open/closed data, often based on historical data for a given area. Using algorithm-based systems to create profiles for selecting applicants for job interviews |
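The noise-addition idea described in the differential privacy row can be illustrated with a short sketch. The following is a minimal, hypothetical example of the standard Laplace mechanism for a counting query; it is not taken from the article, and the epsilon values, counts, and function names are assumptions for illustration only.

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, rng: np.random.Generator) -> float:
    """Return a noisy answer to a counting query (sensitivity 1) under epsilon-DP."""
    # A count changes by at most 1 when one individual's record is added or
    # removed, so Laplace noise with scale 1/epsilon suffices.
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical figures: a population-level count of app users and a privacy budget.
rng = np.random.default_rng(seed=42)
true_app_users = 10_482      # assumed raw count, never released directly
total_budget = 1.0           # overall epsilon available for this dataset
per_query_epsilon = 0.25     # budget spent on each released answer

remaining = total_budget
while remaining >= per_query_epsilon:
    print(round(laplace_count(true_app_users, per_query_epsilon, rng)))
    remaining -= per_query_epsilon   # once the budget is exhausted, no further answers are released
```

In this sketch the overall trend (roughly 10,000 users) remains visible across the noisy answers while any single individual's contribution is masked, and the fixed budget caps how many answers can be released, mirroring the 'privacy budget' parameter described above.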