Pekka Ruotsalainen, Bernd Blobel.
Abstract
Digital health information systems (DHIS) are increasingly members of ecosystems, collecting, using and sharing a huge amount of personal health information (PHI), frequently without control or authorization by the data subject. From the data subject's perspective, there is frequently no guarantee and therefore no trust that PHI is processed ethically in Digital Health Ecosystems. This raises new ethical, privacy and trust challenges to be solved. The authors' objective is to find a combination of ethical principles and privacy and trust models that together enable the design and implementation of DHIS that act ethically, are trustworthy, and support the user's privacy needs. Research published in journals, conference proceedings, and standards documents is analyzed from the viewpoints of ethics, privacy and trust. In that context, systems theory and systems engineering approaches together with heuristic analysis are deployed. The ethical model proposed is a combination of consequentialism, professional medical ethics and utilitarianism. Privacy enforcement can be facilitated by defining privacy as a health-information-specific contextual intellectual property right, whereby a service user can express their own privacy needs using computer-understandable policies. Privacy as a dynamic, indeterminate concept, and computational trust, are thereby handled with linguistic values and fuzzy mathematics. The proposed solution, combining ethical principles, privacy as intellectual property and computational trust models, shows a new way to achieve ethically acceptable, trustworthy and privacy-enabling DHIS and Digital Health Ecosystems.
Keywords: computational privacy; ethical design; ethics; fuzzy logic; models; privacy; trust
Year: 2020 PMID: 32357446 PMCID: PMC7246854 DOI: 10.3390/ijerph17093006
Source DB: PubMed Journal: Int J Environ Res Public Health ISSN: 1660-4601 Impact factor: 3.390
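The abstract proposes letting a service user express privacy needs as computer-understandable policies. A minimal sketch of that idea follows; the rule vocabulary (purpose, recipient) and the default-deny evaluation are illustrative assumptions, not the authors' actual policy schema.

```python
# Illustrative sketch: a data subject's privacy policy as machine-readable
# rules evaluated against a data-use request. The attribute names and the
# default-deny semantics are assumptions made for this example only.
from dataclasses import dataclass

@dataclass(frozen=True)
class PolicyRule:
    purpose: str     # e.g. "treatment", "research", "marketing"
    recipient: str   # e.g. "physician", "insurer", or "any"
    allow: bool

@dataclass(frozen=True)
class Request:
    purpose: str
    recipient: str

def evaluate(rules, request, default=False):
    """Return the decision of the first matching rule; deny by default."""
    for rule in rules:
        if rule.purpose == request.purpose and rule.recipient in (request.recipient, "any"):
            return rule.allow
    return default

# The data subject's own policy: care and research are allowed, marketing is not.
my_policy = [
    PolicyRule("treatment", "physician", allow=True),
    PolicyRule("research", "any", allow=True),
    PolicyRule("marketing", "any", allow=False),
]

ok = evaluate(my_policy, Request("treatment", "physician"))
denied = evaluate(my_policy, Request("marketing", "insurer"))
```

First-match semantics with a deny default mirrors common access-control policy languages; a real Digital Health Ecosystem would need far richer conditions (context, retention, obligations) than this toy rule shape.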
Widely used ethical models and their weaknesses.
| Category | Ethical Model | Focus/Goals | Problems |
|---|---|---|---|
| Normative ethics | Consequentialism | Focus is on the consequences of actions. Choices that bring more value are the morally right ones. | Difficult to know or calculate the consequences of acts in advance. |
| Normative ethics | Utilitarianism | The morally right action is the one that produces the most overall good or wellbeing (e.g., happiness, welfare) and minimizes overall harm. | Difficult to measure and compare the impact of acts on happiness or harm. |
| Normative ethics | Deontology | Choices cannot be justified by their effects. An action is good based on its characteristics. Actions should follow moral rules and laws. Duty is the highest value. | Ignores the consequences of actions. |
| Normative ethics | Virtue ethics | Virtue (e.g., honesty, attitude) requires wisdom. The virtues and character traits of a person enable moral actions. | Based on personal character. There is no agreement on what the virtues are. People are not always honest. |
| Applied ethics | Computer ethics | Impacts of information technology on human values, and formulation of policies for the ethical use of information systems. | Impacts are difficult to measure. Only high-level principles are offered. |
| Applied ethics | Information ethics | Ethical and moral issues arising from the development and use of information and information technologies. | Rules and principles are difficult to implement in information and communication technology (ICT) environments. |
| Applied ethics | Professional ethics | Personal and corporate standards of behavior expected of professionals. | Standards are not global. Only high-level principles. |
| Applied ethics | Business ethics | Moral principles that guide the way a business behaves. | Real principles are seldom known by the customer; the stockholder model dominates. |
Common privacy models and their problems.
| Privacy Model | Features | Weaknesses in Digital Health Ecosystem |
|---|---|---|
| Westin and Altman models | Protection by limiting others' access to oneself. Selective control of access to the self. | Health data are collected and used invisibly. Personal control is nearly impossible. |
| Communication privacy management theory (Petronio) | Privacy has boundaries. Regulation of the degree of boundary permeability using rules. | There are no boundaries in a Digital Health Ecosystem. |
| Privacy as contextual integrity | Contexts (e.g., health care, the Internet) have their own principles and norms regulating information flow inside and between contexts. | Contexts are dynamic and virtual. Different contexts lead to different privacy solutions. Stronger parties can define their own norms and controls for information flow. |
| Online privacy | Continuous protection of personal information in online activities. | Stakeholders' privacy features are often unknown or unreliable. Privacy approaches based on social norms and laws are ineffective. |
| Privacy as social issue | Privacy is a social value. Personal privacy needs must be balanced with public, organizational and business interests. | Governmental and industrial needs often override personal needs for privacy. |
| Privacy as fuzzy concept | A human approach to privacy using fuzzy mathematics. | Difficult to collect reliable input data. The output of some methods is crisp. |
Widely used trust models and their weaknesses.
| Model | Feature | Weaknesses in Digital Health Ecosystem |
|---|---|---|
| Disposition to trust | A general willingness to depend on others. | A general personal tendency to trust is unreliable. |
| Organizational (institutional) trust | Confidence that an organization has the promised trust features and will perform beneficial actions. | Trust features are seldom known or measured, but are based on beliefs about implementations. |
| Recommended trust | Based on belief in others' recommendations. | Recommendations are typically based on quality/cost, not on information privacy. |
| Trusting belief | Subjective belief that a trustee has beneficial features. | Belief cannot be used as the basis of a decision. |
| Fuzzy approach to trust | Qualitative approach to trust using natural language. The trust value is computed using fuzzy rules. | Collection of input data can be demanding. Some methods require crisp input. Determination of fuzzy rules requires expertise. |
| Computational trust | Mathematical methods are used to calculate a trust value/rank from attributes. | Attributes are difficult to measure and seldom available. |
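The last two rows describe computing a trust value from linguistic ratings via fuzzy rules. A minimal sketch of that technique follows; the attribute names, weights, and triangular membership functions are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: linguistic trust ratings ("low"/"medium"/"high")
# mapped to fuzzy sets on [0, 1], aggregated, and defuzzified by a
# discrete centroid. Attributes and weights are assumptions for this example.

def tri(x, a, b, c):
    """Triangular membership function peaking at b over (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for trust on the unit interval.
TERMS = {
    "low":    lambda x: tri(x, -0.01, 0.0, 0.5),
    "medium": lambda x: tri(x, 0.0, 0.5, 1.0),
    "high":   lambda x: tri(x, 0.5, 1.0, 1.01),
}

def trust_score(ratings, weights):
    """Defuzzify weighted linguistic ratings into a crisp trust value."""
    num = den = 0.0
    for i in range(101):
        x = i / 100
        # Aggregate: weighted maximum of each rated term's membership at x.
        mu = max(weights[attr] * TERMS[term](x) for attr, term in ratings.items())
        num += x * mu
        den += mu
    return num / den if den else 0.0

# Hypothetical ratings of a service in a Digital Health Ecosystem.
ratings = {"transparency": "high", "reputation": "medium", "security": "high"}
weights = {"transparency": 1.0, "reputation": 0.5, "security": 1.0}
score = trust_score(ratings, weights)
```

Note the table's caveat in action: the output of this method is crisp (a single number), which discards the uncertainty the linguistic input expressed; real fuzzy-trust schemes also need expert-defined rules rather than the simple weighted maximum used here.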
Figure 1. Representation of an ethically acceptable, privacy-preserving and trustworthy Digital Health Ecosystem accredited by ISO 23903 (after [77]).
Figure 2. Adapted HL7/OMG Authorization Reference Model (after [4]).