Gianmarco Baldini, Maarten Botterman, Ricardo Neisse, Mariachiara Tallacchini.
Abstract
Even though public awareness of privacy risks on the Internet is increasing, these risks are likely to become even more relevant as the Internet evolves into the Internet of Things (IoT), due to the large amount of data collected and processed by the "Things". The business drivers for monetizing such data are one of the challenges identified in this paper for the protection of privacy in the IoT. Beyond the protection of privacy, this paper highlights the need for new approaches that grant a more active role to the users of the IoT and that address other potential issues such as the digital divide or safety risks. A key facet of ethical design is the transparency of the technology and services in how they handle data, as well as the provision of choice for the user. This paper presents a new approach to users' interaction with the IoT, based on the concept of Ethical Design implemented through a policy-based framework. In the proposed framework, users are given wider control over personal data and IoT services by selecting specific sets of policies, which can be tailored to users' capabilities and to the contexts in which they operate. The potential deployment of the framework in a typical IoT context is described, together with the identification of the main stakeholders and the processes that should be put in place.
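The abstract's idea of users selecting policy sets tailored to their capabilities can be illustrated with a minimal sketch. All names here (`Policy`, `PolicyProfile`, `make_profile`, the data categories) are hypothetical illustrations, not taken from the paper or from SecKit: a user picks one named profile as a whole, and the framework checks each data-access request against it, denying by default anything the profile does not cover.

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    """A single rule: allow or deny access to one data category."""
    data_category: str  # e.g., "location", "health"
    allow: bool

@dataclass
class PolicyProfile:
    """A named set of policies a user selects as a whole, tailored to
    the user's capabilities (e.g., a cautious 'novice' default versus
    a more permissive 'expert' profile)."""
    name: str
    policies: dict = field(default_factory=dict)  # category -> Policy

    def decide(self, data_category: str) -> bool:
        # Deny by default when no policy covers the requested category.
        policy = self.policies.get(data_category)
        return policy.allow if policy else False

def make_profile(name, allowed, denied):
    profile = PolicyProfile(name)
    for category in allowed:
        profile.policies[category] = Policy(category, True)
    for category in denied:
        profile.policies[category] = Policy(category, False)
    return profile

# A cautious default for non-expert users, and a more permissive
# profile an expert user might select instead.
novice = make_profile("novice", allowed=["temperature"], denied=["location", "health"])
expert = make_profile("expert", allowed=["temperature", "location"], denied=["health"])

print(novice.decide("location"))  # False: denied by the selected profile
print(expert.decide("location"))  # True: this profile permits it
print(expert.decide("contacts"))  # False: deny-by-default for uncovered categories
```

Selecting whole profiles rather than individual rules is one way to address the digital-divide challenge the paper raises: a user does not need to understand every rule to make a meaningful choice.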
Keywords: Agency; Ethics; Personalization; Privacy; Users’ empowerment
Year: 2016 PMID: 26797878 PMCID: PMC5972157 DOI: 10.1007/s11948-016-9754-5
Source DB: PubMed Journal: Sci Eng Ethics ISSN: 1353-3452 Impact factor: 3.525
Internet of Things challenges

| # | Challenge | Description |
|---|---|---|
| 1 | Economic incentives for data protection are not directed at the user | Economic incentives for protecting users' data accrue to the businesses creating the IoT applications and devices, not to the users themselves |
| 2 | Incomplete information on the consequences of data disclosure | The user often has incomplete information about the consequences of disclosing data, whether voluntarily (e.g., providing data) or involuntarily (e.g., collection of position information). This lack of information affects each privacy decision. Incomplete information can also be a consequence of the user's limited perception (e.g., the digital-divide problem). In the IoT this issue could be more relevant than on the Internet, as physical-world information (e.g., physical position) can enlarge the information space |
| 3 | Too large an information space about the consequences of data disclosure | The complete set of information needed to make a rational choice could be so large that the user cannot access the IoT service in an effective way |
| 4 | Psychological biases | For example, the perception of immediate benefits (e.g., free access to an IoT service or application) can outweigh the long-term negative impact (e.g., risks to users' privacy) |
| 5 | Trade-offs between business needs to collect and process data and the right to privacy | There is a tension between the market's need for data collection and correlation, which supports innovation and the business success of IoT systems and applications (in both the public and private sectors), and the protection of users' data. While governments (e.g., regulatory bodies) may tilt this balance in one direction or the other, one significant challenge is to design and apply regulations in a very dynamic environment where the market life-cycle of IoT applications can be much shorter than the regulatory process |
| 6 | Cost of implementing privacy-enhancing or data-protection solutions | The costs of implementing privacy-enhancing technologies (PETs) or other solutions that ensure proper care in the collection, storage and retrieval of data. Who is going to bear these costs? Possible answers include users' willingness to pay for the service, or the political will to ensure societal guarantees enforced through legislation |
| 7 | Accountability | The accountability of IoT applications regarding users' privacy. Who is going to be legally accountable for users' data? As seen in recent events, a data breach can be extremely damaging to a company from an economic point of view. Are PET producers responsible for privacy breaches, or the providers of the application in which the PET is applied? Or the users themselves? |
| 8 | On-line and off-line identity | It is difficult to separate on-line information from off-line information, and their linkage can generate privacy breaches |
| 9 | Digital divide | Users have different sets of capabilities in accessing IoT devices and applications. Depending on their level of technical proficiency, users have different perceptions of the privacy risks and different understandings of the requests sent to them through the IoT |
| 10 | Conformance to regulatory frameworks | The definition of, implementation of and conformance to regulations in this context can be hampered by two factors: (1) the evolution of the IoT can be faster than the regulatory process itself, so that regulations may be only moderately effective by the time they are enforced; (2) already deployed IoT systems and devices may require significant rework or replacement (e.g., recall of IoT devices), which can be very expensive for companies |
| 11 | Support for dynamic contexts | The use of IoT services and devices, and the processing and storage of personal data, may change depending on the context, as recommended by Nissenbaum |
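Challenge 11 suggests that the same data category may warrant different decisions in different contexts. A minimal sketch of that idea (all names and context/category pairs here are hypothetical illustrations, not from the paper) keys each policy decision on a (context, data category) pair rather than on the category alone:

```python
# Hypothetical context-dependent policy table:
# the same data category can be disclosed in one context and not another.
CONTEXT_POLICIES = {
    # (context, data_category) -> allow?
    ("home", "location"): False,    # do not disclose position while at home
    ("driving", "location"): True,  # a navigation service may use position
    ("home", "energy_usage"): True,
}

def decide(context: str, data_category: str) -> bool:
    # Deny by default when the (context, category) pair is not covered.
    return CONTEXT_POLICIES.get((context, data_category), False)

print(decide("driving", "location"))  # True
print(decide("home", "location"))     # False
```

A real framework would also need a trustworthy way to determine the current context; the sketch simply takes it as an input.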
Fig. 1 Enforcement Architecture
Fig. 2 SecKit expert GUI for security policy profile authoring
Fig. 3 Pictorial description of the policy framework to support ethical design