| Literature DB >> 33764258 |
Andreas Schönau, Ishan Dasgupta, Timothy Brown, Erika Versalovic, Eran Klein, Sara Goering.
Abstract
Neural devices have the capacity to enable users to regain abilities lost due to disease or injury - for instance, a deep brain stimulator (DBS) that allows a person with Parkinson's disease to regain the ability to fluently perform movements, or a Brain Computer Interface (BCI) that enables a person with spinal cord injury to control a robotic arm. While users recognize and appreciate the technologies' capacity to maintain or restore their capabilities, the neuroethics literature is replete with examples of concerns expressed about agentive capacities: A perceived lack of control over the movement of a robotic arm might result in an altered sense of feeling responsible for that movement. Clinicians or researchers being able to record and access detailed information about a person's brain might raise privacy concerns. A disconnect between previous, current, and future understandings of the self might result in a sense of alienation. The ability to receive and interpret sensory feedback might change whether someone trusts the implanted device or themselves. Inquiries into the nature of these concerns and how to mitigate them have produced scholarship that often emphasizes one issue - responsibility, privacy, authenticity, or trust - selectively. However, we believe that examining these ethical dimensions separately fails to capture a key aspect of the experience of living with a neural device. In exploring their interrelations, we argue that their mutual significance for neuroethical research can be adequately captured if they are described under a unified heading of agency. On these grounds, we propose an "Agency Map" which brings together the diverse neuroethical dimensions and their interrelations into a comprehensive framework. With this, we offer a theoretically grounded approach to understanding how these various dimensions are interwoven in an individual's experience of agency.
Keywords: Neuroethics; brain computer interfaces (BCIs); deep brain stimulation (DBS); personal identity; privacy; responsibility
Year: 2021 PMID: 33764258 PMCID: PMC8434765 DOI: 10.1080/21507740.2021.1896599
Source DB: PubMed Journal: AJOB Neurosci ISSN: 2150-7759
Four ethical dimensions in neurotechnology research.
| Dimensions | Agency Concerns | Quotes |
|---|---|---|
| Responsibility | | “I guess I didn’t concentrate hard enough. But it also may be that the measuring is not optimal.” |
| Privacy | | “I wouldn’t want anyone to know my feeling when I see my husband, or go to bed with my husband, … see my doctor, or any of those things” (unpublished, DBS user at MGH). |
| Authenticity | | “I did not like that at all … No, that clearly didn’t fit with who I am … it was really too much; that really wasn’t me, you know; I really felt as if there was someone [else] standing next to me … ” |
| Trust | | “Both subjects reported that the sensations elicited by electrical stimulation of the SI cortex felt ‘unnatural’ and unlike anything they had ever felt before” |
Each ethical dimension (responsibility, privacy, authenticity, trust) is aligned with sample core questions most strongly associated with the respective dimension and relevant quotes from individual end users.
Figure 1. The Agency Map shows how different dimensions of agency are integrated in a single user of neurotechnology: (1) authenticity as reflecting upon the past self and creating a future self that has continuity over time; (2) privacy as a function of negotiating other people’s access to private data, thoughts, body, and so on; (3) trust as the ability to discern and make use of sensory feedback received from or through the device; and (4) responsibility as the capacity to exercise control over an intentional action.
Qualitative Agentive Competency Tool (Q-ACT).
| Dimension | Agential Competency | Agency Inquiry Prompts |
|---|---|---|
| Responsibility | Exercising Control | Do you feel that the (device mediated) movements you perform are under your intentional control? Do you sometimes feel that you share control with the AI/device? To what extent do you feel responsible for those movements? |
| Privacy | Negotiating Access | Would you say that information about your behaviors, thoughts, or attitudes is becoming more or less private? Do you feel that you are active in determining who has access? Do you feel that having the device alters your privacy with respect to decision making in your life? Do you feel that others know more about the state of your physical body due to the presence of the device? Are you comfortable with that access? |
| Authenticity | Integrating Self | Has living with the device changed your perception of yourself? Have others noticed changes in you? Do you feel comfortable with these changes? Do you feel able to adequately shape the person you are now and want to be? |
| Trust | Fostering Self-Trust | Do you think the device works reliably and records/stimulates correctly? Does the feedback from the device feel natural/trustworthy? Do you trust yourself when you are using the device? Do you think that others trust you when you are using the device? |
Each dimension (responsibility, privacy, authenticity, trust) is aligned with a competency agents make use of and a series of prompts researchers and clinicians can use to ascertain how neurotechnologies impact these competencies.