Summary: Brain-computer interfaces are being tested to help people with neuromuscular disorders regain daily functions such as mobility and communication. The military is developing BCI technology to help service members respond quickly to threats. Researchers are investigating ethical issues related to the use of neurotechnologies such as BCIs on the battlefield.
Source: The Conversation
Imagine a soldier having a tiny computer injected into their bloodstream that can be guided with a magnet to specific regions of the brain. With training, the soldier could then control weapon systems thousands of miles away using their thoughts alone.
Embedding a similar type of computer into a soldier’s brain could suppress their fear and anxiety, allowing them to complete combat missions more efficiently. Going one step further, a device equipped with an artificial intelligence system could directly control a soldier’s behavior by predicting what options they would choose in their current situation.
While these examples may sound like science fiction, the science behind neurotechnologies like these is already under development. Brain-computer interfaces, or BCIs, are technologies that decode brain signals and transmit them to an external device, which carries out a desired action. Basically, a user just has to think about what they want to do, and a computer does it for them.
BCIs are currently being tested in people with severe neuromuscular disorders to help them recover daily functions like communication and mobility. For example, patients can turn on a switch by visualizing the action and having a BCI decode their brain signals and transmit them to the switch. Similarly, patients can focus on specific letters, words, or phrases on a computer screen, and a BCI moves a cursor to select them.
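To make that decoding step concrete, here is a minimal sketch in Python of how such a pipeline might look. It is an illustration, not a description of any clinical system: the single EEG channel, the 250 Hz sample rate, the beta-band feature, and the fixed threshold standing in for a trained decoder are all assumptions made for the example, and the function names are hypothetical.

```python
# Illustrative sketch of a BCI decoding loop (not any specific clinical system).
# Assumptions: one EEG channel sampled at 250 Hz, and a preset threshold on
# beta-band (13-30 Hz) power standing in for a classifier trained on the patient.
import numpy as np

SAMPLE_RATE_HZ = 250
WINDOW_SECONDS = 1.0

def band_power(window: np.ndarray, low_hz: float, high_hz: float) -> float:
    """Mean spectral power of the signal within [low_hz, high_hz]."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE_HZ)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return float(spectrum[mask].mean())

def decode_intent(window: np.ndarray, threshold: float = 1000.0) -> bool:
    """Crude stand-in for a trained decoder: 'switch on' if beta power is high.
    The threshold is arbitrary and chosen only to separate the synthetic data below."""
    return band_power(window, 13.0, 30.0) > threshold

def set_switch(on: bool) -> None:
    """Stand-in for transmitting the decoded command to the external device."""
    print("switch ON" if on else "switch OFF")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = int(SAMPLE_RATE_HZ * WINDOW_SECONDS)
    t = np.arange(n) / SAMPLE_RATE_HZ
    # Synthetic 'rest' window: broadband noise only.
    rest = rng.normal(0.0, 1.0, n)
    # Synthetic 'imagined action' window: noise plus a strong 20 Hz rhythm.
    active = rest + 5.0 * np.sin(2 * np.pi * 20.0 * t)
    set_switch(decode_intent(rest))    # expected: switch OFF
    set_switch(decode_intent(active))  # expected: switch ON
```

In a real system the thresholding step would be replaced by a classifier trained on the patient’s own recordings, but the overall flow is the same: signal in, decoded intent out, command to the device.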
However, ethical considerations have not kept pace with science. While ethicists have pushed for more ethical investigation of neural modification in general, many practical questions around brain-computer interfaces have not been fully considered.
For example, do the benefits of BCI outweigh the substantial risks of brain hacking, information theft, and behavior control? Should BCI be used to curb or strengthen specific emotions? What effect would BCIs have on the moral agency, personal identity, and mental health of their users?
These questions are of great interest to us, a philosopher and a neurosurgeon who study the ethics and science of current and future applications of BCI. Considering the ethics of this technology before it is implemented could help prevent its potential harms. We argue that responsible use of BCI requires protecting people’s ability to function in a range of ways considered essential to being human.
Extending BCI beyond the clinic
Researchers are exploring non-medical brain-computer interface applications in many fields, including gaming, virtual reality, performance art, warfare, and air traffic control.
For example, Neuralink, a company co-founded by Elon Musk, is developing a brain implant so healthy people can potentially communicate wirelessly with anyone with a similar implant and computer setup.
In 2018, the US military’s Defense Advanced Research Projects Agency (DARPA) launched a program to develop “a safe, portable neural interface system capable of reading from and writing to multiple points in the brain at once”. Its goal is to produce viable nonsurgical BCIs for national security applications by 2050.
For example, a soldier in a special forces unit could use BCI to send and receive thoughts with another soldier and a unit commander, a form of three-way direct communication that would allow real-time updates and faster response to threats.
To our knowledge, these projects have not been accompanied by public debate about the ethics of these technologies. While the US military recognizes that “negative public and societal perceptions will need to be overcome” to successfully implement BCI, practical ethical guidelines are needed to better evaluate proposed neurotechnologies before deploying them.
Utilitarianism
One approach to addressing the ethical issues raised by BCIs is utilitarianism, an ethical theory that strives to maximize the happiness or well-being of everyone affected by an action or policy.
Enhancing soldiers could create the greatest good by improving a nation’s warfighting capabilities, protecting military assets by keeping soldiers out of harm’s way, and maintaining military readiness. Utilitarian advocates of neuroenhancement argue that emerging technologies like BCIs are morally equivalent to other widely accepted forms of brain enhancement. For example, stimulants like caffeine can improve brain processing speed and memory.
However, some worry that utilitarian approaches to BCI have moral blind spots. Unlike medical applications designed to help patients, military applications are designed to help a nation win wars. In the process, BCIs may violate individual rights, such as the right to be mentally and emotionally healthy.
For example, soldiers operating drones in remote warfare today report higher levels of emotional distress, post-traumatic stress disorder, and broken marriages than soldiers in the field. Of course, soldiers routinely choose to sacrifice themselves for the greater good. But if neuroenhancement becomes a job requirement, it could raise unique concerns about coercion.
Neurorights
Another approach to the ethics of BCI, neurorights, privileges certain ethical values even when doing so does not maximize overall welfare.
Neurorights advocates defend individuals’ rights to cognitive liberty, mental privacy, mental integrity, and psychological continuity. A right to cognitive liberty would bar unreasonable interference with a person’s mental state. A right to mental privacy might require the guarantee of a protected mental space, while a right to mental integrity would prohibit specific harms to a person’s mental states. Finally, a right to psychological continuity could protect a person’s ability to maintain a coherent sense of self over time.

BCIs could interfere with neurorights in various ways. For example, if a BCI alters how the world appears to a user, the user might not be able to distinguish their own thoughts or emotions from altered versions of themselves. This could violate neurorights such as mental privacy or mental integrity.
Yet soldiers already forfeit similar rights. For example, the US military is allowed to restrict soldiers’ freedom of speech and free exercise of religion in ways that are not typically applied to the general public. Would infringing on neurorights be any different?
Human capabilities
A human capabilities approach holds that safeguarding certain human capabilities is crucial to protecting human dignity. While neurorights focus on an individual’s capacity to think, a capabilities view considers a broader range of what people can do and be, such as the ability to be emotionally and physically healthy, to move freely from place to place, to relate to others and to nature, to exercise the senses and imagination, to feel and express emotions, to play and recreate, and to regulate the immediate environment.
We find a capabilities approach compelling because it gives a more robust picture of humanity and of respect for human dignity. On this view, we argue that proposed BCI applications must reasonably protect all of a user’s central capabilities at a minimal threshold. BCIs designed to enhance capabilities beyond the typical human range should be deployed in a way that realizes the user’s goals, not just those of others.
For example, a two-way BCI that not only extracts and processes brain signals but also provides somatosensory feedback to the user, such as sensations of pressure or temperature, would pose unreasonable risks if it interfered with the user’s ability to trust their own senses. Likewise, any technology, including BCIs, that controls a user’s movements would undermine the user’s dignity if it did not allow the user to override it.
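That second point lends itself to a small illustration. The following is a minimal sketch, with hypothetical names and no relation to any real device’s API, of a control loop in which a decoded movement command can never outrank an override the user can always engage:

```python
# Illustrative sketch of the override principle discussed above (hypothetical
# names; not any real device's API). A movement command decoded from brain
# signals is applied only if the user has not engaged a manual override.
from dataclasses import dataclass

@dataclass
class MovementCommand:
    direction: str    # e.g. "forward", "left"
    magnitude: float  # normalized effort, 0.0-1.0

class AssistiveController:
    def __init__(self) -> None:
        self.override_engaged = False  # user-controlled kill switch

    def engage_override(self) -> None:
        """A hardware or voice-activated control the user can always reach."""
        self.override_engaged = True

    def release_override(self) -> None:
        self.override_engaged = False

    def apply(self, cmd: MovementCommand) -> str:
        # Dignity-preserving design: the decoded command never outranks
        # the user's explicit refusal.
        if self.override_engaged:
            return "command suppressed: user override engaged"
        return f"moving {cmd.direction} at {cmd.magnitude:.0%} effort"

if __name__ == "__main__":
    controller = AssistiveController()
    print(controller.apply(MovementCommand("forward", 0.5)))  # executes
    controller.engage_override()
    print(controller.apply(MovementCommand("forward", 0.5)))  # suppressed
```

The design choice being illustrated is simply that the user’s explicit refusal is checked before any decoded command is executed, so the technology can assist movement without ever being able to compel it.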
One limitation of a capabilities view is that it can be difficult to define what counts as a threshold level of a capability. Nor does the view say which new capabilities are worth pursuing. Yet neuroenhancement could alter what counts as a standard threshold, and could eventually introduce entirely new human capabilities. Addressing this requires supplementing a capabilities approach with a fuller ethical analysis designed to answer these questions.
About this neuroethics and neurotech research news
Author: Nancy S. Jecker and Andrew Ko
Source: The Conversation
Contact: Nancy S. Jecker and Andrew Ko – The Conversation
Image: The image is in the public domain