27 March 2025 | 14 minute read

Ethical and Legal Challenges of Neurotech

The increasing interest and investment in the development of neurotechnology brings applications previously known only from science fiction within reach. Perhaps the most striking example is brain-computer-interfaces – also known as BCIs – which enable direct communication between the brain and external devices. They detect and interpret the brain’s electrical signals, converting them into commands for computers or other devices and allowing interaction through mental processes without physical movement. This gives BCIs immense potential for medical and technological advancement. Recent years have seen paralysed people regain the ability to walk and patients with degenerative diseases, such as amyotrophic lateral sclerosis (ALS), text and use a computer using only their brainwaves. However, BCIs also pose significant risks and ethical challenges, with concerns ranging from health and privacy issues to the potential for misuse and unequal access. Given the seemingly endless possibilities of BCIs and the potentially severe risks they pose, it is crucial for nations and international organisations to develop robust and adaptable regulations to ensure their safe and ethical integration into society.

 

Functionality of Brain-Computer-Interfaces (BCIs)

BCIs vary according to their system of acquiring brain signals, as well as the directionality of the transmission of data between the brain and the computer. Understanding these technical differences and the possible consequences is paramount to establishing a functional regulatory scheme for these systems.

System of acquiring brain signals

BCI systems can be invasive, partially invasive or non-invasive, each with its own pros and cons. As the name suggests, invasive BCIs involve surgery to implant electrodes directly into or on the brain. This method acquires high-resolution signals, which are critical for applications requiring precise control, but it comes with surgical risks. Some innovative approaches utilise technologies common to other areas of the medical field – such as stents or catheters – to implant BCI systems inside the body but outside of the actual brain matter. These partially invasive procedures carry fewer of the risks associated with serious brain surgery. Non-invasive BCIs do not require any surgery and only use external sensors – such as those used in an EEG – to record brain activity. Due to the lack of an invasive procedure, this method is safer for the patient; however, it produces a weaker signal with more interference.

Directionality of Transmission

BCI signals can travel from the brain to a computer and vice versa. Most BCIs are unidirectional and operate from the brain to the computer, decoding neural activity and relaying the resulting commands to an external device (read-out BCIs). Some, however, operate in reverse, sending precise electrical stimulation to the brain without simultaneously detecting signals (write-in BCIs). Bidirectional BCIs combine both functions, being capable of sending information from the brain to a device and vice versa. A minimal sketch of these three data-flow patterns follows below.
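
The following short Python sketch is purely illustrative: the class names, the toy decoding rule and the stimulation values are assumptions made for this example and do not correspond to any real BCI system or API. It is meant only to make the read-out, write-in and bidirectional patterns tangible.

# Purely illustrative sketch: names and values are hypothetical, not a real BCI API.

from statistics import mean


class ReadOutBCI:
    """Brain -> computer: decode recorded neural activity into a device command."""

    def decode(self, neural_signal: list[float]) -> str:
        # Toy decoding rule based on average signal amplitude.
        return "cursor_left" if mean(neural_signal) < 0 else "cursor_right"


class WriteInBCI:
    """Computer -> brain: translate a therapeutic goal into a stimulation pattern."""

    def stimulate(self, amplitude_ma: float, frequency_hz: float) -> dict:
        # Toy stimulation descriptor; real devices enforce strict safety limits.
        return {"amplitude_mA": amplitude_ma, "frequency_Hz": frequency_hz}


class BidirectionalBCI:
    """Combines both directions: read neural activity, then write stimulation back."""

    def __init__(self) -> None:
        self.reader = ReadOutBCI()
        self.writer = WriteInBCI()

    def closed_loop_step(self, neural_signal: list[float]) -> dict:
        command = self.reader.decode(neural_signal)
        # Hypothetical closed-loop rule: stimulate only for one decoded intention.
        if command == "cursor_left":
            return self.writer.stimulate(amplitude_ma=1.5, frequency_hz=130.0)
        return {}


if __name__ == "__main__":
    simulated_epoch = [-0.2, 0.1, -0.4, 0.0, -0.1]  # stand-in for a recorded signal
    print(BidirectionalBCI().closed_loop_step(simulated_epoch))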

 

Areas of Application

The main area of application of BCIs is medical, supporting individuals with disabilities and treating symptoms of various diseases. However, non-medical purposes, such as cognitive enhancement, are increasingly entering discussions surrounding neurotechnology.

Medical Applications

Write-in BCIs aim to influence brain activity, usually to treat medical conditions. Invasive methods include deep brain stimulation (DBS), where electrodes implanted in the brain produce electrical impulses to treat symptoms of diseases such as Parkinson’s disease, epilepsy, Tourette syndrome, obsessive-compulsive disorder or tremor. Additionally, DBS is currently being studied as a potential treatment for, inter alia, chronic pain, dementia, depression and addiction. Another write-in method is spinal cord stimulation (SCS), where electrodes implanted on the spinal cord deliver electrical impulses to help alleviate pain. Whilst DBS and SCS cannot cure medical conditions, they can help lessen symptoms, especially where treatment through medication proves ineffectual.

Conversely, the main role of read-out BCIs is to capture and analyse neural data from the brain to interpret changes in intentions, behaviours, perceptions and cognitive states. This data can then be transmitted and used for various purposes, including helping to restore motor and language functions. These devices can help patients with severe paralysis regain some degree of independence by decoding their motor intentions, enabling them to operate various devices, such as robotic arms, wheelchairs or robotic exoskeletons. Additionally, read-out BCIs are used for real-time language translation and mood detection for those who have lost speech and communication abilities, for example due to degenerative diseases such as ALS. The newest trials have seen paralysed patients play video games, browse the internet, post on social media, send text messages and move a cursor on a laptop using only their thoughts. A simplified sketch of such a decoding step follows below.
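
As a rough illustration of the decoding step described above, the sketch below trains a simple linear classifier on simulated “motor intention” features and maps its prediction to a cursor command. The data, feature dimensions and command labels are invented for the purpose of this example; real read-out BCIs rely on carefully engineered signal processing and clinically validated decoders.

# Hypothetical illustration of a read-out decoding pipeline; all data is simulated.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(seed=42)

# Simulate band-power features for two imagined movements (e.g. left vs right hand).
n_trials, n_features = 200, 8
left = rng.normal(loc=0.0, scale=1.0, size=(n_trials, n_features))
right = rng.normal(loc=0.8, scale=1.0, size=(n_trials, n_features))

X = np.vstack([left, right])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = "left", 1 = "right"

# Train a simple linear decoder on the simulated feature vectors.
decoder = LinearDiscriminantAnalysis().fit(X, y)

# Decode a new (simulated) trial and translate the prediction into a device command.
new_trial = rng.normal(loc=0.8, scale=1.0, size=(1, n_features))
command = {0: "cursor_left", 1: "cursor_right"}[int(decoder.predict(new_trial)[0])]
print("decoded command:", command)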

When combined, these technologies can even allow paralysed individuals with spinal cord injuries to regain the ability to walk. Thanks to a read-out device implanted in the brain and a write-in device embedded in the spine, the severed section of the spinal cord can be bridged, restoring the communication between brain and spinal cord and allowing patients to control their legs again (a so-called Brain-Computer-Spinal-Interface (BCSI) or Brain-Spine-Interface (BSI)).1

Non-Medical Purposes

Another avenue of exploration is human enhancement, whereby BCIs are used on healthy individuals to augment, improve or extend human capabilities beyond what is considered the typical or baseline state. This can include enhancing cognitive functions like memory, attention, learning and problem-solving skills, as well as boosting physical capacities or sensory experiences. Areas of application could range from relaxation and gaming to pure convenience.

 

Risks and Ethical Challenges

One of the primary risks of invasive BCIs is the potential health impact of surgery and implantation, including infections, haemorrhages and brain tissue damage. The body’s immune response may even reject an implant entirely. Even after successful implantation, scar tissue on the brain can lead to a degradation of signal quality, whilst device malfunctions pose further safety concerns. Implanted hardware requires maintenance, and the rapid advance of technology makes hardware and software upgrades likely. Therefore, the management of abandoned hardware within the body, as well as methods for safe removal, must be considered. Furthermore, the broader implications of accessing and potentially altering mental processes are not fully understood.

Concerns surrounding autonomy and agency have also been voiced regarding write-in BCIs, which could potentially evolve to the point of sending signals that trigger behaviours the patient cannot control. They therefore carry the potential to subvert free will.

Moreover, read-out BCIs can acquire a user’s brain data – i.e. quantitative data about human brain structure, activity and function – as well as physiological and behavioural information, enabling them to make specific inferences about that user’s brain activity or thoughts. Besides the capabilities designed by the programmer, which could prove more intrusive than intended or needed, BCI devices are also susceptible to malware attacks such as malicious algorithms or “brain spyware”. Due to these capabilities, read-out BCIs present a potential privacy and security risk to users’ private data, including financial and facial information. This raises critical ethical concerns, making stringent security measures of utmost importance.

 

Regulation

Considering the increased threats to privacy and autonomy posed by the rapid development of BCI technology, calls for regulation and the establishment of new human rights to protect cognitive liberty, mental privacy, mental integrity and psychological continuity are growing louder.

International Approach

Whilst some nations have begun to tackle these questions, Germany, like most countries, has yet to develop a national strategy on how to regulate BCIs and other types of neurotechnology. Chile, on the other hand, has officially become the first country in the world to afford brain activity constitutional protection in the form of so-called “brain-” or “neuro-rights”.2 Spain has followed suit, adopting the “Digital Rights Charter”,3 a non-legally binding declaration which recognises the need for existing rights to adapt to the digital environment. Likewise, the French Charter for the Responsible Development of Neurotechnologies4 outlines ethical guidelines and principles relating to the advancement of neurotechnology. Across the Atlantic, individual U.S. states, including Colorado and California, have already enacted laws aimed at protecting the data found in a person’s brainwaves. It remains to be seen whether the USA will follow up on these developments at the federal level.

On a supranational level, the OECD has set the first international standard for governments and innovators to ensure that neurotechnology development and use are conducted responsibly.5 Similarly, UNESCO is working on a global framework on the ethics of neurotechnology.6 The draft emphasises many of the same points as the OECD, especially the need to ensure respect for human rights. It calls for clear prohibitions against harmful uses, robust oversight mechanisms and comprehensive yet agile regulations to keep pace with technological advancements. The UN General Assembly has expressed similar concerns and recommendations.7

EU Approach

The EU has identified BCIs as a frontier in the interaction between humans and technology that is in need of further regulation, with most of its institutions directing their focus toward BCIs. The EU Council released a research paper examining the capabilities and risks of BCIs.8 A report by an EU Parliament Committee recommends a sensitive, calibrated approach to the regulation of emerging neurotechnology, which includes both ethical frameworks and binding legal regulation.9 The León Declaration10, signed by all Telecommunications and Digital Ministers of the EU Member States, outlines the EU’s key commitments to ensure a human-centric approach to neurotechnology. It, too, emphasises the protection of fundamental rights, the promotion of digital inclusion, the fostering of a sustainable digital environment and the assurance of safety and security in the digital space.

Whereas the comprehensive safeguarding of fundamental rights in relation to neurotechnology is still a project of the future, certain European legal instruments already explicitly address BCIs.

Regulation of “write-in” BCIs, including NIBS devices

The introduction of the Medical Device Regulation (MDR)11 has sparked controversy within the EU about the classification of non-invasive products for brain stimulation (NIBS products) without an intended medical purpose. Unlike its predecessor, the Medical Devices Directive (MDD)12, the MDR explicitly addresses such devices and applies to write-in BCIs, as they modify neuronal activity.13 Unlike the purpose-based approach usually taken to determine whether something qualifies as a medical device, the MDR brings such products into its ambit regardless of their intended purpose. This makes it applicable to any write-in consumer BCI, including those meant for non-medical purposes, e.g. wellbeing, mental augmentation, recreation, leisure, convenience or authentication, even where such brain stimulation is achieved through non-invasive devices, i.e. NIBS products. Detailed requirements for the development, manufacture and sale of NIBS products without an intended medical purpose are provided, and the need for clinical evaluation and risk management is emphasised.14 Furthermore, write-in BCIs without an intended medical purpose are reclassified under the highest risk class (Class III)15, mandating stringent regulatory oversight. This elevates NIBS devices to the same risk level as invasive brain stimulation devices implanted inside the brain.

Critics argue that the evidence on which this reclassification is based is deeply flawed, with the assessment resting on incorrect statements about the safety of certain types of non-invasive brain stimulation that are not supported by scientific evidence.16 Although critics support requiring the certification of all NIBS devices as medical devices, they deem classification under a lower risk class appropriate. It is asserted that classification under risk Class III will lead to increased costs and significant delays in NIBS research and development in the EU. Furthermore, the accessibility of NIBS treatment for patients living in the EU will be reduced, disadvantaging EU citizens and increasing the risk of overuse of alternative treatments with more severe and established adverse effects to compensate for the lack of NIBS availability.

Regulation of read-out BCIs

Read-out devices which do not serve a medical purpose and do not modify neuronal activity do not fall within the ambit of the MDR, nor do other legal instruments target them explicitly. Instead, such devices are more likely to be regulated solely under general provisions, such as the General Product Safety Regulation (GPSR)17, as well as frameworks relating to privacy and data protection, such as the General Data Protection Regulation (GDPR), which protects neural data as personal data. To the extent that such devices incorporate artificial intelligence as an integral (safety) component, they could additionally fall under the AI Act18 or the proposed AI Liability Directive19.

Impact

This differentiation in regulation between write-in and read-out BCIs results in different requirements for their entry into the EU market. Whilst medical devices and other Class III items which fall under the MDR must undergo rigorous clinical assessment before being approved for placement on the market, devices which only fall within the ambit of the GPSR do not require an external pre-market assessment.

 

Conclusion

It remains to be seen how, and how fast, countries and supranational institutions will tackle the regulatory challenges presented by BCIs. A solution must be found which adequately protects individual rights without stifling innovation. What this solution will look like remains uncertain.

What is clear is that addressing the privacy and safety concerns surrounding the personal data which can (potentially) be accessed through BCIs has been universally identified as a top priority. Actors in the field of BCIs would, therefore, do well to develop stringent safety, privacy and ethics protocols from the outset of their endeavours and to document these correspondingly. Providing evidence of such measures, as well as of their efficacy, could very well give prudent companies a head start when the inevitable regulatory oversight is established.

Until then, it is advisable to keep an eye on developments at EU level, particularly surrounding the MDR. This can help stakeholders anticipate compliance obligations and adjust development procedures accordingly and early on.


1 E.g. Michael Roccati, “Paralysed man with severed spine walks thanks to implant”; David M’Zee, “Fatherhood joy for paralysis implant dad”; Gert-Jan Oskam, “Brain implants help paralysed man to walk again”.
2 “Neurorights in Chile”, Overview by The Neurorights Foundation
3 Spanish “Carta de Derechos Digitales”
4 French “Charter for the responsible development of Neurotechnologies”
5 Recommendation on Responsible Innovation in Neurotechnology, adopted in December 2019, supplemented by its Neurotechnology Toolkit, released in April 2024.
6 First draft of the Recommendation on the Ethics of Neurotechnology - UNESCO Digital Library
7 “Impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights”, Report of the Human Rights Council Advisory Committee.
8 “From vision to reality: Promises and risks of Brain-Computer Interfaces”, research paper authored by the EU Council’s Analysis and Research Team in September 2024.
9 Committee on Legal Affairs and Human Rights, “The brain-computer interface: new rights or new threats to fundamental freedoms?”, Doc. 15147, report, September 2020.
10 León Declaration on European Neurotechnology: A Human Centric and Rights-oriented Approach; the declaration can no longer be accessed, as the website appears to have been taken down.
11 Medical Device Regulation (MDR) (2017/745)
12 Medical Devices Directive (MDD) (93/42/EEC)
13 Section 6 Annex XVI to the Medical Device Regulation, entered into force in December 2022.
14 Common Specifications for products without an intended medical purpose (Annex VII Commission Implementing Regulation (EU) 2022/2346).
15 Art. 1(c) Commission Implementing Regulation (EU) 2022/2347, EUR-Lex.
16 See, for example, the European Society for Brain Stimulation (ESBS), “European reclassification of non-invasive brain stimulation as class III medical devices: A call to action”, Brain Stimulation: Basic, Translational, and Clinical Research in Neuromodulation.
17 General Product Safety Regulation (2023), EUR-Lex.
18 Regulation (EU) 2024/1689, EUR-Lex.
19 Proposal for a Directive of the European Parliament and of the Council on adapting non-contractual civil liability rules to artificial intelligence (AI Liability Directive), COM/2022/496 final, EUR-Lex 52022PC0496.