27 March 2025
Authored by Dr Erica Harper and Timo Istace, our recent report, 'Neurotechnology and Human Rights: An Audit of Risks, Regulatory Challenges, and Opportunities', offers a deep dive into the human rights implications of neurotechnology, focusing on both therapeutic and commercial applications. It identifies six critical human rights areas at risk from neurotechnology advancements: discrimination, freedom of thought, privacy, rights within the criminal justice system, mental and bodily integrity, and workplace rights. For each of these, the paper outlines the relevant human rights frameworks, potential impacts, and associated risks, and proposes actionable recommendations for governments to safeguard these rights.
Given the complex and rapidly evolving nature of neurotechnology, the authors emphasize the challenges in crafting effective regulatory frameworks. They highlight that while enforceable domestic laws are essential for protecting human rights, states face significant technical and political hurdles in developing such legislation. An arguably more feasible option is the development of non-binding guidance that could serve as a normative baseline for policy development, foster international coordination, and promote consistent approaches to neurotechnology regulation, while still allowing for advancement and innovation.
The paper also addresses ethical concerns, such as the risk of normalizing neuroenhancement and exacerbating ableism. It calls for a proactive approach to ensure neurotechnology does not inadvertently lead to societal harm, including the violation of fundamental rights or the creation of new forms of inequality.
Erica Harper explained, 'As neurotechnology advances, it is crucial that we safeguard human rights by fostering international cooperation and establishing a regulatory framework that ensures innovation does not come at the cost of dignity, autonomy, and equality. To effectively address the risks, policymakers must prioritize the development of a comprehensive regulatory framework that balances innovation with the protection of fundamental human rights, ensuring that technological progress does not undermine individual freedoms and equality.'
Our event brought together human rights practitioners, data scientists, and AI experts to explore how artificial intelligence can support efforts to monitor human rights and the Sustainable Development Goals.
Our research brief, Neurotechnology and Human Rights: An Audit of Risks, Regulatory Challenges, and Opportunities, examines the human rights implications of neurotechnology in both therapeutic and commercial applications.
This open discussion will consider the strengthening of international labour rights and human rights standards, with a focus on freedom of association.
This evening dialogue will present the publication International Human Rights Law: A Treatise (Cambridge University Press, 2025).
This training course, specifically designed for staff of city and regional governments, will explore the means and mechanisms through which local and regional governments can engage with the recommendations of international human rights bodies and integrate them into their concrete work at the local level.
This training course will explore the origin and evolution of the Universal Periodic Review (UPR) and its functioning in Geneva, with a focus on the implementation of UPR recommendations at the national level.
The Treaty Body Members’ Platform connects experts in UN treaty bodies with each other as well as with Geneva-based practitioners, academics and diplomats to share expertise, exchange views on topical questions and develop synergies.
This project aims to provide support to the UN Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association, Clément Voulé, by addressing emerging issues affecting civic space and developing tools and materials that allow various stakeholders to promote and defend civic space.