Neurotechnologically Augmented Lawyers? Billable Units of Attention? New Study Sees Threats and Opportunities for Neurotechnology and the Law

The rise of artificial intelligence has led some to speculate that robots could someday replace lawyers in performing many legal tasks. But what if chips in lawyers’ brains — or some other form of neurotechnological augmentation — could enable them to combine the benefits of AI systems while retaining desirable human capacities to create a kind of super lawyer?

If that sounds far-fetched, consider that neurotechnologies are already being used in medicine to treat neurological conditions such as Parkinson's disease and in the workplace to monitor workers' attention, and that investors such as Elon Musk are backing neurotechnology, as are companies such as Meta, Facebook's parent.

Now, a new report commissioned by the Law Society of England and Wales, Neurotechnology, Law and the Legal Profession, cautions that neurotechnological advances in decades to come could require lawyers to grapple with the implications of brain monitoring and manipulation in human rights, criminal law, employment law, and even law practice itself. 

“This tech is coming, and we need to think about regulation,” said Dr. Allan McCay, author of the report. “Action is needed now as there are significant neurotech investors such as Elon Musk and Meta (Facebook). We need law reform bodies, policymakers and academics to be scrutinizing these technological advances rather than waiting for problems to emerge.

“To take criminal law as an example, numerous questions emerge,” said McCay, who is deputy director of The Sydney Institute of Criminology and an academic fellow at the University of Sydney’s Law School. “One might ask which bit of conduct constitutes the actus reus (criminal act) where a person injures another by controlling a drone by thought alone.”

Neurotechnologies are technologies that interact directly with the brain or nervous system, either by monitoring and recording neural activity or by acting to influence it. The technology can be in the form of a brain implant or external device such as a headset, wristband or helmet.

Already, the report says, people can connect their brains directly to the internet and post to social media without bodily action, while others are able to control drones by way of a brain-computer interface.

For the legal profession, the report says, this evolving technology raises questions on a number of fronts:

  • Regulation. What approach should governments take to regulating neurotechnology? Regulating this technology while it is still nascent offers the opportunity to influence its direction before it has wide uptake, the report says, but could be tricky as it is not yet clear how any problematic issues could play out. At the same time, regulation runs the risk of inhibiting valuable innovation. And then there is the question of exactly which body or bodies would be right to regulate this technology.
  • Legal doctrine. The potential implications of neurotechnology for legal doctrine are far too broad to be covered in a single report, the author concludes, so he focuses on criminal law as an illustration. If posting revenge porn is a crime when committed by controlling a mouse or trackpad, for example, what happens when the posting is accomplished through only a mental act? Has criminal conduct occurred? Or what if a hacker is able not only to control the brain-connected device, but somehow to stimulate the connected person’s brain to cause them to act? For those who are convicted of crimes, might brain monitoring replace ankle monitors?
  • Human rights. Various individuals and groups are calling for the recognition of “neurorights,” the report says. These include tenets such as the right to personal identity, the right to free will, the right to mental privacy, the right to equal access to mental augmentation, and the right to protection from algorithmic bias.
  • Legal education. Given the pace of neurotechnological change, legal educators should take an “anticipatory stance” and begin to incorporate scenarios that prepare law students for thinking about the implications of this technology for their future careers. Thought should also be given to the possibility “that law students themselves may make use of neurotechnologies to assist with their studies.”
  • The legal profession. Neurotechnology is likely to impact legal practice in any number of ways, the report speculates. Firms will develop neurotechnology practices. Lawyers may someday augment their own cognitive capabilities using neurotechnology. The day could even come when it would be negligent for a lawyer not to augment. Clients may demand that lawyers bill not by their time, but by their neurotechnologically monitored “billable units of attention.”

By design, the report raises a range of questions about the legal implications of this emerging technology without offering specific answers, and it points to potential opportunities while also highlighting risks. The bottom line is a call to action for legal professionals and policymakers to begin planning for a future that may not be too far off.

“The question of how to make the most of the coming opportunities may well be one that merits further discussion and strategic thinking,” McCay concludes in the report.

“Overall, neurotechnology is likely to have an increasing impact on society and thus on the law and the profession. In order to best respond to the challenges and opportunities, the first step is to find out about neurotechnology, and hopefully this report is a useful starting point. The next step is to think about what to do and again it is to be hoped that this report will provide stimulus leading to action.”

“This thought-provoking report sets out some of the many opportunities that could arise from developments in neurotech,” said Law Society of England and Wales president I. Stephanie Boyce.

“It also sets out some of the challenges that lawyers may need to grapple with now and in the future. As the report makes clear, neurotechnology could greatly improve the lives of many but also facilitate ethical failures and even human rights abuses.”