Uploading the Mind: Could a Digital Brain Feel Pain?
Scientists may one day be able to use electronic copies of human brains to explore the nature of the mind. But is it ethical to make that e-mind suffer and die if one can resurrect it at will and delete any memory of the suffering?
Successfully emulating human or animal brains could pose many ethical challenges regarding the suffering these copies may undergo, a researcher says.
Scientists are pursuing several strategies to create intelligent software. In one, called "whole brain emulation" or "mind uploading," scientists would scan a brain in detail and use that data to construct a software model. When run on appropriate hardware, this model would essentially replicate the original brain.
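To make the "scan, model, run" idea concrete, here is a deliberately toy sketch in Python. Everything in it is invented for illustration: the LeakyNeuron class, the scan_data connectivity map and all the numbers are hypothetical stand-ins, and a real emulation would involve billions of neurons and far richer biophysics. It shows only the shape of the pipeline: recorded structure goes in, simulated dynamics come out.

```python
# Illustrative sketch only: hypothetical "scan data" (a connectivity map
# and per-neuron parameters) is loaded into a generic simulator, which
# then reproduces the network's dynamics.

import random

class LeakyNeuron:
    """A leaky integrate-and-fire neuron, a common simplified model."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak

    def step(self, input_current):
        """Advance one time step; return True if the neuron fires."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after a spike
            return True
        return False

def emulate(scan, steps=20):
    """Run a network reconstructed from hypothetical 'scan' data."""
    neurons = [LeakyNeuron(**params) for params in scan["neurons"]]
    spikes = [False] * len(neurons)
    for t in range(steps):
        # Background noise drives each neuron a little every step.
        currents = [random.uniform(0.0, 0.3) for _ in neurons]
        # Propagate last step's spikes through the scanned synapses.
        for pre, post, weight in scan["synapses"]:
            if spikes[pre]:
                currents[post] += weight
        spikes = [n.step(c) for n, c in zip(neurons, currents)]
        fired = [i for i, s in enumerate(spikes) if s]
        if fired:
            print(f"t={t:2d}: neurons {fired} fired")

# Hypothetical "scan": three neurons and their synaptic wiring.
scan_data = {
    "neurons": [{"threshold": 1.0}, {"threshold": 0.8}, {"threshold": 1.2}],
    "synapses": [(0, 1, 0.6), (1, 2, 0.9), (2, 0, 0.4)],
}
emulate(scan_data)
```

The open question the rest of this article turns on is whether a vastly scaled-up version of something like this could have experiences, rather than merely produce outputs.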
"This is future, hypothetical technology, but many people are optimistic about an eventual 'post-human' existence — and others, of course, are convinced this is absolutely impossible," said study author Anders Sandberg, a philosopher at Oxford University's The Future of Humanity Institute in England.
Ethics of mind uploads
Although it remains uncertain whether mind uploading is possible, Sandberg is now exploring the potential ethical consequences of software that can suffer.
"If one thinks whole brain emulation may be possible one day, then it seems plausible that an emulation could have a mind and moral rights," Sandberg told Live Science.
Sandberg has a background in computational neuroscience and has run computer simulations of neural networks, systems that mimic how brains work.
"One evening, when I turned off my computer as I left my office, I realized that I also was deleting a neural network," Sandberg recalled. "Was I actually killing something? I quickly realized that the network was simpler than the metabolic networks of the bacteria I was no doubt squashing just by walking on the floor, yet I saw that a sufficiently complex network might be worth caring about."
The creation of more-complex artificial networks will probably proceed in steps. Before anyone tries whole brain emulation of a human, scientists will likely first attempt whole brain emulations of animals. Indeed, some suggest that virtual lab animals could replace real animals in scientific and medical research.
Animal-brain emulations raise the important question of whether these copies can suffer. If so, virtual experiments on the e-minds carry ethical considerations. "If it is cruel to pinch the tail of biological mice, the same cruel impulse is present in pinching the simulated tail of an emulated mouse," Sandberg wrote online April 14 in the Journal of Experimental & Theoretical Artificial Intelligence.
"I think a moral person will try to avoid causing unnecessary suffering, even if it is in an animal. So if an emulated animal brain could feel suffering, we ought to avoid inflicting it if we can," Sandberg said.
However, "there will likely not be any agreement on whether software can suffer, and no easy way to prove it," Sandberg said. "I hence think we should use a 'better safe than sorry' strategy and assume that emulated animals might well have the same level of consciousness and sentience as the real animals, and treat them in ways that avoid pain. This may mean giving then virtual painkillers or leaving out pain systems from the simulations." [The 10 Greatest Mysteries of the Mind]
Human uploads
The questions raised by mind uploading multiply when it comes to human emulations. For instance, making several copies of one person poses legal challenges, Sandberg said.
"For example, contract law would need to be updated to handle contracts where one of the parties is copied. Does the contract now apply to both?" Sandberg said. "What about marriages? Are all copies descended from a person legally culpable of past deeds occurring before the copying?" he said, adding that the copies would have privileged information about each other that would make them obvious witnesses during a criminal or other trial.
In addition, "How should votes be allocated if copying is relatively cheap and persons can do 'ballot box stuffing' with copies? Do copies start out with equal shares of the original's property? If so, what about inactive backup copies? And so on. These issues are entertaining to speculate upon and will no doubt lead to major legal, social and political changes if they become relevant."
Even the act of creating a human emulation is ethically questionable. The process will most likely involve destroying the original brain, making the activity equivalent to assisted suicide with an unknown probability of "success."
Also, "early brain scans might be flawed, leading to brain-damaged emulations we have a duty to take care of," Sandberg said. Researchers may be ethically forbidden to pull the plugs on these emulations, and whether scientists can store them and try to make a better version is uncertain.
"Obviously, a human brain emulation that is suffering is as bad as a human suffering," Sandberg said. "We ought to respect emulated people, and hence treat them well. Even if we might harbor doubts about whether they really feel or deserve rights, it is better to assume they do."
Answering the question of whether software can suffer may require developing a human emulation whom "we can ask, 'Do you feel conscious? And are you in pain?'" Sandberg said. "At that point, I think we will start getting philosophically relevant information. I think we will not be able to resolve it just by reasoning. We have to build these systems."
It remains an open question whether it would be moral for a human emulation to voluntarily undergo very painful or even lethal experiments on the assumption that the suffering copy will be deleted and replaced with a backup. Current norms on self-experimentation would forbid such behavior on the grounds that certain activities are never acceptable for science, but Sandberg noted that views on what constitutes unacceptable suffering and risk have changed over time.
"Emulations can be instantiated several times, stopped, deleted, restored from backups and so on," Sandberg said. "This confuses many ethical systems.
"The issue here is that death is usually bad for several connected reasons. It might involve suffering, and it always is the irreversible stopping of experience and identity," Sandberg said. "But emulations can have partial deaths that do not seem as bad. One can imagine an emulation risking their life, being destroyed and then restored from backup minus the memories since last backup."
The questions posed by whole brain emulation suggest that people might want to prepare themselves "for some imminent, dramatic changes a few decades down the line," Sandberg said.
"There would be an option to escape biology and mortality, assuming one agreed emulations were a continuation of one's identity," Sandberg said. "The potential for chaos would be large — society needs to look ahead before the technology is perfected to maximize the chance of a good outcome."