Human Enhancement is Happening Whether You Like it or Not

Bionics are taking hold outside of straight-up medical needs. What are we so afraid of?
March 24, 2017, 3:38pm
Image: Andrew Cagle

To see Mark Muhn riding his bike around his neighborhood in California, you'd think he's just like anyone else using a recumbent bike. "People see a guy with a cool-looking bike. People have no idea I'm paralyzed," he says. In fact, Muhn is a champion cyclist—last October, he won the Cybathlon, the world's first athletic competition for cyborgs.

Five years ago, researchers at the Advanced Platform Technology Center at the VA Medical Center in Cleveland, Ohio, implanted 24 electrodes and 60 feet of wire into Muhn's legs, along with two pulse generators, each about as big as an iPhone, in his belly. When Muhn hits a button to turn the system on, the generators send signals that make his leg muscles contract, pedaling the bike.

It wasn't simple to get all those parts to work together. The key to getting the shocks coordinated with the pedaling was a sensor system installed in the bike to monitor where Muhn's legs were in the rotation. To make his muscles stronger to push him faster, Muhn trained for six months before the competition, cycling every day on a stationary bike at home. He would strap his feet in, hit the button, turn on the TV. Sometimes he'd even nod off. "My legs are strapped in, the program is running and firing my muscles, so I don't have to think about it," he says. He feels like he goes faster if he concentrates on moving his legs, though he admits it's probably his imagination.
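The coordination Muhn describes can be pictured as a simple control loop: the bike's sensor reports where the cranks are in their rotation, and the system stimulates whichever muscle group should be contracting at that point. The sketch below is purely illustrative—the muscle groupings, angle ranges, and function names are assumptions, not the VA system's actual design.

```python
# Illustrative sketch of a crank-angle-driven stimulation pattern.
# Angle ranges and muscle groupings are hypothetical, chosen only to
# show the idea: stimulation must stay in phase with the pedal stroke.

def stimulation_pattern(crank_angle_deg):
    """Return which muscle group to stimulate on each leg for a given
    crank angle (right leg's reference frame; the left leg runs the
    same pattern offset by 180 degrees)."""
    angle = crank_angle_deg % 360
    if angle < 160:               # right downstroke: extensors push the pedal
        return {"right": "quadriceps", "left": "hamstrings"}
    elif angle < 200:             # dead zone near bottom: no useful torque
        return {"right": None, "left": None}
    else:                         # right upstroke: flexors pull the pedal back
        return {"right": "hamstrings", "left": "quadriceps"}

# As the sensor streams angles, a controller would fire the matching
# electrodes on each update:
for angle in (45, 170, 300):
    print(angle, stimulation_pattern(angle))
```

The point of the sensor, in other words, is feedback: without knowing the crank position, the system would have to fire the muscles on a fixed timer and hope the legs kept up.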

The ethics around Muhn's case aren't so complicated—because he can't move his legs on his own, the implanted gadgets help him do something that most able-bodied people can do. But it's not hard to imagine a situation in which an able-bodied person uses technology like this to augment their own capabilities, improving them past the level usually attainable by human beings. If and when that day comes, we as a society will have to confront our fears and ask ourselves: What kind of humans do we really want to be?

"Our technology progresses faster than our ability to grasp it," says Nicholas Agar, a philosophy professor at Victoria University of Wellington in New Zealand. "There's no line in technology that says, 'This is what human beings are capable of and they can't go beyond that.'" It's up to us to find that line.

The days of human enhancement have long since arrived. Originally developed to treat sleep disorders such as narcolepsy, the drug Modafinil has become popular among members of the general public as a way to increase focus and help retain information. Nearly 20 years ago, cybernetics professor Kevin Warwick implanted a radio frequency chip in his arm that allows him to open doors and turn lights on and off. Coalitions of cyborgs and body hackers have been thriving for decades; they recently went mainstream in the transhumanist movement, an intellectual and political group built on the principle that technology should overcome the limits of biology at all costs. You could even argue that caffeine is one of our oldest enhancements. People with illnesses and disabilities have benefited too: Cochlear implants can help deaf children hear; patients with Parkinson's disease can reduce their symptoms with deep brain stimulation.

More outlandish, science-fiction-type technologies could be in our future. Athletes equipped with "bladerunner" limbs, or powerful exoskeletons, or doped with hormones or other performance-enhancing drugs, could break every existing Olympic record. Upgraded soldiers who can resist stress and torture, or obey a supervisor automatically, could change the nature of warfare, stacking the deck against countries without the means to outfit their militaries. Cognitive enhancements—brain implants that give humans perfect memory recall, upload their brains to the internet, or Google facts automatically—could make the SATs comically elementary.

It's hard to tell when, or if, these kinds of technologies will become a reality. If they do emerge, they will likely be developed by the military, or intended to treat medical conditions, says Wendell Wallach, a senior advisor to the Hastings Center, an independent bioethics research institute.

The benefits of these hypothetical devices tend to be exaggerated and oversimplified, Agar says. Technologies like these will likely come with some side effects, so people interested in getting them will have to go through a complex cost-benefit analysis. People may decide it's not worth it if the device is permanent, if it requires surgery, if it takes too long to train or get used to, if it's too costly, or if it grants the user simply an improvement on human abilities instead of a new skill altogether. "There is something about going inside the body that makes people very uncomfortable," says Stephen Clarke, a senior research associate in the Oxford Uehiro Centre for Practical Ethics—it always involves some sort of risk. Amid all this uncertainty, one thing is clear: The decision to get such a device is going to be a lot messier than people think.

Even the testing of such devices outside a medical context will likely be fraught. Today, guidelines in medical ethics state that experiments on humans must be done with the intention of diagnosing, treating, or preventing some medical ailment. Since bionic enhancements would, by definition, be given to able-bodied people rather than treat any ailment, it's unlikely that institutional review boards, which approve experiments before they begin, would give those experiments the go-ahead (some legal scholars, such as Max Mehlman, have argued that the guidelines might not prohibit experiments in enhancement).

The rules of informed consent, one of the backbones of modern medical research, would likely have to be loosened. If a device is not reversible, can a person consent to all the possible side effects that are not yet understood? "It's not only not therapeutic, but you could be actually signing up for a lifetime of torture with no recourse," Wallach says. If a soldier agrees to be a research subject for a military experiment, can he truly consent of his own volition, or is it coercion? "I think if we have a right to paternalism, that's where we do," Wallach says. "If there are risks, and there are risks in anything that's surgical or internal, then society has a right to protect you from yourself."

And there are issues of psychology to consider. Take a prosthetic hand that allows superhuman strength or dexterity. It's one thing if the person's hand has been severed in an accident and they receive the bionic hand as a replacement—that's a clear therapeutic use. But what if they want to sever their hand to get the new device? "I would worry about the psychology of someone who wants part of their body amputated," Clarke says.

Privacy is another risk. Pretty much any device will emit some sort of signal to communicate with itself or with an external device, which leaves it vulnerable to hacking. And when corporations start making these devices, people should be wary of what kind of information they're collecting—say, a brain chip created by a company that helps your memory but can also track your thoughts and market to you accordingly. "When someone stands to profit, you should be suspicious," Agar says.

Then there are the nebulous, less logical arguments, those knee-jerk reactions that people use to justify their discomfort with technology like this. "A lot of people seem to think [enhancement] is extremely problematic. But when it comes to try to spell out why, they often find themselves at a loss for words," Clarke says.

To some philosophers and ethicists, that discomfort is reason enough to stall technology's development. Leon Kass, a professor emeritus in the Committee on Social Thought and the College at the University of Chicago and the chairman of the President's Council on Bioethics from 2001 to 2005, called this the "wisdom of repugnance" on the subject of cloning. In a June 1997 article in The New Republic calling for a ban on human cloning, he wrote: "We are repelled by the prospect of cloning human beings not because of the strangeness or novelty of the undertaking, but because we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear."

But today, most experts agree that's not good enough. "Don't be lazy. If you feel anxious about a new technology, it's not enough to say, 'I feel anxious,'" Agar says. He and many others in the field have tried to parse what some of the most common reactions really mean. Some tap into our sense of fairness—creating superhuman athletes can be called "cheating"; giving only some people in society access to these technologies could deepen gulfs between socioeconomic classes. Sometimes people sneer that researchers are "playing God," claiming that humans ought not to be interfering with the natural order of things (never mind the fact that we've been doing that, in the form of medicine or agricultural domestication, for millennia). "Somehow it's thought that it's particularly wrong or dangerous if we do it to ourselves," Clarke says. Clarke has found that these fears also seem to align with political beliefs—liberals are concerned that bionics will expand existing inequalities, while conservatives are more afraid that the technology will destabilize society altogether. Wallach sums it up succinctly—people who express these fears are afraid of change.

At this stage, an outright ban on the technology itself, or on experimenting with it, would probably do more harm than good. It would inhibit proper scientific and ethical vetting, shielding the technology from the appraising eyes of the general public. It could even lead to a black market for enhancement, which could put people at increased risk for side effects and poorly performed procedures. A few small experiments don't equate to a superhuman takeover, or to a technology that will necessarily reach a much larger population, Clarke points out. "It's a good idea not to forbid these things, to allow them to be experimented with at least some oversight," Wallach says. "I think if you have people who are clearly willing and want to do this, I don't see why not."

A dialogue is necessary. But how and when we will have it, and who will participate, remains to be seen. Agar says it's most valuable to have broad ethical discussions, especially with those who might not agree with us. But the internet, where most of our frank discussions are taking place these days, is riddled with trolls trying to undermine those discussions.

Clarke says it's unlikely that we'll be able to collectively suss out the ethics of a hypothetical enhancement until it's actually here—it's just not how our brains work. "I think we will confront these things when they confront us," Clarke says.

Muhn struggled to think of a nefarious application for the kind of technology he's currently using to win cycling races. "It would definitely be able to make able-bodied people run faster," he says. It's questionable whether that's "cheating," but he doesn't think it's necessarily a bad thing. "This is a perfect exercise program for people who don't want to exercise," he adds. But after mulling it over a bit, he could imagine a less-than-ideal scenario. "I can see where overuse could lead to health problems and muscle damage, so I guess there is a downside. There is always a way to mess up a good thing."

HBO will take you inside the exciting world of bionic technology, exploring the cutting-edge ways in which humans are integrating computers into their bodies. Watch Friday at 7:30PM and 11PM.