Could a robot designed as a sexual companion ever feel something like love for me? And could I, as a human with emotional intelligence, ever feel love for it? These questions challenge our definition of love, but they also challenge our understanding of both human emotion and artificial intelligence. Will intimate relationships between humans and robots ever get beyond just sex?
This is a topic up for debate at the 12th Human Choice and Computers Conference in Manchester, UK, where academics and researchers are gathering this week to discuss humanity's relationship—sexual, romantic, or otherwise—with our AI counterparts.
Sex robots have been a thing for a few years now, increasingly nudging up the realism spectrum as AI, robotics, and manufacturing methods advance. No longer in the realm of science fiction (à la Pris Stratton, the "basic pleasure model" in Blade Runner), your very own lovable robotic companion is just a few mouse clicks away.
"In the long run what we want is to be genuinely loved and desired for who we are as a complete and embodied person"
But in 2016, when artificial intelligence can beat a human champion at Go, a game tougher than chess, you still have to pre-programme even the most advanced of available "sex robots" with a personality. True Companion's RoxxxyGold comes with a base personality so she "likes what you like, dislikes what you dislike"—but for anything resembling emotional variety, extra personalities such as "Frigid Farrah" or "S&M Susan" have to be bought and installed.
So, will these robots ever be able to transcend their soulless form, with the help of artificial intelligence, and provide a more loving form of companionship, something akin to that shown in Spike Jonze's Her and Alex Garland's Ex Machina? Will a sex robot ever love me, and I, in return, love it?
Charles Ess, professor of media studies at the University of Oslo and keynote speaker at the conference, thinks that the answer is an unequivocal negative.
"As far as I can tell, the consensus in the AI and robotic communities is, in a strong sense, no," Ess told Motherboard in an interview. Ess, who has a background in applied ethics and philosophy, tackles the question with an armoury of philosophical definitions of love.
"Because to love you, or anyone else, requires what philosophers like to call 'first person phenomenal consciousness'—basically the capacity to be aware of oneself and to be aware of one having emotions and desires," he continued. "The current state of the art, and what I've seen of projections, all point to no. For example, there's an expert group located in Stanford that's going to issue a report before the end of the month, and their prognosis for 2030 is also 'no'. They think we're just not going to have that kind of consciousness [in robots]."
"Empathy is not about projecting onto, or appropriating someone to use as you want"
Ess predicts that, much like the robots already on the market today, what buyers will be left with are essentially zombies.
"So if you don't have that kind of consciousness, and you don't have that kind of real emotion, a sort of blunt image of what you have is a zombie. It knows how to move, it can imitate emotion very well," he said. "Robots can already do that and are getting better, so they can evoke a sense of feeling on your side, and this is the basic trick. There's a whole subfield in robotics called artificial emotions."
Artificial emotions could work wonders in a therapeutic sense, but are no match for the real thing.
"There are circumstances in which that's perfectly fine. But in the long run what we want is to be genuinely loved and desired for who we are as a complete and embodied person," said Ess. "Robots will be able to fake that, but you and I will know because we bought them, or we rented them, that it's a fake."
For some, this is good news. In a presentation delivered at this year's Ideacity in Toronto, Kathleen Richardson, senior research fellow in the ethics of robotics at De Montfort University and director of the Campaign Against Sex Robots, argued that sex robots interrupt the development of our own empathy, a distinctly human capacity.
"Empathy is about taking into account what another person is thinking and feeling, and responding appropriately to it," she said. "Empathy is not about projecting onto, or appropriating someone to use as you want. You can do that with an object. But I don't want you to do that with a person."
Richardson thinks that sex robots are "part of a new kind of technological elaboration that rests on a very dehumanising practice, where people are treated as sex objects," and that it's not the right direction for humanity to be heading in.
"Everyone thinks [sex robots are] very exciting to begin with, but when you're alone and you have to carry a 40lb robot upstairs, or their programme breaks down and you have to call 'support' and go through automated options to speak to an 'advisor' to find out why your robot is twitching its head repeatedly—the excitement will quickly fade," Richardson told Motherboard. "You will realise there's only other people and we have to find a way to build healthy and loving relationships with each other."
But looking further ahead, if in the future artificial intelligence technology becomes so advanced that even human-like sentience is reproduced, would it be ethically right for humans to use these objects for love? Again, for Ess, the answer is no.
"In robot ethics, there's now a fairly well-established tradition of saying 'no'," he replied. "If we develop machines that have some version of autonomy, and especially the ethics of sense of respect for one another as persons—if we build machines that can come to approximate that, then it would seem we would have to develop rules for respecting that as well. They would have rights."
Ess said that the water gets even murkier when we try to define when that consciousness happens—when an object becomes no longer just an object. It's worth considering: If we get to that point, will these robots even be robots any more?