dating-robots

Buried in this recent New York Times review of “Her” is this tidbit about casting the voice for the operating system Joaquin Phoenix’s character falls in love with:

The role was initially voiced by the British actor Samantha Morton, who, after the movie was shot, was replaced by Ms. Johansson and whose casting feels inevitable. Her voice isn’t an especially melodious instrument, but it’s a surprisingly expressive one (as Woody Allen has figured out) that slides from squeaky girlishness to a smoky womanliness suggestive of late nights and whiskeys. It’s crucial that each time you hear Ms. Johansson in ‘Her,’ you can’t help but flash on her lush physicality, too, which helps fill in Samantha and give this ghostlike presence a vibrant, palpable form, something that would have been trickier to pull off with a lesser-known performer.

In other words, our perceptions are more real to us than reality is. It’s why we anthropomorphize our gadgets and imbue our mobile devices with human traits. Because we recognize Johansson’s voice, we picture her in our mind whenever we hear the voice of Samantha. Our perceptions help fill in our vision of a character that is simply software.

After I read this New York Times review, I wondered whether an operating system like Samantha could exist.

According to this writer, yes, and there are dangers, although the robot he writes about may be a hoax*:

Researchers at Toshiba’s Akimu Robotic Research Institute were thrilled ten months ago when they successfully programmed Kenji, a third generation humanoid robot, to convincingly emulate certain human emotions. At the time, they even claimed that Kenji was capable of the robot equivalent of love. Now, however, they fear that his programming has taken an extreme turn for the worst.

“Initially, we were thrilled to see a bit of our soul come alive in this so-called ‘machine,’” said Dr. Akito Takahashi, the principal investigator on the project. “This was really the last step for us in one of the fundamentals of the singularity.”

Kenji was part of an experiment involving several robots loaded with custom software designed to let them react emotionally to external stimuli. After some limited environmental conditioning, Kenji first demonstrated love by bonding with a stuffed doll in his enclosure, which he would embrace for hours at a time. He would then make simple, but insistent, inquiries about the doll if it were out of sight. Researchers attributed this behavior to his programmed qualities of devotion and empathy and called the experiment a success.

What they didn’t count on were the effects of several months of self-iteration within the complex machine-learning code which gave Kenji his first tenderness. As of last week, Kenji’s love for the doll, and indeed anybody he sets his ‘eyes’ on, is so intense that Dr. Takahashi and his team now fear to show him to outsiders.

Seriously, though, can you love a machine? The emerging field of Human-Robot Interaction aims to find out. Ultimately it seeks to help robots work well with humans and to shape how humans view their robotic counterparts. The New York Times predicts that “Robosimian will be more than just a tool, but not quite a colleague.”

In the future, more robots will occupy that strange gray zone: doing not only jobs that humans can do but also jobs that require social grace… [R]esearchers have discovered some rather surprising things: a robot’s behavior can have a bigger impact on its relationship with humans than its design; many of the rules that govern human relationships apply equally well to human-robot relations; and people will read emotions and motivations into a robot’s behavior that far exceed the robot’s capabilities. As we employ those lessons to build robots that can be better caretakers, maids and emergency responders, we risk further blurring the (once unnecessary) line between tools and beings.

What else could robots do? Well, a 5-foot-tall, 300-pound robot could patrol our streets to combat crime. Soon there will be one robot for every 5,000 humans. They could be used for space exploration. And, of course, for sex.

An academic paper even envisions robot hookers by 2050:

In 2050, Amsterdam’s red light district will all be about android prostitutes who are clean of sexual transmitted infections (STIs), not smuggled in from Eastern Europe and forced into slavery, the city council will have direct control over android sex workers controlling prices, hours of operations and sexual services…

Having sex with a robot is the future of sex tourism in Amsterdam. Why? Human trafficking, sexual transmitted diseases, beauty and physical perfection, pleasure for sex toys, emotional connection to robots and the importance of sex in Amsterdam are all driving forces. Is the scenario feasible? Virtual sex, changing behaviors and what is science fiction and reality is a blurred paradigm of liminality.

If you can fall in love with a robot and have sex with a robot — and it can reciprocate with strong feelings for you — could androids in, say, 50 to 100 years demand the same rights as humans under the law?

You can be sure that robots will be commonplace, embedded into everyday devices, says Patrick Thibodeau in ComputerWorld:

Imagine that Apple will develop a walking, smiling and talking version of your iPhone. It has arms and legs. Its eye cameras recognize you. It will drive your car (and engage in Bullitt-like races with Google’s driverless car), do your grocery shopping, fix dinner and discuss the day’s news.

Apple will patent every little nuance the robot is capable of. We know this from its patent lawsuits. If the robot has eyebrows, Apple may file a patent claiming rights to “a robotic device that can raise an eyebrow as a method for expressing skepticism.”

But will Apple or a proxy group acting on behalf of the robot industry go further? Much further. Will it argue that these cognitive or social robots deserve rights of their own not unlike the protections extended to pets?

Should there be, minimally, anti-cruelty laws that protect robots from turning up on YouTube videos being beaten up? Imagine if it were your robot?

If not rights equal to humans’, then rights similar to those of animals, says Kate Darling of MIT’s Media Lab in a paper, “Extending Legal Rights to Social Robots.”

The Kantian philosophical argument for preventing cruelty to animals is that our actions towards non-humans reflect our morality — if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions. Granting them protection may encourage us and our children to behave in a way that we generally regard as morally correct, or at least in a way that makes our cohabitation more agreeable or efficient.

Perhaps one day androids and other sentient beings will lobby for equal protection under the law. Maybe they’ll even get it.

*Note: Kenji as a hoax was pointed out to me on Twitter after this story was published. Thanks, @TomWhitwell.

[image via Twentieth Century Fox Film Corporation. Futurama TM and © 2011. All Rights Reserved]