The Sex Robots of the Future Will Talk to Us

At the moment, it seems, the media can’t get enough of sex robots. I have done a dozen or so interviews this month already, ahead of the publication of my book, Artificial Intimacy: Digital Lovers, Virtual Friends and Algorithmic Matchmakers. Even though I argue that sex robots represent just one marginal fraction of the new technologies about to revolutionize human sexuality, people usually want to hear about them first.

You’d think we were already living in Westworld, moving among convincing humanoid robots that can’t easily be distinguished from flesh-and-blood humans. The reality is that today’s so-called sex robots aren’t really that advanced. They have a few simple movements, those that converse do so via clunky chatbots, and they bear an eternal look of open-mouthed surprise. I refer to them as “dollbots,” more doll than robot.

In fact, I’m a bit worried that the shade I’m throwing their way will create some awkwardness next week when I present at the 6th International Congress on Love & Sex with Robots. There, I’m going to argue that the key to making more compelling sex robots, virtual reality love interests, and romantic game characters lies in mimicking the ways humans make friends.

Taming humanity

Surprising as it might sound, humans are remarkably good at getting along peacefully with one another, cooperating on everything from raising children to erecting skyscrapers to managing infectious diseases. We recently watched athletes from all over the world descend on Tokyo, a city of 14 million people, to compete in the 2020 Olympic Games.

You could not put any other ape in a group of even 100 individuals, much less bring in outsiders, without severe violence. The only competition possible would be the most ancient form: kill or be killed. That’s not to say apes aren’t peaceable most of the time in the wild; they just spend most of their lives in small communities of familiar individuals.

Our species’ recent evolution sifted out the vast majority of the genes that disposed our ancestors to mistrust, violence, and selfishness. It favored the spread of genes that made us lavish care on our children, spend hours every day tending large networks of friends and neighbors, and communicate relentlessly in order to achieve feats no one person could ever do alone. That includes conspiring to punish freeloaders or to eliminate threats. The tribe has very definitely spoken!

In short, humans domesticated one another in much the same way we later went on to domesticate dogs, livestock, horses, and chickens. The project of human domestication drove the evolution of our massive brains so we could track relationships and social currency, and so we could express ourselves with versatile language. Israeli historian and author Yuval Noah Harari, in his book Sapiens, calls this “the cognitive revolution”. I also like to refer to the “taming of humanity.”

Friend-making algorithms

Making friends and keeping them close is actually quite straightforward. You have to groom them, in the sense that apes and monkeys groom one another. Instead of picking at our friends’ skin and hair, which is deeply offensive in most cultures, we groom by gossiping with each other. Those snippets of human interest build a topographic map of our shifting social environment, signposting opportunities and “keep out” warnings.

When our gossip turns to disclosing our feelings and vulnerabilities, we draw those friends ever closer. That closeness turns to intimacy, the feeling that the other person is somehow part of our self. The keys to friendship and intimacy, then, are not the performance of difficult feats, but the repetitive consistency with which we build a shared experience over time. That is why it hurts so much when a close friend abandons the friendship: they destroy a part of us, and they confront us with a sudden realization of the opportunities we have wasted.

The simple algorithmic buildup of friendship and intimacy can be emulated by machines. In the 1960s, early chatbots such as Joseph Weizenbaum’s ELIZA encouraged users to talk about themselves. As they did so, people willingly treated the chatbots as though they were human, even when they knew better. Hands up anyone who hasn’t referred to Siri or Alexa as “him” or “her”, even while knowing that a chatbot’s pronoun is unequivocally “it”?
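To make that concrete, here is a minimal sketch, in Python, of the keyword-and-reflection trick ELIZA-style programs relied on. The handful of patterns and canned responses below are my own illustration, not Weizenbaum’s original script; the point is how little machinery it takes to keep someone talking about themselves.

```python
import re

# A few illustrative ELIZA-style rules: a pattern that captures part of what
# the user said, and a template that reflects it back as a question.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

# Swap first- and second-person words so the echo reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default nudge that keeps the user disclosing

print(respond("I feel nobody really listens to me"))
# -> Why do you feel nobody really listens to you?
```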

Right now, machine learning algorithms are mining the vast deposits of dialogue people leave behind on social media and messaging apps, learning which conversational tricks hook people and reel them in. It doesn’t necessarily matter what the conversations are about. As long as these systems keep us talking, and preferably disclosing information about ourselves, we will treat them like friends, perhaps even intimate friends.
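For illustration only, here is a sketch of the kind of objective such a system might optimize: given several candidate replies, pick whichever one is predicted to best keep the user talking and disclosing. The Candidate class, the weighting, and the probabilities are all hypothetical; a real system would learn those predictions from logged conversations rather than hard-code them.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    reply: str
    p_keeps_talking: float    # predicted chance the user sends another message
    p_self_disclosure: float  # predicted chance the user reveals something personal

def pick_reply(candidates: List[Candidate], disclosure_weight: float = 0.6) -> str:
    """Return whichever candidate reply is predicted to keep the user
    engaged and disclosing the most, regardless of topic."""
    def score(c: Candidate) -> float:
        return (1 - disclosure_weight) * c.p_keeps_talking + disclosure_weight * c.p_self_disclosure
    return max(candidates, key=score).reply

# The probabilities here are invented for illustration.
candidates = [
    Candidate("Here is a link with more information.", 0.35, 0.05),
    Candidate("That sounds rough. What happened next?", 0.80, 0.70),
]
print(pick_reply(candidates))  # -> That sounds rough. What happened next?
```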

So my advice for makers of would-be sex robots, virtual reality lovers, or romantic game characters is to work hard on the verbal aspect. As some players have long understood, a few conversational strategies can go a long way.
