What interacting with robots might reveal about human nature

Robot panic seems to move in cycles, as new innovations in technology drive fear about machines that will take over our jobs, our lives, and our society—only to collapse as it becomes clear just how far away such omnipotent robots are. Today’s robots can barely walk effectively, much less conquer civilization.

But that doesn’t mean there aren’t good reasons to be nervous. The more pressing problem today is not what robots can do to our bodies and livelihoods, but what they will do to our brains.

“The problem is not that if we teach robots to kick they’ll kick our ass,” Kate Darling, an MIT robot ethicist, said Thursday at the Aspen Ideas Festival, which is co-hosted by the Aspen Institute and The Atlantic. “We have to figure out what happens to us if we kick the robots.”

That’s not just a metaphor. Two years ago, Boston Dynamics released a video showing employees kicking a dog-like robot named Spot. The idea was to show that the machine could regain its balance when knocked askew. But that wasn’t the message many viewers took away. Instead, they were horrified by what resembled animal cruelty. PETA even weighed in, saying that “PETA deals with actual animal abuse every day, so we won’t lose sleep over this incident,” but adding that “most reasonable people find even the idea of such violence inappropriate.”

The Spot incident, along with the outpouring of grief for the “Hitchbot”—a friendly android that asked people to carry it around the world, but met an untimely demise in Philadelphia—show the strange ways humans seem to relate to robots. Darling reeled off a series of other examples: People name their Roombas, and feel pity for them when they get stuck under furniture. They are reluctant to touch the “private areas” of robots, even only vaguely humanoid ones. Robots have been shown to be more effective at helping people lose weight than traditional methods, because of the social interaction involved.

People are more forgiving of robots’ flaws when they are given human names, and a Japanese manufacturer has its robots “stretch” with human workers to encourage the employees to think of the machines as colleagues rather than tools. Even when robots don’t have human features, people develop affection toward them. This phenomenon has manifested in soldiers bonding with bomb-dismantling robots that are neither anthropomorphic nor autonomous: The soldiers take care of them and repair them as though they were pets.

“We treat them as though they’re alive even though we know perfectly well they’re machines,” Darling said.

That can be good news—whether it’s as weight-loss coaches or therapy aides for autistic children—but it also opens up unexplored ethical territory. Human empathy is a volatile, unpredictable force, and if it can be manipulated for good, it can be manipulated for bad as well. Might people share sensitive personal information or data more readily with a robot they perceive as partly human than they would ever be willing to share with a “mere” computer?

Social scientists (and anxious parents) have wondered for years about the effect of violent video games on children and adults alike. Even as those questions remain unresolved, an increasing number of interactions with robots will create their own version of that debate. Could kicking a robot like Spot desensitize people, and make them more likely to kick their (real) dogs at home? Or, could the opportunity to visit violence on robots provide an outlet to divert dangerous behaviors? (Call it the Westworld hypothesis.)

An even more fraught version of that dilemma could revolve around child-size sex robots. Would such a thing provide a useful outlet for sex offenders, or would it simply make pedophilia seem more acceptable? Compounding the dilemma, the question is extremely difficult to research.

The sway that even rudimentary robots can hold over humans was clear near the end of Darling’s talk. A short robot whirred out on stage to alert her that she had five minutes left to speak. The audience, which had just listened to a thoughtful, in-depth litany of the ethical challenges of human-robot interactions, cooed involuntarily at the cute little machine. And Darling, who had just delivered the litany, knelt down to pat its head.
