
Andrew Fiala On Ethics: Is the world ready for sex robots and mechanical soldiers?

This 2002 file photo shows two Boeing X-45 unmanned combat air vehicles at Edwards Air Force Base. The X-45 was designed to fly autonomously and carry 3,000 pounds of weapons into combat. Some scholars and tech experts have called for a ban on autonomous weapons. Associated Press

The robots are coming. Robots can manufacture consumer goods, milk cows, defuse bombs, fight fires, prepare food, and serve it. Soon we’ll see self-driving cars, robot soldiers, and yes, even sex robots.

Some argue for a ban on certain robots. A group of scholars and tech experts – including Stephen Hawking, Elon Musk, and Noam Chomsky – has called for a ban on autonomous weapons. More recently, computer scientist Kathleen Richardson started a campaign against sex robots.

The impending robot invasion creates a brave new world of ethical problems. One hyperbolic fear is that robots will turn against us, as in the sci-fi scenarios of “Terminator” or “The Matrix.” Robot defenders argue, however, that there is no need to fear a robot apocalypse. Since robots are basically rule-following machines, they will not turn on us unless programmed to do so.

But there is no perfect system of rules. Conflicting rules force hard ethical choices. Robots programmed to save humans may have to kill some to save others. For example, driverless cars programmed to avoid pedestrians may swerve into traffic, putting other humans at risk. Rule-following is no guarantee of safety in our complex world.

Robot enthusiasts will argue, nonetheless, that robots are more rational than we are. Machines can calculate probabilities and maximize outcomes in a way that human decision-makers cannot. A robot soldier might be better than a stressed-out human soldier at following the rules of engagement. Robot cars may minimize the overall harm of high-speed collisions better than frantic, angry or self-interested human drivers.

But the fact that robots do not feel squeamishness, fear or doubt is a concern for those who value the emotional component of ethical decision-making. Feelings of guilt, remorse, fear, joy and hope are important components of the moral life. A robot that feels no joy in saving a child and no guilt at killing one is a kind of moral monster.

Another worry is that robots make it too easy to do dirty work. Robot soldiers would make war easier. Since robots don’t suffer PTSD or leave behind orphans and widows, it would be easier to send them into battle. But if robots can kill without risk, we might take combat less seriously and thus be more permissive about going to war.

Furthermore, the robot revolution poses problems for human agency and identity. This is the danger of sex robots. Do we really want people having sex with machines? Proponents imagine sexbots as a humane substitute for human prostitution. But critics worry that sexbots may increase the demand for sex objects, thus contributing to sexual violence and putting women and children at risk. What would we think of pedophiles who build childlike sex robots?

Robot enthusiasts argue that robots will decrease risk, increase productivity and improve human happiness. Smart machines can kill, drive and flip burgers with more precision and less danger than sleepy and distracted human beings. Unlike human beings, robots don’t get tired, depressed, jealous or drunk. Nor do they complain when they are ignored, mistreated or disrespected.

But creating robots to do dangerous and degrading work is only a deflection from deeply human problems. The scourges of war and sex-trafficking will not be solved by robotic soldiers or cyber prostitutes. We need human solutions to these problems grounded in humane values such as love, respect and self-control.

We also need to remember that good work is intrinsically valuable. Happiness is found in a job well done. In our effort to speed up work and create efficiency through mechanization, we forget that work is what we do and who we are. There are pleasures and virtues to be found in cooking, driving and milking cows. We need productive occupations. When the robots take over, what will we do all day besides fondle our phones and poke at our apps?

Some activities are so important that we ought not have robots do them: killing and sex are obvious examples. A fully human life is more than mechanical tasks and rule-following behavior. Human experience includes emotional, ethical and spiritual depth, as well as concrete embodied relationships. There is no robotic replacement for the labors and loves that make life worth living.

Andrew Fiala is a professor of philosophy and director of The Ethics Center at Fresno State. Contact him: fiala.andrew@gmail.com