Thirty years ago, few people envisioned just how completely computers would be integrated into our everyday lives; today, they're everywhere. In Robot Ethics: The Ethical and Social Implications of Robotics, Patrick Lin (a science ethicist), Keith Abney (a philosopher of science), and George Bekey (a computer scientist) argue that the same will be true of robots. Today, they are technological oddities; tomorrow, they'll be ubiquitous and indispensable. That's why, they write, we need "the emerging field of robot ethics."
In their introduction to the book, a collection of essays on robot ethics by philosophers, lawyers, and scientists, Lin, Abney, and Bekey point out that people have been thinking about the ethics of robotics for millennia. Isaac Asimov's three laws of robotics are only the most recent entry in a long tradition. "Homer," the editors write, "described in his Iliad the intelligent robots or 'golden servants' created by Hephaestus, the ancient Greek god of technology... Leonardo da Vinci conceived of a mechanical knight that would be called a robot today." But the need for a serious inquiry into robot ethics is now greater than ever before, because robots are now advanced enough to participate, on their own, in the ethical world:
[I]n August 2010, the U.S. military lost control of a helicopter drone during a test flight for more than thirty minutes and twenty-three miles, as it veered toward Washington, D.C., violating airspace restrictions meant to protect the White House.... In October 2007, a semiautonomous robotic cannon deployed by the South African Army malfunctioned, killing nine "friendly" soldiers and wounding fourteen others....
Already, robots are taking care of our elderly and children.... Some soldiers have emotionally bonded with the bomb-disposing PackBots that have saved their lives, sobbing when the robot meets its end.
Already, fascinating moral questions are emerging. If a robot malfunctions and harms someone, who is responsible -- the robot's owner, its manufacturer, or the robot itself? Under what circumstances can robots be put in positions of authority, with human beings required to obey them? Is it ethically wrong for robots to prey upon our emotional sensitivities -- should they be required to remind us, explicitly or implicitly, that they are only machines? How safe do robots need to be before they're deployed in society at large? Should cyborgs -- human beings with robot parts -- have a special legal status if their parts malfunction and hurt someone? If a police robot uses its sensors to perform a surveillance operation, does that constitute a search? (And can the robot decide if there is probable cause?)
Some of these questions are speculative; others are uncomfortably concrete. Take this example involving (what else?) robot sex, from an essay by David Levy:
Upmarket sex dolls were introduced to the Korean public at the Sexpo exposition in Seoul in August 2005, and were immediately seen as a possible antidote to Korea's Special Law on Prostitution that had been placed on the statute books the previous year. Before long, hotels in Korea were hiring out "doll experience rooms" for around 25,000 won per hour ($25).... This initiative quickly became so successful at plugging the gap created by the antiprostitution law that, before long, establishments were opening up that were dedicated solely to the use of sex dolls... These hotels assumed, quite reasonably, that there was no question of them running foul of the law, since their dolls were not human. But the Korean police were not so sure. The news website Chosun.com... reported, in October 2006, that the police in Gyeonggi Province were "looking into whether these businesses violate the law . . . Since the sex acts are occurring with a doll and not a human being, it is unclear whether the Special Law on Prostitution applies."
It seems inevitable, Levy writes, that more advanced "sexbots" will push this issue even more to the fore, forcing lawmakers to figure out just which aspects of prostitution they want to outlaw.
Levy's sexbot example is emblematic of a theme running through this collection of essays: The ethical problems posed by robots aren't just about the robots. They're also about old, familiar human behaviors that we must reconsider once robots are introduced. How will spouses feel, Levy asks, about the use of sexbots? Some will see it as adultery, others as intrinsically meaningless. The answer, Levy argues, really has nothing to do with the robots themselves. "It will depend very much," he writes, "on the sexual ethics of the relationship itself when robots do not enter the picture."
Joshua Rothman is a graduate student and Teaching Fellow in the Harvard English department, and an Instructor in Public Policy at the Harvard Kennedy School of Government. He teaches novels and political writing.