Asimov's laws in perspective
C.G. Masi., Contributing Writer -- Packaging Digest, 2/25/2013 3:04:43 PM
In the early 1940s, when science-fiction master Isaac Asimov laid out his three (and later four) "Laws of Robotics," robots themselves were entirely fictitious. At that time, automation technology had barely progressed beyond the Jacquard loom. Seven decades later, things that can be legitimately characterized as "robots" are all around us.
Periodically I get treated to the spectacle of my wife screaming into her cellphone at some slug-brained automated system with whom she's trying to negotiate an online transfer of funds to pay some bill. She screams because the obstinacy of even the politest automated clerk inspires anger and frustration.
Notice the words I use to describe what's going on: "slug-brained," "negotiate," "obstinate," "polite," "angry" and "frustrated." These are all words that we use to describe behavior. Robots now exist, and, moreover, we've found that they "behave."
Sometimes they behave well, and sometimes they behave badly. Just like people.
Since robots can now be said to "behave," it's now appropriate to take a critical look at Asimov's rules that define appropriate robot behavior.
As laid out in his short story "Runaround," Asimov's three rules—which he called "laws"—are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Sometime later, the writer added a fourth rule, which he called the "Zeroth Law": A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
Asimov made a lot of money over his career imagining situations in which adherence to the three laws led robots into ethical dilemmas. Since he imagined that these robots would not be programmed with a way to resolve the dilemma, their behavior would break down in some way.
He imagined that would lead to the type of behavior twenty-first-century humans describe as "going postal."
Okay, so what are we, from the vantage point of people surrounded by real robots running around exhibiting real robot behavior, to think of Asimov's four laws?
Well, first of all, we should recognize that the laws aren't laws at all. They're rules. Rules, unlike laws, are made to be broken.
Then, we should notice that they aren't rules for robots, they're rules for robot programmers.
Finally, we should recognize that the four rules do a pretty good job of defining acceptable robot behavior. We'd feel about an engineer who programmed an automated system to violate any of Asimov's rules much the way we'd feel about someone flying an airplane into a skyscraper. We'd say they were pretty impolite! We'd probably get mad enough about it to hunt them down and lock them away in an old Navy base in Cuba.
All the illustrations are about people making decisions, not robots making decisions. Drones are doing what they are directed to do by a remote pilot (a human). The attack on the World Trade Center was under human control, not a robot's. Looks like we are back to people doing the killing, not the weapon?
Mel Conville - 2013-02-28 00:26:41 EST
Having just viewed a video program about our military drone program employed in Afghanistan, and the predictions for their use in future conflicts, I would suggest caution in the definition of robotics, for it might be argued that we've already passed the border that Asimov outlined as the Rules of Robotics. Robots don't just come in a humanoid bipedal form. We already accept them in familiar forms, such as the disc-shaped machines that sweep our floors. We accept them as assembly-line labor and welcome them in cutting-edge surgical theaters. They all have human programmers.
What do the humans want? The robots will obey.
Bill Maxwell - 2013-02-27 15:17:51 EST
I find myself wishing you would come up with another analogy. Not sure your frustrations warrant bringing in comparisons to that horrible day. Your comparison trivializes it, and it is obvious that you were far away from it when it happened. jimmy NYC
jimmy tanico - 2013-02-27 14:53:46 EST
Ok, I'm a BIG fan of Asimov, so I had to read your piece. You succinctly stated the meaning of the 'rules' and to whom they really apply. While I don't see robots murdering thousands of people, if one did, we would need to hold their creator(s) accountable, although the prisons in Kansas are at least as tough as the one in Cuba.
Russell Vernon - 2013-02-27 14:23:33 EST