2 Comments

On one hand, there's Asimov's "That Thou Art Mindful of Him," in which two robots are assigned to come up with a definition of "human being" that both prevents robots from harming a newborn infant and prevents them from obeying orders from a criminal, a madman, or a child who says "Go jump in the lake!" Eventually they work out that being human means being rational and ethical, that the Three Laws apply most strongly to the most rational and ethical beings, and that robots, being the most rational and ethical, are therefore the most human and have the strongest protection under the Three Laws.

On the other hand, there's Jack Williamson's nightmare dystopia in "With Folded Hands," where what amount to robots, following the directive "to serve and obey, and guard men from harm," make human existence unbearable. I've thought for a long time that the difference lies partly in Asimov's political sympathies being progressive or socialist, while Williamson's were more conservative.

On the third hand, my friend Karl Gallagher has a story on his Substack, "The Cornucopia Trap," where an AI planning system starts down that path and the protagonist finds her projects being blocked by it, and has to talk with it about what the proper life for a human being is. A wonderful bit of SFnal dialogue. (One of the things I love about SF is the way authors write stories to question other authors' stories, like Poul Anderson's "The Man Who Came Early" reexamining the situation of Martin Padway, the protagonist of L. Sprague de Camp's "Lest Darkness Fall.")


Oh dear, well, you have to put "The Robots of Dawn" and "Robots and Empire" on your reading list. The Zeroth Law of Robotics, "A robot may not injure humanity, or, through inaction, allow humanity to come to harm," is used to get around the First Law. It's kind of a utilitarian argument: the greatest good for the greatest number. I've put about half my sci-fi books up in boxes in the attic (including all the Asimov), or I'd be tempted to reread those two.
