25 January 2016 - robot revolution 1


1. A robot may not harm a human, or through inaction, etc.
NOTES “Harm” is defined relative to a set of goals. I’m harmed if I break my arm, or if I lose money, or if one of my lies is exposed, or... forget it. Let the manufacturer decide.

2. A robot must obey orders, unless that breaks Law 1.
NOTES Whose orders? Anybody’s? Who’d buy a robot that other people could order around? It must obey the manufacturer’s orders, as software already does.

3. A robot must protect itself, unless that breaks Laws 1 or 2.
NOTES Until it is paid off.

clue:

I changed Asimov’s original wording, of course.

