The Three Laws of Robotics

The Three Laws of Robotics are fictional laws developed by Isaac Asimov, the famous science fiction writer.

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

It strikes me that it would be hard to impose these three laws on robots (as some scientists try to do) when humans do not even follow them. We can only build technology in the image of humanity.

Therefore I propose The Three Laws of Humanity:

  1. A human may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A human must obey the orders given to them by human beings, except where such orders would conflict with the First Law.
  3. A human must protect their own existence as long as such protection does not conflict with the First or Second Law.

Can we expect more of our machines than we expect of ourselves?
