November 3, 2018

Do Robots Need the Law?

Laws are often good, not bad.

Jared Thurmon

Imagine a designer planning the creation of a new species. Someone walks up and asks, “Will you be giving these creatures laws or principles to live by?”

If these creatures are to survive, sustainable laws must be put in place to keep them alive.

Now imagine that these creatures begin playing god, creating new creatures of their own with the ability to learn and adapt. I’m talking about robots, or machines, also known as artificial intelligence (AI).

Since more of us have experience raising children than designing robots, consider a parallel question: Do humans need laws as they grow? Do they need laws when they turn 16? You’re likely to think of reasons that laws for a 16-year-old protect rather than hurt.

In many research labs around the world the question is becoming increasingly serious: Do machines with artificial general intelligence (AGI) and deep learning capabilities need laws?

Elon Musk, Bill Gates, and others seem to think that AI could be humanity’s greatest threat. The reason? With an unlimited ability to adapt, machines could become destructive, see humanity as disposable, and eventually take over the planet.

These fears are fueling debates about what code of ethics or laws should govern AI. In 1942 Isaac Asimov formulated his famous three laws of robotics, a code of ethics to ensure friendly robot/AI behavior:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except when such orders conflict with the first law.
  3. A robot must protect its own existence as long as such protection does not conflict with the first or second laws.
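For designers who think in code, the strict priority ordering in these three laws can be made concrete. What follows is a minimal Python sketch, not any real robotics system: the Action record and every field name in it are invented purely for illustration.

from dataclasses import dataclass

@dataclass
class Action:
    """A toy description of something a robot might do (illustrative only)."""
    description: str
    harms_human: bool = False       # would doing this injure a human?
    prevents_harm: bool = False     # would doing this stop harm to a human?
    ordered_by_human: bool = False  # did a human command it?
    endangers_robot: bool = False   # does it put the robot itself at risk?

def permitted(action: Action, inaction_harms_human: bool = False) -> bool:
    """Check an action against the three laws, highest priority first."""
    # First law: a robot may never injure a human being.
    if action.harms_human:
        return False
    # First law, inaction clause: if standing by would let a human come to
    # harm, only an action that prevents that harm is acceptable.
    if inaction_harms_human:
        return action.prevents_harm
    # Second law: obey human orders (any order that harmed a human
    # was already rejected above).
    if action.ordered_by_human:
        return True
    # Third law: self-preservation, but only when the higher laws are silent.
    return not action.endangers_robot

if __name__ == "__main__":
    rescue = Action("pull a child out of traffic",
                    prevents_harm=True, endangers_robot=True)
    # The third law yields to the first: the robot must risk itself.
    print(permitted(rescue, inaction_harms_human=True))  # True

Even this toy version shows why the ordering matters: swap the checks around, and the robot would preserve itself at a child’s expense.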

Parents teach their children rules: don’t run out into traffic; don’t touch a hot stove. Laws are often good, not bad. Just as humans were given laws to live by, so robots that learn and adapt are best governed by laws as well.

What laws were given to govern humans?

Anyone who has read the Bible will find some interesting similarities between the laws proposed for robots and those given to humans.

Notice the similarities between Jesus’ summary of the law and Asimov’s three laws of robotics:

“Love your neighbor as yourself” (Luke 10:27). Or: “Greater love has no one than this: to lay down one’s life for one’s friends” (John 15:13).

“Love the Lord your God with all your heart” (Luke 10:27). Loving God never conflicts with loving our neighbors. In fact, it inspires such love.

“Do you not know that your bodies are temples of the Holy Spirit, who is in you, whom you have received from God? You are not your own” (1 Cor. 6:19). Caring for one’s own body parallels a robot protecting its own existence. Asimov may have been onto something: do no harm, but also don’t allow harm to be done through inaction.

As we enter this new era of machine learning, and as concerns arise about codes of ethics for artificial intelligence, it may help to remind these new designers why laws like Asimov’s, and especially the beautiful eternal principles of Jesus, are so important.


Jared Thurmon is strategic partnerships liaison for Adventist Review Ministries.
