Saturday, December 19, 2015

AI Consciousness

The Asimov Laws, also known as the Three Laws, were introduced in the I, Robot series by author Isaac Asimov. These are the rules that govern the programming of any AI form. 
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. 
The definition of AI is the area of computer science that deals with giving machines the ability to seem like they have human intelligence, and with the power of a machine to copy intelligent human behavior. Many may say that what sets AI and humans apart is that human beings have souls, but on a much simpler and more immediate level, it is choice. The Three Laws control the programming of an AI; even the simplest machine needs some sort of code and a direction for its "purpose". The ability to choose, whether the consequences result in the harm or safety of another person, is not extended to AI. In I, Robot, we see the Three Laws at work, and every decision is governed by a program.
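The point above — that the Three Laws are a fixed priority ordering, not a choice — can be sketched in code. This is a hypothetical illustration: the `Action` class, its flags, and the `choose` function are my own invented names, not anything from Asimov's stories.

```python
# A minimal sketch of the Three Laws as a strict priority check.
# All names here are hypothetical, invented for illustration.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False     # First Law concern
    disobeys_order: bool = False  # Second Law concern
    harms_robot: bool = False     # Third Law concern

def permitted(action: Action) -> bool:
    """Apply the laws in priority order: the First always dominates."""
    if action.harms_human:        # First Law: never harm a human
        return False
    if action.disobeys_order:     # Second Law: obey, unless Law 1 forbids
        return False
    return True                   # Third Law may never override 1 or 2

def choose(actions: list) -> Action:
    # Among permitted actions, prefer one that also preserves the robot
    # (Third Law); if none exists, self-sacrifice is mandatory.
    allowed = [a for a in actions if permitted(a)]
    safe = [a for a in allowed if not a.harms_robot]
    return (safe or allowed)[0]
```

Given only "hit the person" and "swerve into the ditch", `choose` returns the ditch: the robot's self-preservation yields to the higher laws, exactly because the ordering is hard-coded rather than chosen.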

Take the Google car: you are in the car and it is driving by itself. To your right there is a ditch, and the road has only two lanes. A vehicle is coming the other way, and all of a sudden a person walks in front of you. What will the car do? Even if you told it to hit the person, that would violate its program. It cannot move into the other lane, because that would harm the oncoming driver. So its response would be to drive into the ditch: the program will not save itself when doing so would violate the First and Second Laws, even if it means the life of the passenger. 
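The car's dilemma can be written out as the same kind of rule check. This is a toy sketch under the post's own framing (the ditch harms only the car, not a human); the option names and flags are hypothetical.

```python
# A sketch of the self-driving dilemma, scored against the Three Laws.
# Per the scenario above, the ditch is assumed to harm only the car itself.
options = {
    "hit the pedestrian":   {"harms_human": True,  "harms_self": False},
    "swerve into traffic":  {"harms_human": True,  "harms_self": True},
    "drive into the ditch": {"harms_human": False, "harms_self": True},
}

def decide(options: dict) -> str:
    # First Law dominates: discard anything that harms a human.
    lawful = {name: o for name, o in options.items() if not o["harms_human"]}
    # Among lawful options, the Third Law prefers self-preservation,
    # but here no lawful option spares the car.
    preserving = [n for n, o in lawful.items() if not o["harms_self"]]
    return preserving[0] if preserving else next(iter(lawful))

print(decide(options))  # -> drive into the ditch
```

The program never deliberates; filtering the options in a fixed order always yields the same answer for the same situation.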

In The Most Human Human we are introduced to the Turing test, which is used to determine the "human-ness" of a machine. In one example of the test, a person and a robot have a conversation, but the robot can only respond from its programming:

Person: Trade you a water pistol for the big red block. 
Robot: Sorry, I don’t know the word “trade”.
Person: A “trade” is a free exchange of property. 
Robot: Sorry, I don’t know the word “free”. 
Person: A “free” act is done willingly, without being forced. 
Robot: Sorry, I don’t know the word “act”. 
Person: “Acts” are what people do on purpose, and not by accident. 
Robot: Sorry, I don’t know the word “people”. 
Person: Sorry, I thought you were smarter than you are. 
Robot: Sorry, I don’t know the word “sorry”.
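The exchange above can be mimicked with a few lines of code: a responder with a fixed vocabulary that apologizes for the first word it does not recognize. The word list here is hypothetical, just enough to reproduce the first reply.

```python
# A toy sketch of the dialogue: the robot checks each word against a fixed
# vocabulary and apologizes for the first word it does not know.
# The vocabulary below is hypothetical, chosen only for illustration.
KNOWN = {"you", "a", "for", "the", "big", "red", "block", "water", "pistol"}

def respond(sentence: str) -> str:
    words = sentence.lower().replace(".", "").replace(",", "").split()
    for word in words:
        if word not in KNOWN:
            return 'Sorry, I don\'t know the word "{}".'.format(word)
    return "OK."

print(respond("Trade you a water pistol for the big red block."))
# -> Sorry, I don't know the word "trade".
```

Note the closing irony of the transcript holds here too: "sorry" is not in the robot's own vocabulary, so the robot apologizes for words, including the word it apologizes with.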

AI can only do what it is programmed to do; we cannot give it the "human" experience, which would require it to have a choice to make in any event. In The Most Human Human we see Deep Blue play chess with Garry Kasparov. Deep Blue can predict all the possible moves, but it does not choose which move it will make; its program does. If Deep Blue had not been programmed to play chess, this would have been a very different scenario.
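The spirit of Deep Blue's search can be sketched on a much smaller game than chess. This is not Deep Blue's actual algorithm, just a minimal game-tree search for the simple game of Nim (take 1-3 stones; whoever takes the last stone wins). The point is that the program examines every possible continuation and always returns the same move for the same position: pure program, no choice.

```python
# A minimal game-tree search for Nim, standing in for Deep Blue's
# exhaustive move prediction. Function names are hypothetical.
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones: int) -> bool:
    """True if the player to move can force a win from this position."""
    return any(not wins(stones - take) for take in (1, 2, 3) if take <= stones)

def best_move(stones: int) -> int:
    # Pick the first move that leaves the opponent in a losing position.
    for take in (1, 2, 3):
        if take <= stones and not wins(stones - take):
            return take
    return 1  # losing position: every move is equally bad

print(best_move(5))  # -> 1 (leaves 4 stones, a lost position for the opponent)
```

Run twice on the same position, `best_move` gives the same answer twice; like Deep Blue, it cannot decide to play differently.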
