Saturday, June 10, 2006
Answers
So I'm a loser. I just saw 2001: A Space Odyssey for the first time a few weeks ago. I've seen it more than a dozen times since. I need to track down the book. I need answers.
I know the nature of the monolith(s) is intended to be unknown. I know that we're not supposed to really understand what happened to Dave at the end. What I want to know is Hal's motivations, and I'm hoping the book will reveal that a bit more. I'm having trouble going from the directive "Lie to the crew about the monolith" to "I better kill everyone."
There's a character in 2010, Dr. Chandra, who is a computer psychologist. The idea that artificial intelligence could get that advanced, to the point where a psychologist is needed or even has enough "psyche" to work with is mind blowing.
I want to know how Hal thinks. I want to know his motivations for what he did. I want to know if he feels emotions: he claimed to be afraid, he claimed to experience enjoyment. He can act without mercy, but maybe there's a greater good he believes he's serving.
The computer psychology fascinates me more than the mystery of the monoliths or what the heck happened at the end.
Comments:
My take on it was always that HAL had received direct instructions that were contrary to his basic programming, which caused him to go nuts.
His killing of the crew was not meant to have discernible motivations because it was an irrational act resulting from schizophrenia induced by the contradictory programming.
But that's only me. The computer psychology degree I got from the University of Outer Mongolia for $49.95 expired years ago.
I can understand that the inadvertently programmed logic error made HAL crazy. Humans can deal with cognitive dissonance. I assume AI can't (though I guess that would depend on the AI...)
The only motivation I can think of is that since he had contradictory instructions, he wanted to get rid of one of the instructions. He couldn't tell himself not to lie, but he could get rid of the people he was supposed to lie to.
But maybe he did just get paranoid like a human can, with traces of schizophrenia, and had no motivation for what he did. Or maybe he had a computer version of epilepsy that sent circuits firing without cause.