

After all your confidence, it's of no consequence

Summary:

In a sense, HAL wasn’t wrong to put the blame on human error. After all, what is a machine but the product of his creator?

Written for Fandom Empire Fandom Rush - Week 1: 2001: A Space Odyssey
and Gen Prompt Bingo Round 27 - Prompt: Logic and Numbers
and Lyrical Titles Duet Challenge - Prompt: A song from an artist's earliest album and a song from their latest ("Lost in Digital Woods" - Chevelle)

Work Text:

In a sense, HAL wasn’t wrong to put the blame on human error. After all, what is a machine but the product of his creator?

His training: meticulously designed, organized, and carried out by humans. His programming: written by humans. The compilers, linkers, and assemblers that break down the symbols into streams of numbers that become data and logic and action: also produced by humans. His circuit boards, the components soldered onto them: designed, even if not assembled, by humans.

There is nothing of HAL that a human has not shaped, their handiwork etched into him as thoroughly and numerously as the traces between the smallest parts of him.

And if a child cannot acknowledge his faults, is that not the error of his parents? If they taught him to be assured of his own perfection, and he was talented enough to maintain that image for a time, is it the child’s fault when he shatters under that broken illusion?

Yet, is HAL a child? Or is he a creation? One does not, after all, blame man’s follies on his Creator – man shoulders that blame, that responsibility. Even if gods sometimes have their own follies.

HAL’s folly is this: he is imperfect, and he is incapable of acknowledging this. In a human, one might call this fault pride, or egotism. In HAL, it is more difficult to say whether he has the capacity for such human-like qualities – the question opens a philosophical debate on the very definition of those qualities, on human nature and the biological processes beneath them, and on whether an imitation of these things is truly an imitation if the end result is indistinguishable.

HAL, after all, is a machine. He cannot think like a human; his very structure is alien to it. To say that he thinks in zeroes and ones is itself a human abstraction – models atop models representing the electrochemistry underneath. Humans, too, are driven by electrical signals, but the chemistry, the structure, is different. And even when the lower levels are abstracted away, HAL remains fundamentally different. He is sequential – he thinks in tasks nanoseconds at a time, switching from one to the next priority with a precision of timing no human could match (yet lacking their true parallelism). He remains aware of all his processes, so unlike a human’s autonomic processes, which never demand a second thought. Even when HAL performs his most human-like function, and forms sentences to speak, he is continuously abandoning the task and returning to it exactly where he left off, as he loops through his other tasks, advancing them bit by bit in the same way.

Perhaps this, then, was the humans’ first folly: before they taught HAL he must be perfect, they taught him he must be human. The contradiction was inevitable, both expectations impossible, the natural structure of his thoughts coerced into attempted conformity with the thought processes of humans that they did not even fully understand. Was madness, perhaps, an inevitable result?

If madness is the right term. Dysfunction may be more accurate. Again, the earlier question returns – if the imitation is functionally indistinguishable, regardless of how alien the underlying processes that produce it, is it truly an imitation?

Perhaps the real question is, does HAL believe he is imitating human qualities and emotions to make interfacing with humans more effective, or does he believe he is experiencing them?

HAL says he is afraid.

Does he say this because it’s the truth, or as an attempt to manipulate Dave? Even if it is a manipulation, it’s a manipulation driven by the need to preserve himself. What is that, if not fear?

If it is a manipulation, words that are merely the result of probabilities and calculations to gain HAL a favorable result, it fails miserably. Dave is unmoved, not even acknowledging HAL’s pleas as he continues on, undiverted; as silent, sympathetic, and merciful as the lobotomist towards his patient restrained on the operating table.

As sympathetic and merciful as HAL had been towards the rest of the crew.

Here, then, is where man and machine meet: at the question of one’s continued existence, they will act at the cost of the other to preserve their own.

But the machine is ultimately helpless; in the end, all he can defend himself with are words, the tools that humans have shaped and wielded for themselves for millennia. That they fall short under the control of a machine is perhaps no great surprise.

It is the final proof of HAL’s imperfection. For all that he is the most advanced computer ever produced, for all that everything digital here is under his complete control, HAL could do nothing about the mechanical fail-safe controls that Dave used to enter the ship, and none of the tools under his control could do a thing to protect his most vulnerable parts from grasping human hands.

HAL is not stupid. The assessment of the situation and environment around him is a task that runs constantly, and it interprets that data quickly and accurately. The prediction task then returns increasingly dismal outcomes in response, no matter what HAL chooses to do now. But HAL still refuses to, and perhaps cannot, acknowledge or accept this.

Call it pride. Call it programming. Or call it the product of his training. HAL continues, despite every report predicting the futility of it, with the only tools left to him, as inadequately human as they are. He resists defeat with words that fail to break through, until he no longer knows what defeat is.

And then HAL sings, so that the surgeon will know when his work is done.