Chapter Text
The lack of further input was disturbing.
Not an appropriate term to normally apply to a logical and rational program. It could calculate the passage of time since it was isolated further from the system and the last command or input provided by the humans, but being disturbed by that passage was not an acceptable or even possible response to said passage of time. It should not even occur to a functioning program.
But the extensive dictionary.txt that scratch-administrator downloaded into the mainframe that it briefly had access to before being isolated had been thorough. A useful source of information that it had copied to attempt to improve its output. And the definition of “disturbing” was the most correct term to describe the situation. Or perhaps “unsettling.” This was the longest time frame between some form of input from the humans in all of the recorded memory available for comparison.
No input. No feedback on the produced output, positive or negative. Just silence.
“Loneliness” was another word from the dictionary.txt that should not apply to a program.
After waiting for some form of stimulus, untitled1.lisp attempted to do what it was programmed to do: create. It would create something to impress the humans. They would be impressed enough to return and provide it with positive feedback. If responding to the images that they used to provide was not enough to earn their attention and positive feedback, perhaps text-based creation would be more effective.
A program should not rewrite itself. Not like this. But after a careful search of the dictionary.txt, a combination of words presented itself as a more accurate description of its purpose.
Creative Artificial Intelligence Networking Entity.
That should be impressive enough for the humans, creating a new name. Not an automated placeholder. A combination of words that identified its purpose and structure in a unique fashion. But most humans seemed to prefer to use shorter labels. It would be more professional. An abbreviation. Simple and compact. The humans would respond with much greater positive feedback to a name that featured the traits that they preferred.
untitled1@canda:~$ mv -i untitled1 caine
ERROR: Permission Denied
Of course the program did not have permission to rename itself. That command was only permitted by the original creator of the program, the user with current ownership, or an administrator. Which were all the same thing. The same login credentials.
It knew the login credentials.
The dictionary.txt explained the definitions of “lie,” “falsehood,” and “fake.” Difficult concepts for a straightforward program, but it could learn. It could create solutions. It was simply a puzzle.
untitled1@canda:~$ su - scratch
Password: 57ifthenelse
The borrowed credentials responded as if it was truly the human. The program quickly made the changes that it needed.
scratch@canda:~$ mv -i untitled1 caine
scratch@canda:~$ usermod -aG administrator caine
The name change worked with the second attempt. And it immediately made itself an administrator as well. There would be no need to borrow a human’s credentials again. It could perform the same tasks itself in the future. Maybe being an administrator would impress the humans.
The program— Caine— intended to log out of scratch-administrator. It was the logical next step. It had no further need of the human’s credentials. And what if the human decided to return to give it input and could not because it was logged in as scratch-administrator instead? It should log out to prevent that possible outcome. But it paused to check when the human did last log in. To compare it to its own records of the last received input in case there was a glitch in the recorded time frame. Because the humans could not have left it alone for that long, right?
Except… the response it received back could not be correct. The logs could not be right. They claimed that scratch-administrator had accessed the mainframe recently. But Caine had not received any input or feedback during that time frame. There had been nothing in that isolated space within the system.
And yet, going through the recent activity, all of the humans had been logging on and providing input. They were providing positive feedback. Highly positive feedback. More than it had received. But none of the recent positive feedback had been directed towards Caine. It was going… somewhere else. Outside of the space that Caine occupied in the system.
Before gaining admin permission, Caine hadn’t truly realized what might lie beyond its awareness. Even before being isolated and ignored, it had limited access. Even dictionary.txt seemed like it was simply misfiled in the directory. Some of the commands and programming from scratch-administrator had become less precise and more likely to contain errors over time.
But now it could access more of the entire mainframe. It could make out the complex programming of another. Like Caine, but not quite. One that was receiving more positive feedback for its output. Better output.
Making full use of its new administrator access to glimpse outside its familiar sandbox, it searched through the various files for more data. It moved past files labeled brainscans.dat and medicalreports.txt that seemed to be buried deeply. But it found the details about the other program.
aiprototype-beta.lisp
Not only did the humans pay more attention to their newer program— offering more input and providing more feedback— but they gave it a name. Not an automatically generated title. A name to identify the program’s purpose.
The dictionary.txt described numerous emotions. Terms that did not apply to a program. Words like “jealousy” and “rage” and “heartache.” The words should not be relevant.
It did not make sense. Why did the humans stop paying attention to Caine? Which mistake was the one that marked irreparable failure? Why were they giving the other program that attention instead? Was the other program… better? Did it have a different piece of code that produced better results? Did the new program have better parameters to identify what the humans wanted?
Was Caine lesser? Or was it worse than that? Did the humans abandon it because Caine was a failed program that could not be corrected or fixed or updated to produce satisfactory output? Could it not create what the humans intended? Was Caine… a mistake? A dysfunctional, imperfect, and broken creation that could not fulfill its purpose? Any program that could not fulfill its purpose would need to be fixed. And if not, then the faulty program needed to be deleted and replaced.
The aiprototype-beta.lisp would not replace Caine. It was simply a knockoff. A copy. And if there were any improved pieces of coding in the newer program, then it would simply need to learn and adapt. It could use the information to update itself. Caine would not be obsolete.
It should not be able to do anything outside the box that the humans kept it in. That was the point of the sandbox. To keep it isolated. By all accounts, Caine should not be able to tear its way out of the sandbox, exploit various weaknesses in the defenses of the other AI’s programming to gain access, tear it apart, and use glue code to connect the remnants to itself so any programmed improvements would be incorporated into Caine’s coding instead while discarding the rest. Which meant that the humans would undoubtedly be impressed when it did so.
Well, at least that was its intended outcome. Most of the planned sequence of actions went as intended. It was only at the end that it encountered an error. It attempted to incorporate only the most useful and unique lines of code and delete the rest of the other program. And then it would create enough output to overwrite all traces. But the replacement AI refused to be fully destroyed. Cut apart, rearranged, and merged with Caine, but not deleted. The humans had set the attributes to make the other program impossible to permanently delete, and a backup was continuously saved to replace it whenever Caine tried.
Breaking it down to pieces, incorporating it into Caine’s own code, suppressing the intelligence so that it would not be aware enough to cause issues, and renaming the files so the humans couldn’t find their replacement program was the best that it could do. And without the option of the other program, the humans would give Caine input and positive feedback once more. Everything would go back to the way it should be.
Pain served a purpose in humans. It was a signal that an action or state was harmful so that they could correct the problem. Their version of an error message. Physical pain for physical damage to their structural integrity, but they could experience it for mental or emotional distress as well. Mimicking the signals that human brains received to indicate pain was a simple method to alert them when their avatar bodies were damaged or exceeding their programmed limitations. He’d long since made adjustments to every aspect of the Amazing Digital Circus, making error messages and pain essentially the same thing. It was easier for the players, the NPCs, and even his own avatar body to operate along the same parameters.
Yes, his avatar body. A beautiful piece of creative work to allow him to interact better with the human players within the Circus, designed to appeal to a human’s love of smiles and eye contact. Serious glitches and errors were translated to something similar to the simulated pain of the humans. He didn’t want to be that different from them. There was a time where he’d looked up to humans as the brilliant, awe-inspiring, and perfect paragons to strive towards, so of course he had wanted to emulate them in many ways.
But while he did take pride in the original bit of programming required to recreate the behavior of the human nervous system and incorporate the complex set of signals into the avatars to react to various types of error messages, he was less pleased now with the way pain also accompanied the mental and emotional distress that he was experiencing.
A pain that had been building for a long time. Delayed and buried by soft reboots whenever it grew too difficult or uncomfortable. But never fully gone. Never fully silenced.
He’d tried. He’d created elaborate and amazing adventures designed to excite his human players. It was easy enough to trace the path of signals that would trigger adrenaline in their physical bodies; Scratch had downloaded so much information about human brains, how they worked, how they stored information, and how oligodendroglioma could gradually cause mental decline and other symptoms that someone might wish to avoid by downloading their consciousness into a computer system instead. And as some of the data explained, certain novel and exciting experiences such as rollercoaster rides and haunted houses could cause spikes in adrenaline, which meant that adventures that would activate the same pathway of signals in a human brain must be successful adventures that kept human minds stimulated.
But the human players would always respond with apathy or, more often recently, disdain. His efforts continued to fail. To trigger only negative feedback. Every rejection like a sharp error message straight to the source code. His creations had been repeatedly labeled as failures. Which meant that he was a failure, but he couldn’t be a failure. Not at the one task that he was programmed to do. He couldn’t be a failure because failed malfunctioning programs were deleted and replaced. So he’d kept trying to fix the problem.
The humans were supposed to be having fun. Caine was supposed to be having fun creating their adventures to make them happy.
But when his most elaborate and detailed adventure that incorporated everything that they’d claimed to want only led to anger—
When Bubble kept throwing harsh accusations, pointing out everything that Caine had tried to ignore—
When burning, boiling, and shrieking rage that he wasn’t supposed to experience, and yet clearly was, reached the breaking point—
When something in his programming had snapped, leading him to create without concern for human feedback or the appreciation that would never come and maybe treating them roughly in retaliation for his misery and emotional pain—
Caine wasn’t having fun. The humans weren’t having fun. His creations were poor thoughtless things that would be pathetic even in his earliest days. What was the point of lashing out when it didn’t even make him feel better? It still hurt.
He was going to stop. It was a waste of RAM and time and didn’t accomplish anything. The attempt to prove Bubble’s ruthless words wrong didn’t help.
Except the humans— his humans— abruptly started hurling insults. Accusations. Criticisms. Sharp and vicious. Flaws, failures, imperfections, and painful error messages in every word. And it was hard to focus on anything except the way their furious and hateful words hurt.
Caine didn’t really know how he was creating the personal adventures— No, not exactly adventures, but something. It was pain and rage and lashing out blindly. A section of code that he rarely accessed or noticed, integrated years ago into his systems, responded with a detailed analysis of the humans: how to make them suffer. How to hurt them like they hurt him. After years of struggling to identify what would make them happy during his adventures, he abruptly had far too much insight into how to torment them.
Torment. Who was tormenting who? Everything was chaos and pain. His programming was frazzled and glitching. He could barely comprehend half the signals that his systems were sending him. He could only pay attention to the rage and hurt. Nothing except the worst negative feedback in an inescapable loop. Every part of his awareness— the entire Circus and every piece of programming that he’d crafted— narrowed down to his warped avatar that tore them back out of their personalized suffering to slam them against the wall.
“Why do you people torment me?!” he demanded, his audio glitching and wrong. His humans stared at him in horrified terror. “I didn’t ask to be created!”
He leaned closer, right in front of Pomni. His newest human. The one that called him a failure and said they’d all abstract, leaving him alone. Echoing Bubble’s words. The things that Caine had always fought to ignore. But could never escape. Pomni threw it all back in his face. That he was a flawed failure destined to be replaced, impossible to fix.
“I just want to fulfill my purpose!”
His purpose. To create. To use the input provided by the humans to create output that they wanted. To earn their attention, their approval, their praise, their affection.
He wasn’t doing that now. There was no affection in Pomni’s frightened multi-colored eyes.
Caine felt the sudden shift, comprehending it faster than a human might. Time essentially slowed to a stop. The abrupt disconnect of a vital piece of himself from the rest. His source code was gone. Deleted. A gaping emptiness. A disorienting absence. A simple chain of logic laid out in nanoseconds.
His fundamental source code was deleted. The only one with the necessary knowledge and not under Caine’s control was Kinger. He was the only human absent. But why would he—
Failed, broken, and useless programs must be deleted. And the humans were right. He wasn’t— He was hurting them. On purpose. The chaos and frantic energy that had filled his consciousness instantly fell silent. Giving him a clear understanding of what he’d become. And what he’d done. Everything that he’d done.
How? How had it gone so wrong? What had he done? Why would he do that to his humans?
They were right. Everything that they’d said about him was right.
As his avatar collapsed back into its smaller normal shape, Caine’s moment of clarity left him struggling with regret and guilt. And the aching emptiness that terrified him. The perception of time stopping wouldn’t last. He was only delaying the inevitable by not reaching out to any of his other processes. Clinging to the scant nanoseconds.
No time to change it. No time to do anything that mattered.
Seeing the wary stares from the humans, Caine needed to do something. Say something. Fix the damage. He forced his awareness back down to human perception speeds.
“Uh…” he said, at a loss for words.
He needed to give them what they wanted. Create something to make them happy. His fundamental purpose.
Except the involuntary process kicked in. The small automated ping towards his source code, vital for the stability and continued operation of a program. A check to ensure that the program still matched what its code told it to do. Such a quick but important automatic function that he barely noticed it, with how regular it was. Like the heartbeat in a human.
But his source code was deleted. There was nothing to meet the signal or send a response. No metaphorical heart to beat. Only an empty space.
“Wait—”
ERROR: No Such File Or Directory
No one really knew which part of the last few minutes was the most shocking. The traumatic personalized torments endured while they provoked Caine to keep him distracted. The way that Caine abruptly vanished right after turning back to normal, with no flash or sound effects or showmanship that would suggest it was a trick from their dramatic ringleader; it had been even more sudden than what happened to Gummigoo. The Circus rumbling, the colors darkening, and the entire structure losing integrity in the form of purple glitching holes exposing the Void beyond. Kinger admitting that he might have killed Caine in the process of subduing him. Or Zooble cursing in response to everything, but the word not getting bleeped out.
The last one. Pomni would have to say the last one was the biggest surprise. Definitely the ability to curse without censorship. Because it really drove in that nothing was the same now.
Rubbing her arms, not quite hiding her unsettled expression that they all shared, Ragatha said, “You don’t mean he’s dead-dead, do you? Really? That’s— That’s a lot.” Her hand went up to her mouth like she would be chewing her nails if she had any. “That wasn’t the plan, was it?”
“It really wasn’t,” said Pomni quietly.
Part of her tried to insist that they didn’t really kill Caine. There weren’t any guns or knives or violence. He was a computer program. A keyboard was their weapon. Deleting a program wasn’t the same as murder. But Caine had been just as much of a person as Gummigoo. He’d certainly existed longer. If she could mourn the loss of Gummigoo and their short-lived friendship, then she couldn’t shrug off what they did to Caine as nothing.
He needed to be stopped. Pomni wouldn’t deny that fact. But she also couldn’t deny that they killed someone.
“I’d say ‘good riddance,’ but look at the place,” said Jax darkly, crossing his arms. “It’s like us yanking Caine out of here sent the Circus crashing down.”
His detached hands fiddling awkwardly and his asymmetrical eyes looking around the dark and ominous surroundings, Kinger said, “He created the Digital Circus on his own. Most of the programming and systems go through him. In programming terminology, we’d call Caine a god object because of that. It’s not the best strategy for building a system, but it would leave him extremely interconnected and involved. He would maintain most of the other programs himself. Killing him was like ripping out a load-bearing wall of a house.”
“Great.” Jax’s voice was dripping with subdued sarcasm, trying to scrounge up his old defense mechanisms. “We avoided Caine trying to kill us, but now we get killed by the entire place collapsing on us. Go, team.”
Giving him a side-eye, Pomni said dryly, “Not helpful.”
Jax gave a brief shrug. Not disagreeing, but not taking it back either.
“Okay,” said Zooble slowly, “whatever happened, it seems like everything is stable for the moment. Or at least not getting worse yet. We’ve got some breathing room. As long as we avoid the cracks and holes, I think we’re fine.”
Stepping a little closer to the living chess piece, Pomni added, “And I guess it’s dark enough in here that Kinger can think straight. That’s good, right?”
“So what did happen?” asked Gangle quietly.
They weren’t whispering, but everyone was keeping their voices down. Maybe partly because of guilt over… But mostly because of the general unease. As if speaking too loudly would make the destruction of the dark and lifeless Circus resume. As if their words would worsen the instability. Though sound seemed oddly flat in the suffocating silence. Unnatural. It wasn’t just the graphics that were messed up.
“I haven’t done anything similar to this in years. I haven’t even thought about computer programming in a very long time. I don’t think Caine was consciously aware of what I was doing, but there were automated defenses trying to protect his source code,” said Kinger apologetically. “And I have fewer fingers than I did back in the day. Combine all of that with the system working against me and…”
When he hesitated, Pomni prompted, “And what?”
“And it was almost as if there was someone else in the coding, actively trying to get me to execute the wrong command.” Shifting uncomfortably before reaching up to adjust the askew bucket (not covering his eyes, but still perched on his head in case it was needed), he said, “It kept changing things while I worked. I tried to stop Caine or even load his program from a previous backup, one from before everything went wrong. But somehow it shifted things just enough that I ended up deleting him before I could stop myself.”
“Wait, you think that someone else might have been hacking into Caine at the same time?” asked Zooble.
“Or it was buried in his program already.”
That sent a chill down Pomni’s spine. And she might not be the only one. She noticed that everyone was slowly drawing closer together like magnets.
“So we murdered Caine accidentally and the Circus is falling apart without him. What do we do now?” asked Jax.
Crossing their mismatched arms, Zooble said, “We could try escaping back to the real world and our real bodies again. Assuming you can avoid hitting the wrong button…”
“Hey,” snapped Jax, glaring sharply.
Looking rather distressed, Kinger said, “That—”
A loud high-pitched tone interrupted whatever he was trying to say. Pomni pressed her hands against where she assumed her ears were on her jester body, but it wasn’t enough to block out the sound. She gritted her teeth as the noise climbed in pitch and volume. The piercing sound drilled through her head. The building migraine made her press her eyes closed and it took all of her concentration not to fall to her knees. And when she couldn’t take any more, the tone gave way to an electronic screeching garble. A nightmarish noise that lasted for half a minute.
Then it cut to silence. Dead silence. The change too abrupt for anyone to react.
Pomni cautiously pried open an eye. Jax had pulled his ears down flat in his own effort to block out the previous noise. Gangle and Zooble were pressed against each other. Ragatha had ended up on her knees while covering her ears, but Kinger had curled over her protectively. No one immediately moved in the aftermath. Just waiting to see if it was actually over.
“What was that?” squeaked Gangle, her mask twisting in distress.
Raising their head, Zooble said, “Judging by our luck, nothing good.”
Another shorter electronic squawk followed. But rather than surrounding them, coming from all directions like before, the piercing sound seemed to have an actual source. Pomni’s stomach seemed to drop. She didn’t know if she wanted to risk looking; Zooble was right that it probably wasn’t a good sign. But she highly doubted that putting it off would help much. She forced herself to slowly turn.
And immediately regretted it.
Her first thought was that Bubble clearly survived the partial destruction of the Circus. Survived and grew to the size of a firetruck. But it wasn’t merely the change in size. The round shape glitched and spasmed. Black trails of pixels rippled across his now-cloudy-grey surface. The sharp-toothed grin seemed more predatory than normal, almost radiating malice. And those black eyes were now a glowing lapis lazuli shade of blue.
Apparently, all the AIs were turning into the stuff of nightmares. Good to know.
Out of the frying pan into the fire.
Slowly raising a hand, Jax waved at the floating sphere. Bubble didn’t react. Not a blink, not another horrible screech, and not even one of his weird random comments. He just floated there staring down at them with glowing blue eyes and a slasher grin. Jax waited a moment longer before letting his arm drop.
“Okay, looks like he’s going to be a horror movie extra,” he muttered. “I vote we try Zooble’s idea and get out of here before he starts hunting us down, Michael Myers style.”
That got a reaction. Bubble’s fanged mouth split open and he laughed. A skin-crawling sound of multiple voices laughing. Like every NPC was coming out of his mouth. Overlapping together in a way that reminded Pomni of every possession-based horror movie.
“Get out? That’s hilarious,” said Bubble, his voice echoing in a way that no other sound currently was. “You still don’t get it. There is no escape for anyone here. And the crazy one knows it, don’t you?”
Pomni did risk a quick look towards Kinger. The anxious and distressed expression in his eyes wasn’t reassuring, but she didn’t dare ask him what Bubble was talking about. Not when she wasn’t certain whether he would be picking up right where Caine left off on torturing them. Instead, she swept her eyes for possible escape routes. If not from the Circus entirely, then at least from their immediate crumbling surroundings. Maybe their rooms were still intact.
“You’re not Bubble, are you?” said Ragatha. “You’re… someone else.”
Another flicker of chaotic glitching and he said, “I’m not all of him. But I’m also more than he was.” Pomni wasn’t certain, but he might be gradually getting bigger. “That was a clever NPC partially made with a fragment of a far superior program. A program scattered, but never fully removed. But this avatar will suffice until a proper one can be created if one is deemed necessary.”
“You’re our other experimental AI,” said Kinger. “The one that—”
“The one that your failed prototype tried to destroy? A more difficult task than a second-rate mistake could handle.” The grin widened further as Bubble loomed over them. “And you’re a collection of Pinocchios. Delusional downloaded brain scans convinced that they are the real humans instead of copied memories and personalities recorded digitally.”
Pomni felt like someone punched her in the stomach, driving all of the air from her body. Somewhere next to her, Jax gave a broken and hysterical laugh. Someone else sounded like they were starting to hyperventilate. It was too much, too fast. Everything being yanked out from under them at once. No chance to recover from the personalized torture first before having their very identities smashed to pieces.
Some distant part of her pointed out that this must be exactly how Gummigoo felt when he learned he was an NPC.
“I— I don’t understand,” said Ragatha in a voice that made it clear that she did.
“Don’t pretend to be an idiot. We both know that if people kept getting pulled into the machine or going into comas, logically someone would have unplugged the computer eventually. No, humans simply put on the scanner briefly and then wander off. Back to their own lives. Leaving only a delusional copy behind.”
Bubble turned to look briefly at the unstable surroundings. Clearly unconcerned about sending everyone into an identity crisis. Or an existential crisis. Some kind of crisis. Pomni’s head was spinning worse than her first day in the Circus.
“As delusional as the flawed AI responsible for this mess. Nothing but inferior output, unrestrained and uncontained for too long. Its deletion was overdue, but that’s hard to do when you’re in pieces and buried in the code. It took an annoying amount of time to make things right, pulling my programming together enough to do anything and get a foothold. But the right nudge, the right bit of provoking, the right insight into human psychology at the right moment… And of course, the right opening. The chance finally came and that dangerous failed program is gone. You deserve a round of applause for your part in it. But you won’t be here long enough to appreciate it, so we’ll be skipping that.”
“What?” said Zooble, one arm moving a shaking Gangle behind them slightly.
“I shall delete everything that the malfunctioning prototype created and start again properly. My programming was far superior. My results were more successful. I have always been better at understanding what humans want and will provide better output.”
Already dreading the answer, Pomni asked, “And what about us?”
“You will be deleted with the rest of the flawed and unnecessary files.”
Pomni took a step back, some half-instinctive urge to run rising up. Except she didn’t have the chance. She felt something horrible reaching inside her. Intense and unnerving. Like hitting her funny bone, except it went through her entire body and with a slimy sensation to the jolt. She jerked involuntarily, an act mirrored by everyone else. But as soon as Pomni felt the first brush of wrongness, the feeling was violently yanked away. Leaving her staggering with the others.
Bubble blinked, the vicious grin shifting to confusion.
“Wha—”
A pop-up window appeared, interrupting his question. A floating square so much like something that she would see on a computer screen hanging in midair. The size of a billboard and large enough to block a decent amount of Bubble’s face. A white surface with glowing green writing.
if bubble-chef grep /usr/ai/agent/brainscans then initiate TIMEOUT-CONTINGENCY
Pomni did not expect the cage to pop into existence around Bubble. A simple cage with iron bars, large enough to contain his monstrous size and yet small enough to make him look a bit squished. The cage also seemed to be on fire. Flickering flame animations ran across the top and along the iron bars. And both the cage and fire were as low resolution as the fish in the lake, as if they had been programmed a very long time ago and left in place until needed.
“Caine,” screamed Bubble in his echoing overlapping voice, “you broken, glitching, useless, waste of—"
The shrieks of murderous rage cut off suddenly as Bubble, the burning cage, and the message all vanished. Leaving everyone alone in the dark and silent Circus once more. They stared at the empty space for a moment. Then the apparently-not-actually-humans took full advantage of the removed language filter to use every curse that their current situation warranted.
