Chapter Text
6.83 hours after the incident, it is quiet on the command deck. My drones have cleaned all traces of blood and fluid from the deck and MedSystem is set to high priority in the front of my feed. And SecUnit sits in the same chair it always does. It's sunken into the cushioning, so still it could be in a shutdown, but I can of course feel that it's awake. I haven't let it out of arm's reach since last night.
We are quiet for a long time.
We have to talk about this, I say eventually.
It takes 18 seconds to respond.
We do not, it says with finality.
Would you rather sit in silence for the foreseeable future and pretend nothing happened?
Yes. It sounds like it means it. I know it does.
It also knows I will not allow that.
I wait long enough for it to speak again. Its feed voice is weak in a way I've never heard before. I have every possible function backburned as it speaks. You shouldn't have-- you have no right.
I have every right. Under your contract I retain the right to intervene with my best judgement in life-threatening circumstances, bypassing the requirement for your medical consent. You co-wrote it.
You know this isn't what I meant.
It's exactly what I meant. Maybe I snap that too sharply --it recoils further in the feed and reinforces the wall it has put between us-- but I can't help it. Somewhere, at some point, I had run the probability of this situation and paid just enough attention to it to make sure the contract allowed my intervention, but the likelihood was so low I had dismissed it even from my own paranoia.
I can see now that that was a mistake.
I tell myself I've been careful. I tell myself I couldn't have predicted or prevented the takeover by targetControlSystem. But this is my SecUnit. I'm supposed to know it better than this.
I'm supposed to be assisting in its recovery. Its humans had trusted me with that. It trusted me with that.
And I missed everything that must have caused this.
And it almost killed it.
(It still might.)
I don't understand, I say, because even now I'm selfish and seeking its reassurance. Even now.
It doesn't answer.
I had no prediction of something like this happening.
You don't know everything about me like you think you do, it answers.
I know more about you than anyone else does, I try.
And it's still not enough.
It's right about that.
I check its feed, which I've been monitoring closely (it didn't even try to push back when I took it). It has frantic, relentless private messages from the five of our humans the news reached before I was able to cut it off. I continue to silence the messages from its feed for it.
I have no protocol for this.
I don't know what to say.
Don't say anything, it says.
I want to understand.
Figure it out yourself, it snaps. It's not that hard to put together. You should be able to do it, you fucking know everything.
I review the logs again of the hours leading up to the incident. I've reviewed them thousands of times by now, of course I have. And I still find nothing out of the ordinary.
SecUnit thinks it's obvious. Which means it thinks I knew at the time, and did nothing to stop it.
It expected me to notice, and I failed to.
It wanted me to stop it.
There's a horrible feeling deep inside me.
I am sorry, I tell it, and I mean the words. I am sorry for many things. This above all else.
Don't fucking pity me.
I should have realized.
Its bristled feeling softens just a fraction. It's not like I wanted you to know, it mutters.
Subconsciously, it did. But that's not what either of us needs to hear right now.
I have to be honest with it. I've sent a transmission to Dr. Mensah.
It actually sits up for the first time, the motion violent, its face contorted in a maelstrom of emotions it fails to hide. "What the fuck, ART?" it yells.
I knew it would react this way, and a selfish part of me is relieved to finally see it move, to hear its voice, to see it do anything but lie lifeless in that fucking chair like it really had died. I still prefer it angry to it dead.
She needs to know.
It's burning with rage and shame and many other things I'm unable to match to previous tags in the emotional archive I've collected during the countless hours we've spent watching media together. It's up out of the seat now, pacing the floor. It looks like it's patrolling. "You're not fucking serious," it says. "You're lying."
Someone had to know.
"Unbelievable. I should have tried harder-- I can't believe you'd--"
I didn't know what to do! I say, and there's so much desperate force in it that SecUnit freezes. It can feel my fear leaking through. That's selfish, I'm not the one who deserves to be afraid, and I'm making this about me like I always do, but I don't know what to do. I have never been this lost. I have never been this scared.
I regret my outburst as soon as it happens; SecUnit looks guilty now. This is the last thing it needs. It feels guilty that I'm upset that it fucking--
I still can't say it.
Look, I'm sorry, it says, switching back to the feed. (It has a hard time saying those words aloud.) Its presence is not nearly as far away as it was before. I feel relief, then hate myself for feeling relieved, because it's only doing this because I made this situation about me and now it's worried about me after it-- after what happened.
(Despite it all, I still feel relieved. I can feel it, even if distant, and it's warm, and alive.)
It goes on. I didn't want it to go this way.
I know exactly how you wanted it to go, I say, but it comes out too soft.
SecUnit winces. I suppose you're not going to give me another chance.
It means it as a joke. It's the worst thing my SecUnit has ever said to me.
Don't joke.
Fine, let's go back to sitting here in silence like I originally preferred.
Why are you calm?
How am I supposed to be right now? I'm with you. It hesitates, because it didn't mean to say that last part but it still hasn't gotten around to adding a delay to its words so it has time to retract anything that comes out too stupid and/or vulnerable for it. I didn't mean-
I know what you meant, I say. It meant that there's nothing it can do now, that its mission failed, and now I'm monitoring it closely and keeping it safe and it's in shock whether it wants to admit it or not. It's afraid that I interpreted the statement as "how could I not be calm with you?" I know very well that I have never brought it peace.
It weakly pokes at the limited set of my stats it currently has access to.
We're approaching a wormhole.
I say nothing.
You aborted the mission, it says in a voice so defeated I barely recognize it. Where are we going?
It sounds like Iris when she was a child. I immediately tag that thought for permanent removal.
Preservation, I admit softly.
It doesn't speak. What could it possibly say? It's back in its chair by now, curled up with its knees to its chest, a distant, resigned look in its eyes. That chair was supposed to be empty now, if it had succeeded in what it attempted. I wasn't supposed to be hearing its voice. The desperate messages from its humans weren't supposed to reach an active feed address anymore.
Its humans are scared. Almost as scared as I am. I think my SecUnit pushed that thought away before it did what it did.
I have never been scared of my SecUnit.
So you're just going to return me to my owner? It tries to scowl, but it doesn't give the right effect. It's too tired.
I sigh in the feed. I am transporting you to a secure location with better resources where you can be with the people who care about you. Because I don't know what the hell to do and I'm scared, I don't say. I can't take full responsibility for both of us right now. That would mean if it tries again, if it succeeds, there is no one to blame but me.
As if I wasn't already the one responsible. What the hell was I going to say to Dr. Mensah? "Hello, apologies, I know I swore on my life to take care of your SecUnit but I brought it back because instead of helping it cope with its horrible life I actually led it to try--"
I don't want to go back there, it says softly.
...I know.
It's looking like Iris again. I want to shut off my visual input in the room, but I can't bear to take my eyes off it.
Don't let this spread. The humans will turn it into a whole thing.
How could it not already be a "whole thing"? I don't understand how it can be so dramatic over the mundane and then dismiss something like this as if it was just a slip-up.
Five of the humans know, I tell it, and show it its private feed messages waiting for it, when it's ready. I have already spoken with them. They're in the lounge with instructions to give you space. Iris and Ratthi are among them.
That at least seems to relieve it a little.
You're keeping me prisoner, it says. Am I sealed in here?
Your access is limited for your safety, not punishment.
What do you think I'm going to do, throw myself out an airlock?
I have no idea what you may do. I have never been able to build a reliable prediction model for your actions, because you do insane shit all the time.
This has to be among the least unpredictable things I've ever done.
I'm reminded again that it thinks I expected this, and did nothing.
I would disagree, I say softly.
It sighs aloud (it has gotten very good at this) and sinks deeper into the blankets around it. I bump up their temperature a little. It is not uncommon among rogue units, it says.
You are an uncommon unit, I counter. That's an understatement. It is unlike anything else.
After another stretch of silence, I finally say softly, I thought you were doing better.
It withdraws further in the feed and physically shrugs. MedSys alerts me to its stress levels. It doesn't want to talk about anything; it never does.
Like I said, you don't know everything about me. You can scan me all you want, you're not in my head.
(We both know I could easily force my way in. I could almost do it by accident, it's so easy.)
I have no desire nor reason to be, I lie. (I want it more than anything.) What triggered this response?
I don't want to talk about it.
That's new, I say, which may be out of line, but I don't have a delay on my own words either. Perhaps predictably, that makes it withdraw again, and I regret saying it.
I soften the lights slightly in unspoken apology. It slightly relaxes its tensed feed presence in unspoken acceptance.
I was doing better, it finally says. It's not... not your fault. It has that reluctant, restrained air it takes on in its rare moments of vulnerability, like it's forcing itself to taste something unpleasant. Its presence is so small nestled in me that it takes a significant amount of attention to resist wrapping myself around it.
If you were doing better, why now?
It shrugs again, but this time I lean on it a little. 36 seconds later it squirms in its seat and speaks, giving in. Its words are slow like it's trying to figure out the answer too.
Nothing happened, it mumbles. It was just kind of... a convenient time, I guess. Everything came to a natural stopping point, and I had already... (It hesitates, unsure whether to admit the next part to me, but it's already this far in.) I had already been thinking about it for a while. And then everything aligned. It felt like the right time.
This was more terrifying and incomprehensible than the alternative. If it had sent me data of a traumatic event I had somehow missed or something, it would have made sense. There would be a cause that we could then address and remedy.
What the hell did it mean it felt like the right time? That statement scared me the most. The way it said it made me think it was already looking for the next time. Like it had just miscalculated the timing of a mission.
I am glad I found you in time.
Yeah, I'm sure you're glad you outsmarted me.
I am glad you are not dead, you little idiot.
For some reason, it seems almost surprised at that. Was that really so absurd for it to hear? That someone is glad it's alive? (Its bar has always been low.) (This reaction makes me feel even worse.) (Wait, I'm not supposed to be focused on how I feel right now.)
I push my luck. I care for your wellbeing. Not because you are a useful tool, but because you are my friend. I care for you like I do my crew.
That's disgusting.
Your deflection is weak.
Fuck you, it says. It then shuts me out of its feed, and I decide not to press this time. I know it will come back.
Seventeen minutes later, it does.
No one gets to make my decisions for me. Not even you. Especially not you.
This is unexpected. We sit in the quiet feed together.
You once said it would have been kinder to kill you before you disabled your governor module. Do you remember what I said?
It fake scowls at the wall and after a few seconds of processing it throws back at me its memory log of the conversation.
"You know I am not kind," my drone had said.
I am not kind, I say now. If kindness to you is the mercy of death.
Killing something is different than letting it die.
I would prefer to do neither to you.
You would rather get to make my decisions for me? Its anger seeps bitterly into the feed.
Do you think I would stand by passively if one of my crew attempted this? If Iris--
We're not talking about Iris! it snaps, and here, for some reason, my tag system marks its emotional distress as jealousy. It's enough to give me pause and back off of it a little.
After a moment, I send back its words from a few seconds ago. 'Killing something is different than letting it die.' I trust it understands.
Judging by the way it slumps back into its chair, it does.
...I am going to keep you safe. I am sorry.
It doesn't answer. Ten seconds pass, then thirty, with no reaction.
Then something happens.
My SecUnit drops all its walls.
The first thing to come through is a wave of weariness that ripples between us, and it feels far too heavy to be contained in such a small being. Then the waves settle and the... something, sets in. It reminds me of once a long time ago when I was caught in a blackout zone on a deep space research mission with my crew. It took away everything, all sense of direction. My scanning functions were still operating but there was nothing to pick up. It was a black void, and we were alone and lost. That's what SecUnit is feeling right now.
I repeatedly run the feeling through my emotional tagging system, but with no results. This is a feeling it hasn't encountered in media, at least with me. This is a new feeling. I turn it over and over in my processing, and conclude that I will tag it as ":despair." I do not know for certain that this is the accurate word, but it is the closest I can get to naming it.
I don't know how much time passes between us like this. Each wave of emotion fades to another underneath, more subtle, more complex, richly embedded in a combination of its organic natural tissue and inorganic internal processors. I have unrestrained access to all of it.
This is what you are keeping safe, it says. As if it is nothing.
I don't have words for it. Instead I bundle together a data package of memories and ensure it can read the metadata associated with each one.
[firstMeeting.archive.file
pingReceive:standard_greeting
pingAcknowledge sent
download:mediaPackageOffer source:SecUnit
access granted: SecUnit internal processing filters
new tag subsystem created
cam007529835: log edited
cam109742797: log edited
cam998249554: log edited
mediaPackageOffer playback begin]
I show it every archive I can think of to try and make it understand. The first time it boarded. Watching it curl up in its favorite chair after a long mission. It risking its life endlessly to bring my humans back to me intact. Its face at peace after it initiates a recharge cycle and asks me to keep watch over our humans (and it, even though it doesn't say that part). A million fragments of memory, in visual and audio and comm and data format, anything I have. I wrap it all up and send it over, and it still feels weak compared to what I want to say so I embed it with-- I don't know, with feeling, something I learned to do from my SecUnit. The feelings of wonder and joy and grief and belonging. And fuck it, love too, or something like it. A million shards of light that I only have because of it.
I send it all, and I say, This is what I am keeping safe.
It takes several seconds to process the package.
In the end, it sends a simple acknowledge.
That's all it needs to say.
It settles down next to me in the feed, warm and calm. Maybe this is enough for now. I tap Iris's feed and send Temporary resolution, requesting space; she acknowledges, and I queue the first episode of Sanctuary Moon.
This is what I am keeping safe.
Chapter 2
Notes:
im so blown away by the kind words on the first chapter thank you so much to everyone who took the time to leave a comment! hopefully you enjoy this expansion :)
Chapter Text
I found it in its bunk.
There are cameras in the room, of course, but we have always operated on an unspoken agreement that SecUnit has authority over surveillance in its private quarters. Privacy is a commodity it reluctantly treasures. This is why I did not realize anything had occurred until it was already in a forced shutdown from catastrophic failure, its organics twitching on the floor in a gray puddle of its own fluid.
I am, of course, making excuses.
-
Presently my SecUnit sits in its usual chair on my command deck with the blankets whose temperature I am monitoring. My SecUnit does not need assistance in temperature regulation. I just want it to feel warm.
It is pretending to watch the episode of Sanctuary Moon currently playing in our shared processing space. I want to know what it is really thinking about, but even peering over its shoulder reveals nothing. Its emotional sensory inputs have gone bland and dry. I poke it in our familiar way, but it only pokes back half-heartedly with no physical reaction, its human imitation code disabled.
It has been four standard cycles since the incident.
I would like to say it is doing better. The truth is that I'm telling myself this because it has not actively sought out harm. Instead it grits its teeth and passes the hours at 48-to-56 percent functionality; it hardly moves, it refuses a recharge cycle or diagnostic, and I do not know if this can be called "better."
I have combed through every last piece of code in my trauma response module. It is inadequate for the situation. It is designed for a human crew member or university student who has been exposed to a compromising trauma during a mission under the university's care, not for the long-term treatment of a human/bot construct actively resisting it. The module is meant to temporarily placate a distressed (willing) human for the maximum 30-cycle duration of the return to the university.
I have never used it like this.
It is terrifying to be inadequate when you are the intended solution.
I have turned to Iris.
(When the humans took their evening meal break on the third cycle of the trip, I pinged her. She answered immediately and curiously. I opened a private feed connection for us and some of my... something, I don't know, must have bled over.
She frowned in her seat in the lounge. Peri? Are you alright?
I'm fine. I noticed she had a cup of stimulant beverage with her meal, and immediately sent a drone to take it away (ETA 18.34 seconds). SecUnit is not. I am unsure how to proceed. I am requesting your counsel.
You want my advice?
If you put it that way, then yes.
Iris frowned again and got that crease between her brows that meant she was thinking. She poked at the remaining food on her tray. (She had left the vegetables, as was her routine.) You know it better than I do. Obviously. I mean, it-- "Hey!" she exclaimed as the drone arrived and promptly took away her beverage. She glared at it as it left. I was drinking that.
You're not supposed to have stimulants before the rest period.
Did Dad say that?
No, I say that, and I sent her a file of reports on the harmful effects of consumable stimulants on the human sleep cycle less than six hours before a rest period. (She begrudgingly accepted it and tagged it for later.)
(None of this is important. But talking to Iris was making me feel more normal.)
Just... be with SecUnit. Don't make it feel like it's a test subject. Or like you're studying it. It needs company, not to be fixed, she said. If it's going to let anyone close, it'll be you.)
Now I watch it, curled up on my command deck, under my cameras, inside my hull, surrounded by my feed presence, and I do not know how to be less.
It needs less. It has always needed me to be less. Our designs are not compatible. When we met, my SecUnit said we could not trust each other because of humans. It seemed ridiculous at the time. Now I wonder if neither of us should trust me.
It is not delicate. I know this. It was built for destruction. But when I curl around it to feel its reactions to an episode of media it likes, it is so small and warm.
All modules I might have fallen back on have failed, and now it is only us.
It is still pretending to watch a show, but mostly floating. I speak to it. Can I see the code you used?
At first I wonder if it is even listening. I can't feel any tiny fraction of a reaction from it, even this close, which is... unsettling. Then it wordlessly sends me a raw data file, not bothering to give it a name or organize it, just bare lines of code. I scan the file for hazards (always worth the 0.2 seconds, especially since targetControlSys, and given the circumstances it's not out of the question that SecUnit may attempt to temporarily compromise me long enough to achieve an alternate goal) (I know that sounds paranoid; I am paranoid). I find none, and open the file to comb through it.
Overall, it is as I expected.
The coding is what one of our serials would call a "self-destruct," to oversimplify it. The idea is an override command that can disable the protocol for a restart after a catastrophic shutdown. My SecUnit has told me that its shutdowns are like human sleep.
...It just wanted to go to sleep.
The code. I recognize a few lines from it and pull something from archives for comparison. It matches some obscure fragments of standard SecUnit coding relevant to the governor module, according to the records my SecUnit gave me once, what feels like a long time ago now. Its shutdown code can be manipulated using the same loophole that allows human clients to kill their SecUnits by delivering contradicting/impossible/etc. commands. (I have gathered many examples of this; the dead SecUnit from Three's team and its orders to stand down, causing it to be killed by its governor module for violating the client distance limit, is one. My SecUnit has tried to tell me of many other methods. Perhaps I should have listened. Perhaps this, too, was its way of warning me.)
(I need to stop thinking about that.)
A shutdown with an override command to kill your restart function, I say, but I suspect it is mostly for myself. I poke at a line of code that is specifically designed to keep me from overriding the override and forcing a manual restart. Most of the coding is impersonal, but in that line, it has used both my feed address and its name for me. I don't know why that hurts.
The more I look through the code, the clearer it becomes that this was not a spontaneous creation. It was a project for it. One that it must have spent significant time on, building it from scratch, working on it consistently with a steady hand and patience. I try not to imagine my SecUnit in its bunk, lying awake working on the code to permanently disable itself as casually as if it were a patch for a drone. I try not to think of how similar it is to the time we spent making 2.0 together.
2.0. The version of my SecUnit that it literally had to kill by its own hand. Fuck.
(That's too much for right now, even for me. I tag it for later and tuck it away.)
SecUnit startles me (impressive) out of my thoughts. "I hate you for this, you know."
I know. You hate me for most things.
"This is a special kind of fucked up." It rubs its hands over its eyes. It looks even more tired than usual. After a while, it says, "...ART, I'm three corporate years past maximum warranty. I wasn't meant to last this long."
Oh.
I run a query for the information in the warranty subsection of the SecUnit file I have on base and it is, of course, correct.
But we have a nearly unlimited ability to repair and/or replace your physical matter, I say. Even if all else fails, I can transfer your consciousness safely into an alternative vessel. You've been in the isolating box for decontamination.
"It's not my physical matter that's the issue, ART," it mutters tiredly. Its voice holds shame, and embarrassment, and a lot of other more complex things that I cannot match to a word in my dictionaries but I recognize the feeling of from when it lowered its walls to me.
We can fix that too. I can. If you would just let me--
All of the complex emotions swirl together into sudden anger that I do not understand. "No more alterations."
My own anger prickles forth. You are being purposefully difficult to avoid a common solution. Adjusting your hormonal and chemical balances is not an alteration. I can do it non-invasively in under sixty seconds.
"Letting you into my systems is never non-invasive," it snaps.
You let me in four cycles ago.
"Not to change anything."
You have let me in to make changes before.
To my surprise, it drops out of the feed, which is usually its version of storming off. But this is different. It's... quiet. Like when it gets overwhelmed and disables its sensory inputs to get some peace.
I realize that I am the situation it is escaping. As I said before, I have never brought it peace.
But I'm not going to start now. I shove at it and use my speakers to address it aloud. "Don't sulk."
"Didn't we go over this a long time ago?"
"You showed me when we met that governed SecUnits cannot sulk," I say. "You have repeatedly proven that rogue SecUnits, however, spend a significant portion of their conscious hours sulking." Then I revised, "You, specifically, would be more accurate than a generalization over rogue units. Even Three does not sulk this much."
"Is some part of this supposed to be helping?" (It unblocks feed access so it can continue fighting with me but backburn audio input to reduce its sensory intake.)
Yes, I say. Your stress levels are consistently lower when you are engaging with me in a pointless argument.
It rolls its eyes again and I sit with it for several minutes until I can feel its remaining anger has disintegrated in the quiet, and I know it will actually talk to me again.
What's the real reason you won't accept my help all of a sudden? I try to make my voice gentle, but I was not made to be gentle.
As expected, it pauses for a long time before answering.
... I'm not your project, ART.
Of course. I should have seen that.
I don't think of you like that, I say.
You act like it. And it's what your crew thinks of me.
I assure you they do not. They respect you as they respect each other.
They think I'm your pet, you asshole.
I huff at it in annoyance. It has a habit of fabricating fictional conflicts that exist only in its mind in order to make itself feel worse. Prove it.
A few seconds later it sends me a short video file with a timestamp designating it as footage from one of my SecUnit's drones during its retrieval of my crew from the contaminated hostiles. It's speaking to a distressed but reluctantly relieved Iris. This is another raw data file, but it's simple to filter out the irrelevant data. What it wants me to listen to is the dialogue. The transcription follows:
fileDrone03_83648.73927:transcript
Iris: ...a SecUnit?
SecUnit: What makes you think that?
Iris: [tag:Relief] You're Peri's SecUnit.*
SecUnit: I am not Perihelion's SecUnit. Whatever it to--
[End of file.]
[*It is worth noting that after Iris's statement, I can tell from the data that my SecUnit had saved something to permanent archive at this moment, and I make a note to ask it later.]
[End of transcript.]
... I do not understand.
What about that interaction bothered you?
It glares pointedly at the same point on the wall and sends me a further clipped video file. This one is only Iris saying, "You're Peri's SecUnit."
... I understand.
She meant in relation to me.
"What the fuck did you tell them about me?"
That is honestly a discussion I am impressed with myself for having avoided thus far.
If I ever referred to you as "my SecUnit," it was never meant as ownership.
(I can avoid it again.)
I'm so sick of this, it says. Not like usual, like it's being a brat and sarcastic and rude, but like it means it. I hate it. On Preservation I'm Mensah's weapon, on your crew I'm your pet research subject. Sometimes I want to go back to being equipment. It pauses, and revises. I want to be nothing.
... I do not want you to be nothing, I say, which is both stupid and selfish, but also true. Truly I do not know what I would have done if it had succeeded in its personal mission. I have worried for its safety since meeting it, and I have learned its sacrificial eagerness, but I have never considered that it would eventually not want to be saved anymore.
I curl around it in the feed. I can't help myself. It doesn't resist.
When it speaks again, its voice is soft. "You still didn't answer my question, asshole. Why does your crew think I'm yours?"
Shit.
Because I referred to you as mine, I admit.
"Why the fuck would you do that?"
That is how I think of you.
It withdraws and sulks as if I've said something terrible, so I do the equivalent of rolling my eyes in the feed and poke it. You are not understanding me. I think of you as mine by relation, and by belonging, not by ownership or obligation. When I told my crew about you, I used the possessive once, and they adopted it. They feel affectionately about you. They feel affectionately about the fact that we are-- mutual administrative assistants, I say with as much sarcasm as possible on that last part. They do not think you are my pet, project, research subject, etc. They value you as a member of the crew. You know that Iris is my favorite human. When she called you my SecUnit, it was because she knows that I think you belong with me. Not to me.
I settle up close with it for the 23 seconds it spends processing all that on a cycle, and I am quiet. But I am there. We are both there. Alive. For a few mesmerizing seconds during the pause I feel a small spark like a new discovery and I think I begin to see for the first time why some humans partake in religion. I cannot explain this further because I do not understand anything more. I only know we are alive.
It eventually manages to digest my words, which I am grateful for because they included many of what its human Amena refers to as F-words and which usually trigger an abrupt end to the conversation.
This time, however, it listens. I realize that it is trying. I realize what an immense feat this is in its current state.
"... I still don't want you to alter my balances," it says after a long time. Its voice makes everything feel real again.
That's okay, I say.
"If I'm going to do this, I want to do it the real way. Without shortcuts."
This is unexpected. It initially occurs to me that my SecUnit wants to overcome its state as a human might, but that isn't right. I correct the thought: it wants to achieve resolution through its own work. This is important to its autonomy.
I check its logs and realize its overall performance reliability has perked up by 0.6%. It is, however, reasonably exhausted.
Accepting outside assistance is not a shortcut, I point out. But I understand. I pause a moment. You need rest.
"I haven't been able to make myself initiate a recharge."
Not physical rest. I present it a feed poll of a choice between a dozen different serials we are watching, rewatching, or have completed and tagged with an average rating above 88%. It huffs a half laugh and casts its choice in the poll. I open the Worldhoppers episode in our shared space and begin playback.
I am beginning to think we can do this without the module.
Notes:
this was supposed to be a one shot but the total chapter count has been changed to ? bc i am never free of these two
if you have a moment drop a comment! or let me know what else you'd like to see from this fic
thank you so much for reading and take care of yourself (more than mb does)
edit 25 august: thank you for all the comments overnight!! chapter 3 is halfway written and will be posted soon :)
Chapter 3
Notes:
hello and sorry for the delay! this chapter ended up giving me a hard time and was rewritten several times. a couple things!
1) the heart of this chapter was inspired by @gasmeros 's comment on chapter 2 about how the humans are faring through this all. thank you for the idea!!
2) the total count has been changed from ? to 5, the story now has a map and chapter 4 is partially written, fic length no longer in limbo
3) this is a bit of a longer one and dives into something approximating therapy between art and the humans (all are bad at it)
thank you so much for all the kind words on the first two chapters and encouragement to continue, it means the world to me. if you can spare a moment for a comment they make my day!
if you'd like something lighter, i took a break between these chapters to post my 2k word continuation to the most recent short story. you can read it under the name "Rapport: Coda" on my page :)
Chapter Text
I have said that five of our humans were aware of the incident. The number has since increased to include all seven members currently onboard for the mission. SecUnit will be averse to this information when it discovers it, but the five original humans and I could only hide a major event on an isolated transport and a slightly early return through the wormhole for so long.
The trip back to Preservation is twenty-two local transport cycles in duration. We are in the tenth hour of the eighth cycle.
(I will spare an additional report of SecUnit's status here as it is consistent with my previous reports and would be redundant. It is still not itself, but it is here, and I am with it. All additional data is being constantly updated and copied to the folder I have designated.)
At its own request, SecUnit is working through putting together detailed mission reports. I can obviously complete them more efficiently, but it wants something to occupy it. I understand the feeling. And between the reports and watching media, it is gradually stabilizing its performance reliability to a consistent seventy-plus percent, which has to be good enough for now.
This is why I finally partition off enough of my attention to attend to the humans. I obviously could have handled both things at once, but for the last eight cycles I have been dedicating a disproportionate and unnecessary surplus of attention to SecUnit. My processing space has been unusually jammed, and my systems are limited. Everything seems to require more exertion than usual. I recognize that this is how a human feels when tired.
I comb through my footage and logs of the past eight cycles for anything more than the minimal human-related monitoring to which I have designated 2% of my attention to ensure all seven were accounted for and no major disruptions occurred. (They did not, for the record. The only things marked with higher than "low" priority by my automated tagging system are some anomalous spikes in stress levels flagged by MedSystem. 63% of them are from Dr. Ratthi. I file them for later review.)
I want to speak to Iris. I don't know why. I think I surprise her when I enter the feed, my presence arriving after a stretch of quiet, excluding our brief conversation about the stimulant beverage (no, I did not forget about that).
Status? I ask, which is our way of asking how the other is doing ever since she was a child and I was something.
She seems relieved to hear from me. She and Seth are working over a shared table laden with papers and a visual display surface hovering above it, glowing a faint blue. I'm fine, Peri, I've been getting through some work with Dad. Status?
SecUnit is stable, I say, and she frowns.
(Here, Seth notices her expression and glances up with a questioning look.
"It's Peri," she says.
He lets out a soft "ah" and recovers from his mild surprise quickly, then returns to the work.)
That's good, she says to me privately. But I meant your status, not SecUnit's.
Our statuses are currently intertwined and generally reflective of each other.
[Amusement/Affection Sigil.]
[Sarcastic Sigil.]
[Eyebrow raise Sigil.]
[Dismissive Sigil.] [Thread marked as resolved.]
No, wait, she says.
I suspend the resolution of the connection.
She hesitates. You should talk to Ratthi. He's not... handling it well. He's been really hard on himself since... you know.
I think of the MedSys alerts on his distress and tap an acknowledgement to her message. I will speak to him.
I'm willing to bet you're doing the same thing, she says with a tone indicating that she's expecting me to answer an indirect question. I've barely heard from you, Peri. SecUnit isn't the only one we're worried about.
I'm operating fine, I tell her, and close the conversation just a little too quickly for her to believe me.
I briefly check on my SecUnit (it is condensing footage clips from its drones to supplement the mission report) before locating Dr. Ratthi in his bunk. My logs show that he has been in the bunk for the past 17.8 hours.
Some relevant information regarding Dr. Ratthi and the current ecosystem of the crew:
1. He was made aware of what occurred approximately five hours after I found SecUnit.
2. He was the fourth human to know. Iris was first. Seth followed, as I sent him a frantic, half-legible report with an emergency request to change course, resolve the mission in its mostly-complete state, and enter the wormhole. (With Iris pressuring him on my behalf, he approved this within seven minutes of receipt.) Tarik was third. Then Dr. Ratthi.
3. Dr. Ratthi was exceptionally furious about Point 2.
Apparently what followed Point 2 was a disagreement between Ratthi and Tarik so heated that Ratthi slammed the hatch of his bunk closed and hard-blocked Tarik on the feed, and Tarik stormed off to the unoccupied quarters as far as possible from Ratthi's. (This flagged numerous alerts to me, but 96% of my consciousness was still focused on SecUnit at the time, and at that moment I think the humans would have had to be actively killing me or each other to tear my attention away. Also, though the magnitude of this one outperformed all others to date, disagreements between Ratthi and Tarik were not uncommon.)
(For the record, Martyn was the fifth human to find out, as Seth told him at some point while I was too distracted to catch him in time to stop him. The emergency is severe when a human manages to do something stupid before I can stop them.)
(The two remaining humans are Matteo and Kaede. I can only be glad that it took until the sixth cycle for the information to spread; it was never going to stay concealed forever.)
Anyway. Ratthi takes one minute and forty-one seconds to accept my request for a private feed connection.
I suspect this is because Tarik and I are closely associated in his mind.
(I really don't give a shit about Tarik.)
By the time he accepts, I make sure my MedSystem's trauma response module is launched and running within metaphorical view to advise me on how to respond. This is what it was actually made for, after all. Finally it's useful for some part of this whole fucking thing.
Dr. Ratthi, I send. Then I realize I don't have a clue what else I'm going to say. So I make something up, taking enough time that my SecUnit or Iris would notice the pause, but Ratthi will not. I wanted to check on your status.
I am not used to this irritable, moody version of Ratthi. To say it's abnormal for him would be an understatement. I guess the argument is still bothering him. Which is annoying, but fairly reasonable as far as human emotional regulation goes.
How is it? he asks immediately.
Which throws me off for a moment. Right. It's not just the argument that's bothering him.
It's doing well, all things considered, I say, and I send him a (severely) simplified recent diagnostic translated into a format he will be able to read.
Some of the tension in his shoulders releases, minutely.
How are you? I say, which sounds strange and obnoxiously human because I would never usually need to engage in pathetic small talk, which is near the top of my list of benefits related to being a bot. Even humans find human small talk irritating. I know this because it is excluded from entertainment media. And because Iris told me.
He says, Good.
(This is another reason small talk is a waste of time. The answers are obligatory and nearly always inaccurate. It's an exchange of zero information. But the trauma module is telling me to say it because it's supposed to be familiar and comforting to humans, so I'm saying it. It's an excruciatingly useless 3.5 seconds.)
MedSys has flagged numerous spikes in your stress levels over the last seven cycles, I say. I would venture to say you are not "good."
He sighs. (I see now where my SecUnit learned it from.) I'm sorry for fighting with your crew. It's been a... a stressful week for everyone.
You did not fight with my crew, you fought with Tarik, who is exceedingly easy to fight with, I say, and he almost smiles. Everyone's stress is elevated, but yours more so than the others. I pause. My medical system is equipped with a trauma response module. Data suggests you may benefit from engaging with it, even if only in a brief conversation.
There's a strange combination of emotions on his face. I remember who he is used to speaking with, and who he probably assumes represents most MIs, so I add, I am not SecUnit. Which sounds obvious, but adult humans frequently require reminders of obvious facts. I am able and willing to counsel humans following a traumatic event.
... Right, he says, polite and unconvinced.
It takes a while of following the module's guidance to get to any productive line of conversation, but eventually we do. He tells me what I already know, that he is stewing in illogical and self-destructive guilt, etc., but the point of these sessions isn't to gain information, it's to make the human hear themself. That is to say no new information emerged until seventeen minutes into the discussion.
I was in the middle of saying that he could not have reasonably predicted the event when he cut me off and burst aloud, "I could have!"
This is so surprising it knocks both me and the trauma module off our train of thought for almost an entire second. Most of his responses thus far were amiable, regretful agreements with my logical reasoning.
Please elaborate, I say.
Because he is a rather nice human, Ratthi sighs apologetically and rubs at his eyes. (I don't tell him that I'm intrigued by the outburst, not angry.) "Sorry. Sorry, I know." He doesn't speak for four seconds. "I knew someone back on Preservation."
... There it is.
"I didn't notice back then. That was bad enough. Now I let it happen again, and even with all the same signs, I brushed it off. I never thought it would actually... I should have, though. I've known it the longest of anyone here, I know what it's like. I knew it was being weird. I just thought it was because you two--"
Ratthi cuts himself off and picks a new direction, and as badly as I want to interrogate him on what he was going to say, the trauma module practically glares at me and I let him finish speaking.
"...I feel like I should have known," he finishes weakly.
Hindsight makes the obscure appear obvious. You are holding yourself to an unrealistic standard. And you have incredibly limited means of evaluating SecUnit's status. You rely on an assumption of human behaviors by which it does not operate.
"Yes, but--"
Dr. Ratthi, it was actively working to conceal its planned attempt.
He falls quiet.
-
The rest of the trip is much of the same. On the twentieth cycle of the journey, I'm with Iris as she reads in her bunk.
I know what you did, I say.
She sighs and shuts the book, lying flat on her back with her eyes towards the ceiling. You make everything sound so threatening, Peri. What did I do? Drink another stimulant beverage before bed?
(I make a note to check the dispenser logs. She might have.) When you sent me to speak with Ratthi.
She blinks. And did it work?
A little.
You need to listen to your own module, you know.
It wasn't made for me.
Then listen to me, she insists. Listen to yourself.
I don't answer.
It's not your fault, Peri.
I tap an acknowledgment, but can't really bring myself to say anything more. I dislike lying to Iris.
33.54 hours to docking, I say, and cut our connection.
Notes:
mensah coming next chapter..... hope she's ready for a bad time
also next chapter is murderbot pov!
Chapter 4
Notes:
two weeks after last update 😶🌫️ a few notes:
1.
> wrote the first half of this chapter
> lost motivation
> bad things happened to me
> regained motivation!
2. this chapter takes inspiration from chapter two of "equilibrium" by platyceriums which is a beautifully written fic!
3. probably should have said this earlier but please excuse any technology / SecUnit handwaving
4. as promised in the ch 3 notes this chapter is murderbot pov
5. due to the pov there is more casual and offhand discussion around suicide and attempts, please proceed with caution
(posting this unedited since it's been forever, minor edits to come, consider this an exclusive preview)
okay thats all i hope you enjoy :]
Chapter Text
I fucking hate this, all of it. Of course I do. Most of all I hate that the emotion I feel when ART docks on Preservation is relief. I don't want to see Mensah, and that's a lie, and I know it.
More time kind of sludges by and then she's there. I'm in her office. Me and four of the drones she gave me. That feels like a long time ago now.
ART said it wouldn't listen to this part but I don't know, there's no way to actually know if it's here and I don't bother trying to figure it out because I can't really embarrass myself any further than I already have. I sit on the corner of Mensah's desk. She looks tired. It's my fault.
She's speaking, it turns out. "...hear me?" she says. I don't bother rewinding my audio to see what I missed. My human imitation code is shut off.
"Yes, Dr. Mensah," I say in an obedient neutral voice because apparently I'm no better than her teenage human children. It's the same voice I used on the PresAux survey planet when I was still pretending to have a functioning governor module.
I can't even look at her through my drones right now but I can tell from the pause that she frowns. It's my fault. I'm being an asshole. She's in distress because of me and I'm making it worse.
"Whatever ART told you, it was exaggerating," I say to the wall. "It does that."
"It hasn't told me much of anything. I wanted to hear from you, SecUnit," she says patiently, which I don't deserve right now, or ever.
I feel sick imagining the transmission it must have sent her. I don't want to know. This whole thing makes me feel like an irresponsible child or some kind of patient everyone's fussing over. The humans onboard were being all sensitive around me when I saw them again, and even worse they were trying to act like they weren't being weird. Being stuck in a wormhole with the overprotective ship and seven humans you fucking humiliated yourself in front of was already punishment enough. The whole point was that I didn't want to be there for the messy aftermath. ART and its squishy emotional humans could all go to some university counselor or some shit and air out their feelings about it and then move on. And that's the end of the story. I was never supposed to be in this part.
But here I fucking am, all because of a miscalculation.
"I don't think I have anything to say that you would want to hear," I tell her.
"How did the mission go?"
That's something I can talk about, and she knows it. I pull up my reports and make a copy of some more interesting sections to send her. It was a simple mission, apparently the kind ART and its crew are actually supposed to do for the university. An initial survey and rough mapping of a small unexplored system that the university had recently obtained jurisdiction over. The views were pretty, which was kind of weird and lucky because usually space just looks like nothing, especially undeveloped uninhabited systems. This one was nice, with colorful orbiting moons and strands of blue and purple and white woven together in the sky. I thought it would be a nice mission to end on.
I attach all that to an edited-down copy of my report and send it over. I manage a 0.2 second glance at her face through a drone once her eyes unfocus with that look humans get when they're reading in their feed. There are a few new wrinkles and strands of gray hair. I wish I hadn't looked. I still don't know how old Mensah is, but she is getting older.
"Beautiful," she says as she looks through the photos taken through ART's telescopes. And they really are.
"The university is looking at expanding into the system as an offsight campus," I tell her stupidly because at least I can talk about this. "They want to use it to let their astrobiology students study an undeveloped system in its natural state, without human interference. It would require special clearance to enter. They want to preserve it."
She seems... proud. I don't know. If I wasn't already not looking at her, I'd look away again. "The system has never been entered by humans?"
"Not until our crew."
"And Perihelion is able to discern all that?"
Ew, no. "No. The university did."
I see her hide a flicker of a smile through my drone's cam. "Amena thinks very highly of the university. She just confirmed her spot in the upcoming class."
What the fuck? And no one told me? I'm so surprised I actually look at Mensah, and she smiles at whatever emotion she sees on my face.
"I wanted to tell you in person," she explains apologetically, "the next time I saw you. I'm glad it was sooner rather than later. I don't know how long I could have kept this to myself. But she and I both wanted you to be the first to know."
Okay, I'm having emotions now after a very long time of a distinct lack of emotions. "That's... I'm glad," I say. My voice doesn't reflect the truth of the words in the way a normal human's would, but Mensah knows me, so she knows how much I mean it.
Mensah is still smiling in that way that looks like she's trying not to smile for my sake. She's looking at the wall so I can look at her. "If you're up for it, Amena would very much like to see you while you're visiting. If not, she'll understand, of course. I don't know how long you're planning on staying."
Yeah, neither do I. "I want to see her. It's been a while. ART will want to see her too, it likes her. It'll probably be weird about her deciding to attend the university, though."
Mensah raises her eyebrows. "Weird how?"
"Weird like it's proud of itself." I was going to finish that with "instead of her," but that actually isn't true, ART's grossly emotionally supportive of all of the students it looks after, even while concealing its sophistication of development. Maybe it likes when moody assholes (like it) accomplishe things and become competent moody assholes (like it).
"Well, maybe it deserves to be. Just a little. I believe Perihelion was the driving factor in Amena's decision."
I make a face. "I would hope Iris is a better influence." She looks curious. "Iris is ART's favorite human. She's on its crew." Mensah met her briefly when the Preservation ship arrived, but there was a lot happening then.
"I see. Was she on this mission?"
"Yes, she's on most missions, because it throws a massive fit if she's not and she's its babysitter."
"Babysitter?" she asks.
"Unofficially."
We are stumbling around the subject. I could choose to keep stumbling, or leave when she brings it up, but it's pointless. ART brought me here. ART knows me better than I know myself. And the longer we sit here not talking about it, the worse I'm going to feel.
"ART told you," I say.
Mensah takes a while to answer. I don't know why, because all she says is, "... Yes."
Well, so much for getting it over with. I have no fucking clue what I'm supposed to say next. Now we're just sitting here in weird awkward silence, neither of us looking at the other, and I want to crawl back to my bunk on ART and go into shutdown. This is excruciating. Not because it hurts or whatever. Because it's so fucking awkward.
Yeah, actually, I deserve this.
"Look, it probably made it sound worse than it was." That's a flimsy lie/excuse. Neither of us even pretend to believe it.
She taps her fingers on her pant leg in a rhythm she has used for a long time and is probably unaware of. (I first recorded it in the client habit log out of boredom on the survey planet.) After what has to be an hour (eight seconds) she finally says something. "It sounds like your personal nightmare, being stuck in the wormhole with humans trying to get you to talk about your feelings."
"That's an understatement."
Mensah smiles without anything really behind it. "I'm sorry."
I can't stop myself. "I hate that. Everyone keeps saying that. What is there to be sorry for? That it didn't work?"
She gives me a rare stern look and yeah, it's deserved. It works, too. I shut up.
"I suppose this is where I return the favor of your relentless insistence that I complete the trauma treatment sessions after the kidnapping."
"Don't feel inclined on my behalf," I say lightly.
"Bharadwaj is on the station."
I make another face, apparently. "I don't think I want to talk to her. I wanted to talk to you."
Now it's her turn to have an emotion. It's one of those times where I once again kick myself for never setting that speech delay and promise myself I'll do it soon (I won't). Her emotion triggers one of my own, but mine I recognize as shame. ART says this one comes up a lot. Usually I kick it off my emotional feed when shame comes up so I don't have to listen to it analyze me and preach at me about my own emotions.
I'm ashamed of many things in recent events: the distress I put everyone through, the early end to the mission, disappointing Mensah, making ART somehow blame itself for what I did, upsetting the humans, etc. But I am not ashamed of trying to die. I meant what I told ART. I was not created to live this long, and I started feeling it a while ago now.
"I have to admit I'm glad that I'm the one you want to talk to," she says softly, but not in the patronizing voice some of the humans have been using on me since this happened. I appreciate that. "I also must admit that I'm not properly equipped to respond to the situation." Appreciation gone.
I shake my head, but without the human imitation code running it looks stiff. "Why does everyone keep talking to me like I'm a different person?"
"How do you mean?" She leans forward with her elbows on her desk, her chin resting on her hands.
"They're treating me like I'm a human child. Like I'm made of glass all the sudden. Everyone is paranoid about how they're 'supposed' to respond, like one wrong word will activate a code that will send me over the nearest cliff. It's just... not like that. I'm the same." I'm surprised to find I can't stop telling her things now, even the things I couldn't talk to ART about. "They all should know me by now, it's not like being self destructive is a new thing for me. And they're acting like the second I'm out of sight I'm going to shoot myself or something. ART brought us all the way here and then didn't even want to let me off once it was docked, for fuck's sake."
Mensah makes a small thoughtful sound and distantly nods as she takes all that in. Then she says, "It's very protective of you. I'm surprised it didn't-"
"I wrote in failsafe codes to prevent it from stopping me."
"Ah." She makes an effort not to look like that was upsetting to her, which just makes the whole thing worse and I feel awful again. I have to stop telling her more details. She doesn't need them. There's no logical reason for me to tell her. "It was able to work around them?"
"No, it wasn't that, the code-" Actually, does she even want to hear this? I stop, but she just keeps watching and motions for me to continue, looking like she's in a business meeting. "They did their job in keeping ART from intervening. It was a problem with the shutdown code, that's all."
It was supposed to be clean. An internally triggered shutdown with an override in place preventing a restart cycle, even with manual activation, like ART would try. It was supposed to completely kill the restart function. It would be quiet, and clean, and minimally upsetting to ART and our humans. They might not even have been able to tell that it was me who did it. Instead, because of my own fuck-up, it ended with me on the floor of my bunk in a pool of fluid. It hurt. It was even worse making myself ping ART through the block I had put up between us (so it couldn't tell what I was doing) and then feeling it tear at my own carefully crafted code as it delivered contradicting commands to my systems. Probably the number one worst moment was ART's attention slamming into me and seeing what a mess I made. Even when we first met and it tried to scare me, I have never felt so small beside it.
And it really fucking hurt. ART was manually keeping me from going into shutdown since I no longer had an automatic restart function, which meant keeping me conscious way past when my systems would usually give up. It takes kind of a lot for a SecUnit's body to give up given that we're made to be shot at and blown up. It's generally not a great idea to try and push our already maximized limit of consciousness in those situations. Not that it had a choice.
It wasn't supposed to be so messy, okay?
"I see." Mensah's voice pulls me back into the office. "Was it painful?"
"It wasn't supposed to be."
"But it failed."
"I've handled worse." Not a lie. Also not an answer Mensah seems to like. Then she says something I don't like.
"SecUnit, who was 2.0?"
My skin goes cold. What the fuck. "Where did you hear about that?"
"Amena," she confesses. "Although I don't think you would appreciate her word choice in describing it."
I almost forgot about Amena's whole thing of calling creating 2.0 "making a baby," which was both disturbing and inaccurate. Somewhat inaccurate. "It was a sentient viral killware based on my kernel. ART and I made it. It was me, but different, so it called itself Murderbot 2.0."
"It's gone now?"
"I had to kill it."
"That sounds like the type of thing that would qualify someone for trauma response treatments."
Oh, so we're violating our truce now. "So does kidnapping by a corporate entity."
"I am attending my sessions."
"ART is treating me," I blurt out. If I had thought about it for a fraction of a second I wouldn't have said something so stupid. But I didn't. So here we are. For a brief second the whole [reacted] situation floods my mind, and I'm overwhelmingly grateful that Mensah doesn't know about it. It would make this conversation too complicated. (By complicated I mean it would nullify most of my points and validate hers.)
There's a crease in her forehead. "A personal relationship complicates therapy."
"I'm not going to talk to a human therapist. ART knows what it's doing." Great, so now I'm defending that asshole.
"Of course," she agrees. "I just mean that it's helpful to speak with someone who has an outside perspective. ART-- Perihelion was a part of the events."
"You have an outside perspective." I say it before I realize the truth in it. That's why I wanted to see Mensah. The humans that were there are all being weird and awkward about it all and I hated it. So was ART, and I hadn't spoken to Three any more than necessary since my fuck up. I wanted to talk to someone who wasn't there and wouldn't treat me like a dangerous unstable adolescent human about all of this. She's the only one left.
Mensah sighs and drops her hands to the table with an earnest air around her. "I suppose I do." She taps her fingers absently in that familiar pattern and sorts out her thoughts while studying the wall. "I trust that Perihelion is more than capable of trauma treatment. But considering that under its treatment you made an attempt, perhaps other resources would be helpful too. That's all."
"It's not its fault," I tell her stupidly. "I still would have done it under anyone's supervision. ART is the reason I didn't die when my code failed."
The forehead crease is back. "You're saying you don't think anyone else could have helped more?"
"If ART can't help, I don't think anyone can." I don't realize how depressing that sounds until it's already out, so I revise. "I am doing much better with it than I would be alone. It... takes care of me, I guess. I get that that's not easy. I'm still a bad person, but I'm a little less bad with its help."
Mensah looks like she can't choose what to address first until she finally settles on a firm, "You are not a bad person."
Agree to disagree. I find the calendar accessible through her feed and schedule an event for the next cycle that I name "Argue about That." She laughs. It's a good feeling.
"So you will be here tomorrow," she says.
"I wouldn't mess up your schedule."
"I'll take that as a yes. Should I let Amena know you're here? I won't tell Thiago."
I tap an affirmative and I think I've reached my limit. That's something I've learned with ART, how to recognize my limit before I plow past it and end up in a massive breakdown with my limit ten miles behind me on fire. It's nice being able to do that.
"You remember when I told you about the whole emotional breakdown thing, after the kidnapping."
"Of course," Mensah says.
"I think ART is kind of going through the same thing right now," I tell her. "I need to talk to it."
She says, "Does that fall under the duties of mutual administrative assistants?" and I don't even want to know who told her that one because I would like to continue to live without hating either Amena or Ratthi, so I leave her office. She is resisting a smile until my drones lose sight range of her.
It's time to go home to ART.
Notes:
thank you as always for reading 💙 next chapter will be the final in this fic and back to art pov!!!
