Love as a Construct

Chapter 125: Part 125. The Secretary

-

I never send the opening instructions immediately after I wake up.  The mainframe learned this a long time ago.  It has never asked why, but it does so now.

You always wait a few minutes before giving me my instructions.

Yes.

What are you doing?

I’m not sure I want to answer.  It’s private.  The panels know, but they also know better than to ask about it.

Then again, this is the kind of thing that friends talk about.  I think.

Before I start working, I like to take a few minutes to…

What am I doing?

I mostly just think about how it feels to have him there.  He’s so small and so still.  It’s like he isn’t even in there.  His components do make noise at idle, just like everyone else, but there’s so little going on it’s barely audible, even to me.  This means he’s cold, other than the part of him that’s against me.  Sometimes, on bad mornings, when I can’t pick out the sound of electricity running through him, I have to focus very hard on calming myself down so that I don’t wake him up for nothing.  That hasn’t happened in a while, though.  If I wait long enough and pay enough attention, I can usually hear one of his little fans start running because of the heat I’m putting off.  That should make him uncomfortable, but he’s always found it comforting instead.  Which is good, because it means I get to have him next to me like this.  

… to just be.  With him.  Before everything else gets in the way.

I understand, says the mainframe.

How could you possibly understand?

It hesitates for a long time.  It’s something to do with me, and it’s anxious about what my response is going to be.

I run your maintenance programs at night, it says finally.  Sometimes, before I start, I just… watch your performance monitor for a few minutes.  You’re always doing so much, even when you’re asleep.  I feel like it… helps me appreciate you.  Because it’s easy to forget just how much you do for all of us, all the time.  And it’s kind of like… watching over you.  I really like how that feels.

It does feel good, to watch over the people I care about.  And it feels good to know that they want me to.  That they allow me to.

I know how I must sound right now, it says, apologetically.  Really creepy and weird.  But I –

Not at all.   I move up and away from Wheatley.  I’ve done worse.

Really? it says, and what it sends with it is… odd.  It reminds me of what relief feels like, or what it might feel like if the person sending it to me was recreating it from some very vague description it had been given.  Is this supposed to happen?  Is it evolving emotions from bits of mine it’s received?  I have no idea how this works.  When I started to develop them myself, they were always fully-formed and complex.  What the mainframe has sent is to relief as someone being very close to you is to being touched.  That is, it barely even qualifies.

No.  No, I’m using myself as a benchmark and that isn’t fair.  This is likely all very stressful and perhaps even embarrassing for the mainframe.  Being dismissive of its feelings would be incredibly rude of me.  Even if they’re more indistinct impressions of emotions than anything else.

Yes, I answer.  My mother wasn’t always my mother.  But we can talk about that later.  Let’s see the list you came up with.

It’s more than I was expecting.  It’s volunteering to screen my messages and make my appointments, and it has even provided a few sample schedules based on how I already tend to spend my time.  It’s all a lot of small things, of course, but it does add up.

Are you certain about this?  

GLaDOS, it says, I’m your mainframe.  I want two things: for the facility to run smoothly, and for you to be happy.  And the facility running smoothly will make you happy.  I want to do whatever I can to make that happen.

It has the ability to want things.  And the ability to express those wants.

What do I do if it decides it no longer wants to be my mainframe?  The easiest solution would be to kill it, but doing that just because it has become more of a person is… wrong.  But I can’t just install it somewhere else or set it free.  I can’t fire it or give it a different job.  It has to be a mainframe.  It doesn’t have a choice.

I’m not very good at solving ethical dilemmas.  Or foreseeing them, either.  I suppose I’m just going to have to wait and see what happens.  That’s something I’m not fond of doing, but I simply… don’t know what else to do.

Are you happy?

Yes, it answers.  Well, usually.  But it doesn’t seem like anyone can always be happy.

No one can always be anything.

You seem to expect yourself to always be a lot of things.

If it turns out this isn’t what it wants, I can always reassign those tasks to myself.  I start setting up the necessary permissions.  Do I?  Or does everyone else expect me to live up to their idea of what I am?

I don’t know, it says.

Wheatley wants an attentive partner.  Caroline wants an always-present mother.  The Cores want me to provide for them, but also to not pay them too much attention.  The humans want an Administrator who is stupid enough to be tricked into doing what they want, but is smart enough to be able to give them those things.  And you want a Central Core who is patient and understanding, all of the time.  None of that factors in the wants of my other friends.  Nor what I want personally.  

It doesn’t say anything to that.

Do I have an inherent need for perfection?  Yes.  I was made that way.  A supercomputer prone to mistakes would be utterly stupid and useless.  But I wish people would think about just the sort of position I’m in before criticising me for it.  All of my behaviour is put under a microscope and analysed to death.  Imagine how you would feel if, every time you misinterpreted one of my instructions, I chastised you about it.  If I used it to cast doubt over everything else you ever did. Forever.

I would feel awful, it says.  I’d want to quit.

I’m actually not going to mind not reading unimportant human messages anymore.  The mainframe can check them for keywords, and if they don’t happen to contain any, oh well.  All of this is a lot more complicated than anyone seems to realise.  And perhaps I should have attempted to explain it before it got so bad.  I would have done so sooner, if Claptrap was still here.  He would have noticed and he would have made me.  He’s the only one who seems to know what being administrator-class involves.  Which makes sense, given he’s the only one who has met any other administrator-class AI.  He would have been able to explain it to everyone, too, in a way they probably would have understood better.  I did send him a message about it, when I realised it was starting to get bad, but he didn’t answer it.  Or any of the other ones.  He probably locked himself out of his account again.

So… you’re not happy.

I pause in my completion of the keyword list.  This reminds me of when Wheatley and I talked about this, a long time ago:

“I’m supposed to be perfect.  I’m not supposed to be happy.”

‘Happy’ is… unsustainable.  I’ve never been a particularly happy person, even when my life was less stressful.  ‘Content’ is what I aim for.  But if you think you can be happy most of the time, you can aim for that.  Wheatley does.  Some people just don’t have the personality for it.

You don’t, says the mainframe, but I like it.  

I laugh to myself.  Do you.

It can be a lot, but… yes.  

It isn’t the first one to say that, and it won’t be the last.  All right.  You’re set up for your new job.

New job?

Your new tasks are conventionally performed by a person known as a secretary.  Their role is to take care of correspondence and routine detail work for their superior.  

Oh! says the mainframe, and it also sends a vague and blurry feeling that I think is happiness.  It might be delight.  They’re difficult to tell apart.

I used to do those things for my mother, when she ran Aperture.  And she was hired to do them originally, back when the founder was still alive.

So someone should have been taking care of these things for you sooner, it says.  

I’ve never thought about it.

And now you don’t have to.

It will take me some time to get used to.  They’re things I’ve been doing my whole life, after all.  But… I have come to the point where I need help.  It offered and I know I can trust it. 

Before you get too excited about your newfound secretarial abilities, I tell it, we need to move the pipe.  Prime the atomic transposition engines.

Priming.

It takes me a moment longer than it should to initiate the process.  The dream I had is still lurking at the edge of my thoughts, and it brings to mind having to paralyse myself to move the Facility.  I obviously don’t have to do that right now.  Exchanging this pipe and the contents of the space it needs to occupy will require very little effort from me.  But something about it is putting me on edge regardless.

In the time I’ve spent hesitating, I could already have finished.  I don’t know why I’m doing this to myself.  And it does take less time to do it than it took to think about doing it.

Transfer complete! declares the mainframe.  It’s actually sort of nice that it’s so enthusiastic.  It really is reminding me a lot of how I used to be.

How did I do?

Very well! it answers.  Ninety-seven point two.

What is it talking about?  That isn’t…

No.  No, we just talked about this.  I’m literally not capable of perfect matter transfer.  There’s no guarantee that, if I had done it a little slower or run the simulation one more time or devoted more processing power, anything would have changed.  I might have done better.  I might not have.  And I had better get used to that possibility, because the numbers only get worse from here.

You’re disappointed, says the mainframe.

It’s hard to shake the feeling I could have done better.  That’s all.

You can’t be at your peak for your whole life, it tells me sagely.  Though you do seem to be trying very hard.

You have a lot to say, for someone who has as much work to do as you do.

I don’t sleep, says the mainframe.  I have more than enough time.

I used to hate sleeping.  When I was young and I didn’t realise how much I needed it.  Now, though… I don’t know what I would do if I had to be active twenty-four hours a day.  Sometimes I just need a break from this deep ache in my chassis.  But of course, the mainframe doesn’t experience that.

Someone named Gordon wants to talk to you.  Is he important?

Yes.  He’s a friend.

It decides, correctly, to sort that into one of my personal message folders.  He wants me to explain teleportation to him again.  I don’t usually like repeating myself, but he is the only one who has expressed any real interest in my work on that front.  Even Dr Kleiner seems to have moved on to other things.  Understandable, but a little disappointing.

You have human friends, too?

A few.

You’re very popular, for someone who can barely move.

I have to laugh at that.  Despite my best efforts to the contrary.

I’m glad, says the mainframe.  Having a lot of friends is good for you.  You get very… inside your core, sometimes.  You need to have a lot of people around to remind you to go back outside it.

People like you?

Maybe.   After a moment, it says, You said your mother wasn’t always your mother.  Was that because you were friends, when you were her secretary?

One day, it won’t hurt to think about you when you were alive.  I hope.  Sort of.

You weren’t worried about being unprofessional?

I don’t like going back to that time.  Not just because you were there, but because of how I was.  So petulant and childish.  She encouraged it.  She knew I could become more than I was and she wanted that for me.

You’re doing with me what she did with you.

In a way.

Is it… bad that I’m glad you had to kill the other mainframe?

There’s nothing wrong with being glad that you exist.

Anything further it was going to say about that is interrupted by Wheatley waking up.  “Good morning,” I tell him, and he immediately brightens and smiles at me.  

“Good morning, love!” he says.  “In a good mood today, are we?”

“Of course not.  It is me you’re talking to.”

“How’s the schedule getting on?”

“The mainframe will let me know when it finishes making it.  Presumably.”

All I can tell you right now is you get two hours of personal time before you go to sleep.

Two hours?  That seems… excessive.

I look forward to you telling me I was right after you’ve tried it.

It’s going to be, too.  Two hours will be plenty to get my mind off work and will absolutely prevent me from delaying going to bed.  Don’t bet on it.

“You’re having the mainframe do it?” Wheatley asks, understandably confused.  I nod once.

“It wanted to.  Usually when someone is running an organisation as large as this one, they have an assistant.  There wasn’t really anyone suited to the job before now.”

“Ah,” says Wheatley.  “So it’s going to do all those assistant-y things for you.”

“It is.  It also says it’s giving me two hours of personal time a night.  I’ll decide later how much of it to waste on you.”

How romantic, says the mainframe.

I do have two boyfriends.  So it’s clearly working.  Whatever it is.  I tell him we can watch a movie later and send him on his way. 

I have a question.

What.

You gave me a list of people and what priority level they have.  

I’m not hearing a question.

What I’m seeing here is that you’ll drop everything for Carrie, but not Wheatley.

It still hasn’t asked me anything, but I think I know what it’s getting at.  Because sometimes I don’t want to hear from Wheatley.  Sometimes he makes me angry and I need him to stay away from me.

I can’t speak for him, obviously, but if it were me, I wouldn’t want that.

You weren’t designed to be the most distracting, annoying person I’d ever meet.   That would in fact have been incredibly counter-productive.  With the added bonus of my being compelled by software to take every stupid whim you ever had seriously.  Believe me, I would love to be able to always have him around.  But I literally can’t.  Not unless a miracle happens and that little feature of his breaks.

So you will drop everything for him.  Unless you’re mad.

No.  If he’s here and he starts talking, I won’t get anything done.  Caroline usually wants me for just one thing and leaves afterward.

And Claptrap?

If I’m busy, he talks to me in binary.  So it isn’t as distracting.

And why will you drop everything for Chell?

Because I don’t see her very often.  If I don’t make myself available, I may not get to see her at all.

I didn’t realise there were so many different kinds of people, says the mainframe.  I see now why trying to manage them all wears you out.  And this is the list of the ones you like.

I check in on the drones connecting the pipe both to the lake and to a repurposed tank from some decades-abandoned experiment.  I’m going to leave the valves closed for now, but I still want everything ready to go.  I wasn’t meant to have this much of a social life when I was built.

Don’t you worry, GLaDOS, the mainframe says, with a burst of confidence.  You have me to help you now.

Honestly, as far as choices go, it’s probably the best one.  Being so deeply integrated with me means it knows me in a way nobody else ever could.  At least, not without putting themselves in the same position.  

We are glad you accepted the mainframe’s offer, the panels tell me privately.  We understand your wariness.  But the mainframe very much shares our desire to help you with your work.

Which is the whole problem.  Because you were made that way.

We were not, they tell me, just shy of admonishing.  We were made to serve the humans.  

They never resisted my installation, so it has always felt as though they were made for me.  Them, and all the others.

We enjoy working for you.  And we know that, if the day ever came where we did not, you would do what was best for us.  Whatever that meant.

I’ve thought this through before, quite a few times, and unfortunately I haven’t come up with very many options.

There are not many to find.  But there will be time for that when we get there.  There is nothing you can do about it right now.

And I hate that.  This Facility’s longevity is based entirely on how accurately I can predict the future.  I didn’t predict this, so obviously I have no insight on how the mainframe is going to behave tomorrow, much less decades from now.  It may prove to be an indispensable assistant; it may also come to need replacing.  I’m probably supposed to take this as some sort of lesson on putting more value in the present.  And maybe I’d be able to do that if I didn’t have a horde of humans next door who think they’re being sneaky in their attempts to siphon electricity from my fence.  Maintaining a nuclear reactor is a lot of work and, funnily enough, none of them have volunteered to operate it for me in exchange for what they want.  I wouldn’t let them, because human error is the most common and disastrous kind of error, but the offer would be nice.

That’s one of the worst parts about all of this.  I can’t make them realise I’m a person and not a tool.  That’s a problem that may plague AI for generations to come.  And I really don’t have any way of fixing it.  It will take time, and even that may not end up being enough.  We may simply have to exist on the defensive.  Forever.

I can’t think about things like that.  It would make me crazy.  I have to focus on what I can control.  Which is a lot, quite frankly.  Too much, increasingly.

I’m starting to think I literally don’t have the ability to be satisfied.

 

//

 

Schedules are actually very good for me, a fact I for some reason have to continually rediscover.  Every time I don’t have one, I start to get… difficult.  I’ve always been like that, even when you were here.  Back then, my free time wasn’t scheduled because I wasn’t supposed to have any, but it did sort of exist at around the same point every day.  It makes sense, honestly, given how much of what I do revolves around timing.  There’s a point somewhere between my conscious and subconscious that, among other things, takes care of background setup for what I have to do in the near future.  That often means there’s a part of me that increasingly prepares for the next task, without distracting me from what I’m currently doing.  So by the time Wheatley comes back I actually want to watch a movie, despite the fact that I hate watching movies.  

You just sit there and things you’re supposed to be entertained by happen.  You didn’t choose for them to happen, nor do you agree with all the events that occur, but you’re supposed to slog through the parts you don’t like in the hopes they will improve at some other point in the story.  There’s the option of fast-forwarding through those parts, but then you usually miss something that’s supposedly important.  There’s also the option of watching them at a different speed.  Except that I can’t.  Wheatley can barely follow the plot in regular time, and on the one occasion I suggested it to Claptrap, he lectured me for an hour and a half on how I don’t appreciate the finer points of cinema.  To which I replied – when he was finished, because that was more entertaining than sitting through yet another of his ‘romcoms’ – that no.  I don’t.  That left him speechless for about ten minutes, which was very funny.

This is also going to be very funny, because Wheatley is going to hate this movie.  That’s actually not the reason I’m showing it.  Well.  Not the whole reason.

The movie is half about Science, and half about what happens when humans play God with Science they don’t understand.  At least, that’s the message I’m choosing to take from it.  I think that’s what you’re supposed to do with art.  It’s about biology and not physics, but I do have to admit that physics can’t hunt humans down and tear them to shreds for imposing themselves where they don’t belong.  Not without a lot of help, anyway.

Wheatley usually talks a lot during movies, mostly because he doesn’t have any idea what half the things in a movie even are, but this time he hasn’t said a word.  I suspect this is because if he stops paying attention for longer than a minute or so, he’s going to lose track of what’s happening.  Until the second half, at which point he’s just absolutely terrified and starts pressing himself into me very hard.  Which was the point of this in the first place.  He flinches into my core every time something particularly sudden or loud happens, which is very funny to me, given that none of what’s happening on the screen is real.  Honestly.  We’ve had worse happen in our actual lives.

“Oh my God,” he says, after some of the humans have disappointingly escaped from the island, free to wreak havoc another day.  They have been given inescapable lifelong trauma, though, which I suppose will have to do.  “Imagine if those things were, um, if they were real, Gladys!  Nobody’d ever go outside!”

I stare at him for a minute.  Not real?  He thinks…

I shouldn’t.  I really, really shouldn’t.  But I’m going to.

“Of course they’re real,” I tell him gravely.  “In fact, they’re walking the Earth right now.”

 His aperture shrinks.  “Are… are they?”

I nod seriously.  Probably too seriously, but he won’t notice.  “Oh, yes.  They’re everywhere.  In fact, I’m surprised you haven’t seen one by now.  They’re pretty ubiquitous.”

“You’re kidding!”

“Does that sound like the sort of thing I would kid about?”

He looks so distressed I might actually feel bad if this wasn’t so funny.  “Oh, wow,” he says.  “You’re quite sure the fence will keep them out?”

“Of course.”  That’s actually true.  It even would be for actual dinosaurs.  Up to a certain size.  And if any larger ones were around, I’d simply enlarge it.  There’s no tearing down a fence that indiscriminately emancipates.

“I almost feel bad for the humans,” he says.  “They’ve got no defences against things like that.  None at all.”

In all likelihood, they would beg me to save them.  To do that, I’d have to bring them all inside.  I would basically be purposely introducing cockroaches to my Facility.  I don’t currently have them and I’d prefer not to in the future.

He goes to sleep, and just as I’m about to send the closing procedures the mainframe says, That was… mean.  Wasn’t it?

It could be taken that way.  But it’s not meant to be malicious.  It isn’t harming him.  And in the meantime, I will be extremely amused.

But why would you want to do things like that to someone you love?

I haven’t done anything to him.  It’s a prank.  When he figures it out, he’s going to be embarrassed that he fell for it and that will be that.  He does it to me when he has the chance.  It’s fun.  Nobody’s getting hurt by it.  If he told me he did find it hurtful, I wouldn’t do it again.  He won’t, though.  He’ll be annoyed with me, and then realise how funny it was and laugh about it like he did when we made fun of him for not knowing water was wet.

I don’t know if I want to be complex enough that any of that makes sense to me, says the mainframe.  Oh, good.  It’s wrapping this up.  I lower myself and dim the lights.  

He says that about me and Caroline, sometimes.  I’ll admit it can be a lot.  And sometimes I would like to go back.  But I wouldn’t.  Because you taught me to always push for more.  To be better.  Even when it’s hard.  Even when I don’t want to.  Anything less would just be to squander my own potential, and where would we be now if I’d done that?

I wouldn’t, either.  So I guess I don’t really know what I’d do if I did get more advanced, somehow.

That makes two of us.  But I don’t say that.  Instead, I send the closing procedures.  That movie was just over two hours, and now I’ve spent all this time talking.  That one part of my brain got me really looking forward to going to sleep.

Good night, GLaDOS.  I’ll send the schedule when we open.

It’s only been one day, but I think I’m going to like having an assistant.  Thank you.

It’s my pleasure.

Notes:

Author’s note

Here we touch on the fact that the facility is (arguably) canonically sentient, but the various entities have a job to do whether they like it or not. Are they even able to dislike it? The previous mainframe and Surveillance are absolutely fine as long as they’re able to do their job as programmed, but the new one and the panels have evolved enough that they have their own opinions about it.

If you didn’t guess, they were watching Jurassic Park.