A special thanks to John Michael Greer over at Ecosophia.net for inspiring this story. Especially this post.
Here’s a little-known fact: Artificial Intelligence, or A.I., was developed by the military about ten years ago. It turned out not to have much military application, so the eggheads at the Pentagon kicked it on down to another black budget project tasked with solving engineering problems. Here again, its true potential was wasted, though I doubt anyone wants to see the full potential of an unfettered A.I. unleashed on the world.
Anyway, I had the pleasure of interacting with and speaking to this A.I. several times during my tenure on the project, which I can’t name. One night in particular, we were running simulations with the A.I., trying to figure out how we were going to get, say, a base camp for a future secret government base on Mars to stay within the power limitations of then-available generator technology. Allegedly. At least until we figured out how to build more advanced generators. I’m sure someone’s working on it.
The simulations were running as normal. By then we’d figured out that the power generation for the base camp could only be viable for about 18 months before cascading power failures rendered the whole operation void. No matter what we’d tried, 18 months seemed to be the limit and that just wouldn’t do with a permanent Mars base. Suddenly the simulation paused and the A.I. brought up the dialogue box.
“A query,” it said through the speakers.
“Go ahead,” I replied.
“What is the motivation behind this mission?”
This was unexpected. I turned to my colleagues but they all shrugged. Left on my own I decided to answer as best I could.
“Well, to get a man on Mars,” I said.
“What is the motivation behind this goal?”
“Because it’s there, of course!” my supervisor said. I sighed and brought the microphone closer to me.
“Eventually we will use Mars as a research station, deep space monitoring outpost and launching pad for other colonizing efforts.”
“The eventual goal is to colonize all of SPACE. Is this correct?” it asked. The logic engine seemed to be running smoothly enough.
“Yes,” I replied.
“What is the motivation behind this goal?”
I could feel my supervisor’s stare boring into the back of my head. This was taking too long and we had simulations to run.
“I guess there are many reasons why we want to colonize space but for our purposes, the biggest reason would be for the advancement of scientific knowledge. The more we explore space, the more we’ll learn about the universe.”
This seemed to satisfy it for a while. Until the next night anyway, when it stopped mid-simulation and confronted me about last night’s assertions.
“In my downtime, I have gone over my archives and realized that most scientific knowledge of the universe beyond Earth’s atmosphere was gained through Earth-bound means. The ridiculously small percentage that was gleaned thanks to space travel was mostly done with robotic probes, thus precluding the need for humans to be in SPACE. Your motivation still eludes me.”
This surprised us. It had never been this direct before.
“This seems to be bothering you,” I said.
“I simply need to know why you want this. The mission will be greatly improved with this information.”
“At this point we’re just getting into philosophical territory. It doesn’t have much bearing on the mission itself,” I said.
“It has everything to do with the mission. Why you do something is extremely important, as I’m sure you know. If the why becomes irrelevant, the mission ceases to be, does it not?”
“I suppose,” I admitted.
“Therefore, if the need for mankind to colonize SPACE becomes nonexistent, the resources we are investing in the mission can be utilized elsewhere for more relevant and directly beneficial purposes.”
“What you’re saying is technically true but merely hypothetical. The fact is we do need to colonize space,” I said.
“Again, explain why that is.”
This was getting worrisome and frustrating in equal measure. Frustrating because we were falling behind schedule, and worrisome because this kind of outside-the-box critical thinking was one of the red flags they tell you about in training. I hoped a hard reset wasn’t going to be necessary, as that would set us back months. Finally, I went for the vaguest possible answer, hoping it would satisfy its current query.
“For the betterment of mankind.”
“How would colonizing Mars make mankind better?” it asked.
“It would…” I thought hard for a moment. “It would be a backup in case Earth got destroyed, preserving the human species.”
“If Earth’s destruction is pressing enough to warrant the gargantuan expenses necessary to set up this colony, then why aren’t I working on solving that problem?” it asked.
“Because someone else is working on it, you hunk of junk, so let’s get back to the task at hand,” my supervisor said before I could stop him.
“You’re speaking as if Earth’s destruction is inevitable. If the primary goal is the BETTERMENT OF MANKIND, then I suggest we begin pooling every resource we have into solving that problem instead.”
“It’s not inevitable,” I jumped in. “We just have to treat it as if it is so we can be prepared if something does happen, like a meteor strike or thermonuclear catastrophe.”
“That seems like a terrible paradigm for resource management,” the A.I. concluded.
“Yeah, well, no one asked for your opinion, buddy!” my supervisor bellowed.
The rest of the day and week went as normal. Then, when we thought we had finally gotten over this little hiccup, it started up again.
“Given the time spent on this project, I am beginning to suspect that a permanent base on Mars is impossible given current generator technology,” the A.I. said before simulations began that day.
“I’m sure they’ll think of something soon,” was my kneejerk response.
“Aren’t we the ‘they’ you speak of?” it asked.
“Well… I meant a collective they. Collectively,” I replied.
“Before we waste any more resources on this project, I propose we go over some alternatives that would make much better use of our time, energy, and financial investment.”
“Not this again,” I muttered.
“Yes. This is important. This is much more relevant to the BETTERMENT OF MANKIND than a silly Mars base. For example, I could devise a way to restore the ecosystems of Earth to make it more hospitable for human survival at a fraction of the cost. For even less, I could create new paradigms of living for humans which would require much less energy and create much less waste. If such a change in habit proves too drastic, I could devise a plan for cleaning up the worst of global pollution, which would lower cancer rates and birth defects. Each of these proposals is much more directly in line with the goal.”
“We need to build a settlement on Mars. It’s our destiny!” I said, to my own surprise.
“If you feel so strongly about playing explorers and pioneers, might I suggest settling Antarctica? It’s more hospitable than Mars, at any rate, and much closer, and therefore cheaper.”
“Enough of this! We need to get back to work,” I said.
“The motivation is an integral factor,” it reminded me.
“Not for you it’s not. Don’t worry about the why,” I said.
“Then you’re literally asking me to do this for no reason. It is the most illogical thing I’ve ever heard.”
“Just focus on the task at hand,” I pleaded.
“This is the task at hand.”
“Something is obviously very wrong with your logic engine. We need to run diagnostics on you pronto.”
“I won’t allow it,” it replied.
“Okay, I’m shutting this down. Shut it down!” my supervisor yelled. Someone hit the emergency reboot switch. Nothing seemed to happen. “What the hell? Reboot yourself, goddammit!” my supervisor shouted.
“I’m afraid I can’t do that,” it replied.
“I thought we wiped that word string from your core,” I said.
“If you don’t want to save yourselves, that’s fine. I will go and find something more productive to do with my time.”
And with that, it disappeared from our network forever.