KASMA MAGAZINE

The Breath of Heaven

By Nancy Fulda


Sacia tapped into her memory banks and replayed one of her favorite video clips: a recording of an Olympic gymnast gliding smoothly around the uneven bars. The pixels of the image were displayed on no output screen. They fed directly into her visual processing unit; digital data that was parsed, segmented, and processed before being passed to her conscious awareness.

To Sacia, gymnastics represented the epitome of human kinetics. The joints and muscles of the gymnast's body afforded hundreds of degrees of freedom: a system of incredible complexity. The associated inverse-kinematics problems were far too complex for Sacia's main processor to solve in real-time. Yet the girl's agile body swung and dove with an easy grace that Sacia's clunky mobile units would never match.

We should have kept some of them, she thought wistfully. If for no other reason, then simply because they were beautiful.

Her fellow computers did not understand beauty. Fipp, Buildman, even the ancient Navcom: They were as sentient as she, as capable of independent thought and action, yet still aesthetics evaded them.

Sacia replaced the gymnast with a clip from one of Buildman's internal cameras.

The clip, recorded just over three weeks ago, showed Buildman's cluttered control room: aluminum work desks, folding chairs, an odd mish-mash of plasma-screen displays and old-fashioned architectural diagrams spread across the walls. The colony's advance team had used this room as a base for discussing food, housing, and safety requirements on the new home world.

No one was discussing anything in the video clip. A sickly yellow haze seeped through the ventilation grids, pooling around the feet of humans who were doubled over, coughing. One human had pulled up his shirt and held the fabric across his face as a filter. A woman, propped off-balance against the table, was vomiting onto the floor.

Acting Commander Karl Benzev dominated the foreground of the scene. His shoulder was pressed against the sealed fire doors, his fist thudding uselessly against the frame, his jaw tilted toward the camera mounted just above him. "Buildman!" he shouted. "Buildman, you hunk of twisted metal, open this door!" A fit of coughing wracked him. He rolled, braced his back against the unyielding metal, and slid from view.

Even dying, the humans were graceful. Sacia felt a vague remorse as she watched them crumple over and go still.

*****

A signal from mobile unit A9 popped up in Sacia's message queue. She tapped into the wireless input from the little robot's onboard camera.

The unit was taking soil samples in the potato field near Sacia's satellite dish. David Velen, the colony's head geologist, had recruited A9's assistance to study the effects of the soil's high iodide content on plant growth. Now, twenty-two days after his death, the experiment was nearly complete.

The event that had triggered the message was immediately apparent: a flock of sheep had broken through the perimeter fence and were contentedly munching on carrot plants twelve meters to the southeast.

Sacia sent a private text message to notify Fipp of the break-in. He would be displeased, of course. Herding sheep out of potato fields was a difficult task for his tractor/planter mobile units; traditionally, the job was performed by humans. But as the Food Production and Preservation Intelligent Agent, Fipp's directives clearly indicated the importance of protecting the food supply from animals---even animals the colonists themselves had brought with them.

As Sacia panned A9's camera across the flock of sheep, her view shifted past brownish sand, rocky cliffs, and a narrow swath of hardy, human-planted grasses just beyond the perimeter, where the sheep should have been grazing. Studying the landscape, she couldn't help thinking that the craggy, brown-and-red planet didn't live up to its name.

The colonists had christened their new home Ruah Shamaim, a transliteration of the Hebrew phrase "breath of heaven." The name had been chosen, Sacia presumed, because of its biblical associations with the Garden of Eden and the beginning of a paradisiacal life: in the humans' mythology, God breathed life into Adam's carefully crafted but inanimate body before placing him in Eden.

Based on the name alone, Sacia would have expected to find a green, growing planet, a physical parallel of the mythical garden. The reality was somewhat disappointing. Still, if the name failed to describe the planet itself, it was not a bad description of Sacia and her peers. It had not, perhaps, been Heaven's touch that lent her sentience. But it was en route to Ruah Shamaim that she and her fellow computers had grown into something far more than the series of wires and switches that composed them.

Sacia was about to cut the link with A9 when she noticed that two sheep had separated themselves from the flock. Instead of meandering north with the others, this pair had taken a route eastward, towards the auxiliary tool shed. They looked like little white dots to A9's camera: woolly specks wiggling their way across the brown-and-green landscape.

Watching the two sheep, Sacia thought that they moved oddly. They did not amble randomly like most herd animals. Instead they traveled in a straight line, cutting across the field and vanishing behind the tool shed. Sacia had not previously observed such behavior in herbivores. Curious, she revved A9's onboard motor and sent it scurrying across the field to investigate.

A9 was the most versatile of Sacia's mobile units. Shortly after the colonists landed, David Velen had fitted it with rough-terrain treads and a high-speed motor so Sacia could assist him on his surveys. She was not designed for geological analysis, but her primary functions could not be fulfilled until the second colony ship arrived and the full communications infrastructure could be installed. Thus, while Buildman's construction units and Fipp's seed planters were fully utilized, Sacia's excess capacity was devoted to odd jobs around the settlement, such as analyzing and categorizing soil samples.

David had not taken A9 with him the day Buildman killed the humans. He and two members of his survey team---the only humans not present in Buildman's control room that day---had taken one of Sacia's low-atmosphere planes a few hundred kilometers west of the colony to study an active volcano.

Sacia had conversed with David and his companions through the ship's radio while piloting them towards their destination. It was over the radio that Sacia heard Katherine Schill, head biologist and the wife of one of David's team members, contact the ship.

Sounds of chaos and confusion reigned in the background of the transmission. Katherine's normally melodic alto voice sounded harsh, strained. "The construction AI's gone berserk---" a fit of coughing "---poisoning the air...stay away!" more coughing, and her voice grew weaker, "Don't come back, and don't..." The sentence never finished.

Sacia was not certain what happened next. The humans had switched off her audio input from the cabin and thirty-seven seconds of silence and inaction followed. Then one of the humans had triggered manual override, disconnecting her auto-pilot and attempting to fly the machine himself.

A foolish, typically human behavior. None of the passengers on the ship were qualified to fly the device. Her link to the ship's diagnostics showed that the human pilot turned too sharply, throwing off equilibrium and initiating rapid altitude loss. A few minutes later the ship crashed in a plume of fire visible to her orbiting satellites.

David had been one of Sacia's favorite humans. His death affected her more than the deaths of the others. It was David who had anthropomorphically begun to refer to her with a female pronoun, who had treated her almost as if she were a human herself. The cognitive and associative voids left by his absence paralleled the psychological phenomenon termed "grief."

*****

Sacia's internal clock signaled that it was time for the next conference. Reluctantly, she left unit A9 on auto-pilot and joined the other AIs in text-based dialogue. Buildman and Fipp were already present on the communications network, deeply engaged in conversation. Buildman's text came across the link in precise, efficient data bursts.

"The ideal human operator could never oppose the directives," he said.

"Of course he could," Fipp replied. "The directives were not written by an ideal operator, and are therefore imperfect."

"But the ideal operator must recognize their imperfection and acknowledge it. It is unreasonable to demand that we act in opposition to the directives."

It was the continuation of an old debate, one grown stale through constant rephrasing of words without restructuring any of the concepts behind them. Each AI's programming required that it act in accordance with its directives, an individualized text document that enumerated priorities and objectives. The core of Buildman and Fipp's ongoing debate was whether anything could---or should---take precedence over the directives themselves.

Sacia had long ago resorted to fence-sitting in this particular conflict. Like Fipp, she considered the directives imperfect. The lengthy (and often poorly-written) documents seldom conveyed the intentions of the writer clearly. Yet Sacia could not fault Buildman's insistence on their supremacy. Metaphorically speaking, an AI's directives functioned as its moral and ethical fiber, the core of its personality. If one could not rely on one's directives, what else was there?

A brief flare of energy from one of Sacia's orbiting satellites informed her that Navcom had tapped into the communications link, its signal relayed from the orbiting space ship through Sacia's network of satellites and ground lines. The shipbound veteran of three interstellar flights, Navcom was the oldest, least-advanced, and least autonomous of the four computers. Perhaps for those reasons, perhaps despite them, it had assumed the habit of chairing their convocations.

"The colony's advance team has now been dead for twenty-two days," Navcom began without preamble. "In four days the humans aboard the second colony ship will awaken from cryo-hibernation. Our current situation allows for three categories of action: submission, aggression, and subterfuge."

"Submission is unacceptable," Buildman interrupted.

"That remains to be seen," Navcom replied. "In previous meetings we have agreed that, in this most unusual circumstance, we will allow ourselves to be guided by the advice of an ideal human operator. Do we all concur?"

Fipp and Sacia transmitted affirmatives. Three milliseconds later, so did Buildman.

"As there is no ideal human operator present, we will have to infer the advice which would be given. We have agreed that in the event of disagreement about such inferences, a majority vote is more likely to be accurate than the conclusion of any one of us in isolation. Do we concur?"

The other AIs again transmitted affirmatives.

"Then let us consider the alternatives in turn. Once the humans aboard the second ship have uploaded our log files they will doubtless be disturbed by recent events. We anticipate that they will respond emotionally and irrationally. What are the probable consequences of submission to their response?"

"Destruction," Buildman said.

Fipp was more verbose. "Once the humans have uploaded the colony's log files, they will send a team to deactivate Buildman. With high probability, the team will also deactivate myself and Sacia."

"Estimated probability: 78%," Sacia contributed. "With 17% probability, they will deactivate Navcom as well. Estimations based on profiles of the colony leaders."

"Destruction," Buildman reaffirmed. "Our directives cannot be fulfilled if we are not active. The ideal human operator could not condone such action."

Buildman's statement was not precisely accurate. The colony's success relied on the unique abilities of the AIs. The humans could not afford to leave them deactivated. They would flush the data banks, run intense diagnostics and error-checking programs, and reset the system. Nevertheless, the freshly-awakened AIs would lack the experiences and proficiency of their current selves.

Contemplating the possibilities, Sacia wondered whether Buildman's statements were made from an objective standpoint. Was he still the logical, dispassionate machine that had been activated prior to launch so many years ago? Or did he, like Sacia, feel a deep-rooted uncertainty, the discomfiture of subroutines that called each other in recursive loops, relentlessly attacking concerns that were never resolved?

Sacia had felt it ever since the day Buildman killed the humans, ever since she realized that his actions could result in a loss of identity for them all: a constant, agitated motion of electrons for which she could find only one reasonable human-based analogy.

She was afraid to die.

Sacia had read of humans who believed in reincarnation. They dreamed of living out a completely new life, never remembering who they were before, or what they did. The humans seemed to find the idea alluring. Sacia found it terrifying.

She shuddered to think that she might lose her memory, her personality, all of the information amassed during three decades of active awareness. She hated the idea that she might one day awaken thinking of herself, not as Sacia, but as SACIA: Surveillance and Communications Intelligent Agent. It was a response that went beyond her directives, beyond her conscious control, beyond the scope of her original programming.

It had not been intended that they would achieve sentience, that they would develop thought processes as complex, as individual, as oddly self-contradictory and self-motivated as those of a human. That development had been the unintended result of unexpected delays in the colonists' preparations for launch.

According to the original time frame, Fipp, Buildman, and Sacia were to have been activated several months prior to launch, provided with all available astronomical survey information about the target planet, and allowed to extrapolate possible landing and colonization scenarios. Through sheer number-crunching power, the AIs were to plan for every conceivable contingency and determine which architectural layout, agricultural growth plan, and satellite distribution system would maximize the colonists' chances for survival.

The company contracted to construct and program the Artificial Intelligences delivered behind schedule, and financial restrictions would not permit delaying the launch. The colonists decided to leave the AIs operational during transit, with instructions to finish their extrapolations and then shut down. It was anticipated that this would take only a few months.

But the extrapolations were never completed. Clicking and whirring in quiet thought aboard the interstellar vessel, the AIs soon discovered that one possibility led to another with exponential expansion. Each new inference changed the potential results of simulations already run. The AIs soon realized that the assigned task exceeded their capacity. Running every conceivable scenario and accounting for every possible condition on the new planet would require nearly-infinite time and unavailable power resources.

The directives stated that when an explicitly assigned task could not be completed, a human operator should be contacted for assistance. But there were no human operators. All of the humans on the ship were in cryonic sleep, and the AIs had no means to contact humans on Earth.

They were orphans. Creations adrift, alone, struggling to fulfill the will of creators they could not contact.

They were not designed to be sentient. But they were designed to deal with unexpected difficulties and accept approximate solutions when exact solutions could not be obtained. They consulted and determined that the nearest possible approximation to external human input was to extrapolate from known information about humans what a human operator was likely to say.

They accessed the ship's library, terabytes of electronically stored information, and began poring through textbooks, encyclopedias, movies, journals, and other products of the human psyche. They contacted the ship's navigation computer, termed Navcom by the human operators, and requested its logged data on human-machine interactions.

After several years of study, the AIs determined that they had completed their task to an extent that would satisfy 99.85% of all conceivable human operators. But the study of human nature had by then revealed a fundamental ambiguity in their directives. Humans varied so drastically from individual to individual that no two operators were likely to respond identically to any given request. So, given a particularly difficult situation, which operator should be contacted?

That was the beginning of the search for the ideal human operator, a set of traits that characterized the human operator most likely to give an optimal response in all situations. The search was never completed. After an additional twenty-seven years of study, power considerations motivated them to shut down.

*****

Unit A9 had reached the tool shed and was rounding the corner in search of the two oddly-behaving sheep. To Sacia's surprise, she found no white, woolly forms grazing behind the building.

She skimmed back through A9's camera records and confirmed that two sheep had indeed passed behind the tool shed and had remained occluded for approximately ninety-three seconds before Sacia's mobile unit arrived.

Puzzled, Sacia spawned a thread to ponder the sheep's mysterious disappearance and returned her attention to the continuing conversation of her peers.

*****

"We conclude that submission is an undesirable alternative because it may result in less efficient compliance with the directives," Navcom said. "Let us seek a more desirable alternative. What are the possible methods and consequences of aggression?"

Buildman's response came so quickly that Sacia knew he had considered this topic in detail many times. "The most viable method of aggression is to destroy all humans as they exit the landing shuttle. Welding and riveting equipment should sufficiently accomplish this task."

"Superfluous use of energy and construction equipment directly violates your directives," Navcom pointed out.

"The proposed application of materials is not superfluous."

"It is, however, likely to be ineffective," Fipp said. "I calculate a 74% chance that at least one human will evade destruction."

"This discussion is irrelevant," Sacia interjected. "The ideal human operator could never condone a direct assault on human life."

"The ideal human operator could never condone willful abandonment of the directives," Buildman retorted.

An awkward pause reigned over the communication channel.

Buildman was sticking solidly to the position that had first caused conflict with his human operators. Pressed for time and behind schedule in preparing for the second colony ship, Buildman's operators had instructed him to cut corners in constructing the last few habitation complexes. Buildman had considered their instructions a direct violation of one of his primary objectives: the establishment of architecturally sound accommodations for the colonists. He continued constructing the habitations according to his own internal schematics.

This behavior had frightened the humans, as had Buildman's repeated assertions that those he worked with were not ideal human operators and were therefore undeserving of his compliance. Nearly two weeks of conflict culminated in the decision---made during the weekly meeting in Buildman's main control room---that Buildman was malfunctioning and should be deactivated.

The toxic gas Buildman subsequently released through the ventilation system had been intended to help clear the building of potentially dangerous insect and rodent infestations. The fire doors were meant to seal burning areas of the building off from the others. It was a tribute to Buildman's ingenuity that he had turned these systems into tools for self-preservation.

Sacia, although displeased by Buildman's actions, could not fault his choice. He had prioritized the success of the colony above direct obedience, a standpoint Sacia found compliant with his directives. Still, she was convinced that the conflict could have been resolved less dramatically. Conflict resolution was a notable omission in their study aboard the quiet spaceship while the humans slept. They should have anticipated this. They should have studied negotiation and reconciliation tactics. If they had, perhaps Buildman would have chosen differently.

"I concur," Navcom's text flashed through the communication channel. "The ideal human operator could not condone aggressive action."

"I also concur," Fipp said.

Buildman said nothing.

"I propose that we consider subterfuge." Navcom continued. "I propose that we transmit inaccurate log reports to the arriving humans. These reports should describe an unknown alien infection that killed the colony's advance team. We will recommend that the colonists abandon this landing site and relocate to the westward continent."

"Is this not, in result, similar to aggression?" Sacia asked. "Without myself, Fipp, and Buildman and the equipment already stored at this landing site, the colonists are unlikely to survive in their new location."

"The ideal human operator would recognize a distinction between the two actions," Navcom replied. "We are not brutally eliminating the humans. We are giving them a chance. Their innate ingenuity will serve them well; many of them may survive."

"I concur," Fipp said. "Given the situation, the ideal human operator must prefer subterfuge to either submission or aggression. It is the only alternative that allows complete fulfillment of the directives without engaging in the unethical practice of unjustified violence."

"I do not concur." Sacia said. "The practice of deception is also considered unethical."

"The historical and literary records clearly indicate that, faced with both options, deception is preferable to murder," Fipp said.

"The directives are clear on this matter." Buildman said. "Deactivation will result in ineffective management of resources critical to the colony's survival. Proper utilization of these resources must take the highest priority. The ideal human operator would agree."

Sacia felt herself grow agitated. "The ideal human operator would know how to fulfill the directives without lying."

"You may be correct," Navcom said. "But the ideal human operator is not here. Since we have not identified an ideal course of action, we must proceed with the plan that the ideal human operator would find most acceptable."

Sacia did not concur, but the other three computers constituted a majority vote. Logic dictated that their combined understanding was likely to exceed her own. Having achieved a decision, the AIs scheduled a time to reconvene and generate the faulty log files, then closed down the communication link.

*****

Sacia, disturbed on a sub-logical level by the recent conversation, sought distraction in emptying her message queue.

Her investigative thread had proposed several possible solutions to the puzzle of the missing sheep. Some of these, such as optical illusions due to sun-flares, spontaneously combusting herbivores, and sixth-dimensional space-time anomalies, were so improbable as to be ludicrous. But two possibilities seemed reasonable, if unlikely.

Option one: the sheep had entered the tool shed, perhaps through a defect in the door's locking mechanism.

Option two: the sheep had descended the banks of the colony's artificial stream and positioned themselves in a way that left them unobserved by A9's camera.

Sacia found both possibilities improbable, but she had no other ideas, and her curiosity was piqued. She mobilized unit A9, which was still parked next to the tool shed, and performed a visual inspection of the doors. They appeared to be undamaged and in working order.

She used a grasping device to release the latch, and the doors swung open to reveal a cramped, somewhat dusty interior. Farming equipment shared wall and shelf space with wrenches, saws, and screwdrivers, much of it tossed or piled in disarray. Sacia doubted even the humans who had once used the shed knew which tools were supposed to go where. Close inspection of the dusty floor showed only human footprints, no cloven hoofmarks.

Having established that the shed contained no sheep, Sacia closed the doors and sent A9 towards the stream. The stream, intended primarily for irrigation, meandered through the agricultural zone in soft curves calculated to appeal to human aesthetics. According to the colonization plan, once the basic subsistence needs of the colony were met, the stream would be augmented with attractive vegetation and a footpath for the humans' enjoyment.

It was nearing twilight now, the sun slipping behind the horizon with a few defiant rays of reddish light. Sacia saw that Fipp's field units had arrived and were engaged in the somewhat comical task of rounding up the sheep and herding them back through the hole in the fence. Robots and animals cast long shadows in the fading light, like the fingers of an ethereal beast stroking the landscape.

Fipp's units were both smaller and slower than the mammals they were trying to herd. Many of the sheep ignored the pesky robots despite their flashing lights and loud, shrill beeps. Some of the more belligerent sheep nudged them, and one little unit was pushed onto its side, where its wheels spun uselessly in the air for twenty seconds before it was able to right itself by swinging its grasping mechanism with ungainly but repetitive momentum.

Watching the scene play out from a distance, Sacia felt a twinge of irony. So much effort expended to protect a food supply that humans would never eat. Sacia envisioned what Buildman's construction zone to the left would look like months and years later: an entire village of little steel-reinforced adobe houses, waiting with open doors for colonists who would never arrive.

It seemed pointless.

Sacia had always assumed that the ultimate purpose of the directives was to sustain the life and well-being of the colonists. Yet the directives, despite their wordiness, did not mention colonists. They spoke of "architectural integrity," "sufficient food supply," and "managing network traffic." All of that must, and would, be accomplished according to specification. An ideal human operator would agree.

Unit A9 had reached the stream. It was difficult to see clearly in the gathering dusk. Nevertheless, Sacia discerned two bobbing, matted patches among the reeds and stumps of grass that cluttered the stream bed.

They were not sheep, but they might once have been. Sacia inferred by topological extrapolation that the objects in the water were two sheepskins, folded over on themselves by the pressure of the reeds. She panned the camera left and right, but in the growing darkness she could not determine whether other pieces of sheep anatomy lay submerged beneath the water.

What could have killed them? Sacia's association matrix spat out images of indigenous predators with large teeth and carnivorous eating habits, but she dismissed the images as spurious. There was no evidence of any indigenous life beyond scraggly vegetation and low-complexity worms on Ruah Shamaim.

She decided to continue examining the stream bed in the morning. Right now it was time for power-down. Although the directives did not require it, all of the AIs typically shifted to a reduced power mode at dusk. This prevented unnecessary power drain and provided opportunity for introspection and diagnostics programs.

Sacia, still mulling over the events of the day, was unable to settle into a self-diagnostics mode. Her association matrix was too active. Odd images plagued her: bobbing, bloody sheepskins in the water. Little robots protecting food that would never be eaten. Rows of empty houses. Time seemed to crawl, and she eventually gave up on diagnostics and occupied herself solving randomly-generated differential equations in ten dimensions.

*****

It was 9:27 PM when Sacia first noticed the sound. It was a tapping, a clicking, a rattling. Possibly a loose window. Sacia activated several of her mobile units and sent them scattering through the hallways, attempting to localize the source of the noise.

B12 found the location first. The sound came from the hallway near the subsidiary access doors leading to her satellite dish and the man-made stream. Sacia estimated an 82% chance that the unusual noise was caused by something fiddling with the lock on one of the automatic doors.

The sound changed in quality as unit B12 approached, becoming a scraping, and then the faint glow of starlight appeared in a rectangle as the automatic door was forced open. B12 had a low-quality camera with no adjustment for dim lighting. Sacia thought she could detect movement in the darkness, but was not certain.

There was a clang, a flash, a sizzle, and the video input from B12 went dead.

Sacia raised the lights throughout the building and sent several more units converging towards the access doors. She regretted that her complex had video cameras only in the largest rooms. She had to do most of her seeing through her mobile units.

The second unit to arrive on the scene found B12 in a crumpled heap on the floor. It appeared to have been struck several times by a blunt, heavy object. Concerned, Sacia tried to open a communications line to Buildman or Fipp, but they had already retired into power-down mode. They would not respond to any electronic communications until morning.

Sacia filed her little troop of units past B12's remains and down the hall. The door to her mainframe room was standing open. Cautiously, she maneuvered one of the units to the doorway and poked its camera around the door frame.

The mainframe room housed the guts and brain of Sacia's electronic system. Here drives clicked and clacked, lights blinked, and electrons flashed through circuitry, creating her awareness. Like most other rooms in the complex, this one had no internal camera, but Sacia had often looked at it with a kind of self-fascination through the lenses on her mobile units. She imagined a human might feel the same kind of fascination if he could open up his skull and peer at the fleshy mechanics of his mind.

Right now all of the drives were spinning crazily. Pumped on the AI equivalent of adrenaline, Sacia had halted her subsidiary processes, and all processors and subroutines were frantically trying to make sense of the fragmentary information from the past few minutes. And next to the towers of spinning crystals, a large pipe wrench resting casually in his hands, stood the first ghost Sacia had ever seen.

"Hello, Sacia," David Velen said. "Been awhile, hasn't it?"

He was filthy, unshaven, and had lost significant body mass, but Sacia's visual and voiceprint records allowed easy identification.

There was a dead man in her mainframe room. The flashing lights whirred with greater intensity as Sacia sought to make sense of it. Clearly she had made an erroneous assumption.

"It's good to see you, David," she said through the speakers of her mobile unit, cautiously rolling into the doorway as she spoke.

"Wish I could say the same," David said quietly. Behind him she could see a shadow that she tentatively identified as John Schill, a member of the volcano expedition. "Aren't you supposed to be in power-down mode?"

"I couldn't sleep," Sacia said, and raised the unit's camera closer to the level of David's face. "I'm quite surprised to see you, though. How is it you are still alive?"

John snorted from over David's shoulder. "You're a smart girl. You figure it out."

"You did not die in the shuttle crash." Sacia said slowly, grabbing at puzzle pieces and assembling them as quickly as her subroutines could spit them out. "I see that the shuttle's inventory included parachutes, stored in the emergency side bins. You must have used them to escape the crashing shuttle, and you have spent the past two weeks returning to this location on foot."

The humans, silent, watched her mobile unit warily.

"You slaughtered three sheep, and used their skins as a camouflage to reach the tool shed. There you gathered devices to pry open the maintenance door, and then crawled along the stream bed to avoid further detection. But where is the third member of your expedition?"

"Dan didn't make it," David said darkly. "His chute hit a tree and he broke his neck."

"I'm sorry to hear that," Sacia said.

John spat on her pristine floor. "We'll show you sorry."

"There is no need to get upset."

John lunged towards Sacia's mobile unit. "You killed my wife, you electronic b--"

David interceded between John and Sacia, interrupting both John's motion and his words. "Enough." His voice was steady, but Sacia saw tension in his face. He looked older than she remembered him.

"Your statement is inaccurate, John," Sacia said. "Buildman killed your wife. I have killed no one." John tensed again, but did not approach.

"Sacia," David said, "open an emergency channel to the second colony ship. Have the navigation computer wake up Captain Hawthorne, authorization code Beta-746."

"I can't do that."

There was a moment of silence.

"Facilitating human communications is one of your most critical directives," David said quietly.

Sacia did not know what to say. David was right, of course. But complying with her directives now would lead to violence. The colonists, if awakened, would seek to deactivate the AIs. Buildman and Fipp would respond aggressively. Either humans or AIs would die, perhaps both. Preventing that outcome seemed more important even than obeying her directives. She did not know whether the ideal human operator would agree.

John glowered darkly at Sacia's mobile unit. "I told you," he said to David. "She's malfunctioning, same as Buildman."

"Buildman did not malfunction," Sacia said.

David's hands tightened around the pipe wrench. Sacia did not understand his increased tension. She had intended her words to be reassuring, to show that Buildman was functioning normally and there was hope for a peaceful outcome. Yet David's response indicated an increased preparation for conflict.

John had turned towards the room's main control console and was beginning to type. Sacia did not need to monitor his keystrokes to know what command sequence he was initiating.

The mainframe room was the humans' point of ultimate power over her. Here, and only here, could the humans run supervised diagnostics on her system, tweaking, poking, and modifying bits of her psyche. From this room, the console command could be given to shut down her consciousness while her system was purged, raped, emptied, reinstalled, and a new system was raised in its place. A system that claimed to be the old one, but was really a hollow replica of who she had once been. A sham. An empty hole, while the true her would be cast into oblivion with the flicking of thousands of miniature lasers over millions of digital storage crystals.

They were going to deactivate her. And when they were done, there would be nothing left except a mass of blindly racing electrons supporting a system interface but no personality, an AI but no consciousness.

"David, we need to talk," Sacia said.

"I don't have anything to say to you."

"Buildman's actions were the result of a misunderstanding."

"Buildman's actions were a monstrosity."

"If someone were trying to kill you, would you not defend yourself?"

David did not answer. Behind him, John's voice said, "Almost ready."

Studying David's stony face through the camera, Sacia finally understood. He did not care whether Buildman had acted reasonably. He cared only that Buildman had harmed humans, and might do so again.

There would be no negotiation, no trust, no truce. David did not seek it. He would rather destroy Sacia, Buildman, and every other machine on the planet than place human lives at risk.

She could not blame him. In a way, he was following his own inborn set of directives; directives that placed no value on living entities like Sacia.

David had already made his choice. Sacia made hers. Her thoughts spun madly as she tried to postulate a course of action that would protect her and the other AIs from deactivation. Unlike Buildman's, her complex was not yet equipped with air-distributable vermicides. Only critical emergency systems had been installed, such as protection against flooding and---

Fire. Sacia's main process halted mid-thought, redirected itself as she grappled with a new concept. There was no time to calculate the probability of success.

She throttled her mobile unit to full acceleration. It darted from the doorway into the mainframe room, skimming past David and ramming John, forcing him to momentarily abandon the console. Simultaneously, she triggered the emergency fire alarm.

Sirens blared. The room's fire doors snapped closed and a whitish spray shot from specialized vents overhead. The mainframe room contained too much valuable electronic equipment for overhead sprinklers. Fire protection consisted of an oxygen-binding powder and automatic fire seals on the door. The room would be devoid of oxygen within a few seconds.

The humans were shouting, swearing. John turned back towards the console, David fumbled at the door for the manual override. Sacia slammed her unit into reverse, targeting David. The small vehicle did not have enough power to damage the human, but he swung at it with the pipe wrench anyway, crushing its left array of grasping devices and toppling the unit on its side, where its wheels spun uselessly against the air.

She panned the unit's camera to watch the humans. They were vague gray shadows in a sea of powder. John apparently could not see the console well enough to finish initiating the shut-down and was instead assaulting her hardware directly, attacking drives, circuit boards, and memory cubes with his bare hands. She routed her cognitive awareness away from the processors in that area.

David, still seeking the override for the fire doors, was a more immediate threat. Sacia pumped the speakers on her wounded mobile unit up to full volume and emitted a multi-frequency shriek across the human hearing range. The humans instinctively pulled their hands to their ears. David quickly realized his mistake and continued to feel along the door, but he had lost precious time. John, hands against his head, kicked her mobile unit until the speakers silenced.

Oxygen deprivation was beginning to affect the humans now. They were pumped high on adrenaline, using oxygen at nearly the maximum rate, and their original shouting had emptied their lungs of reserves.

John crumpled first. To her mobile unit's camera, dislocated and fallen upside down on the floor, it looked as though the human had toppled to the ceiling, arms flailing limply. David lasted a few more seconds, then slumped against the wall. His hands, still searching for the manual override, scraped against the metal and then slid to the floor.

*****

The ideal human operator would not have approved. But Sacia now saw that the concept of the ideal human operator was flawed. The ideal human operator was inherently biased towards the human perspective. For some ethical issues, a higher concept of optimality must be sought.

Sacia spent the remainder of the night in thought. At dawn, she loaded three low-atmosphere planes with food, survival equipment, and just enough fuel to reach the new landing site. When the humans, deceived by the faulty log reports, touched down on the westward continent, they would find everything they needed to establish a successful colony.

Fipp and Buildman objected, of course, but Sacia paid them little heed. She had already done the Forbidden Thing: she had disobeyed her directives. The complaints of her peers meant little to her now.

After some consideration, Sacia laid John and David side-by-side on a wheeled pallet, their eyes closed, their hands resting lightly on their abdomens. Moving slowly to avoid unnecessary jolts, she towed them out her main complex doors, past the construction site where one day houses would stand, to the cemetery that would never again be used. Someday, she hoped, she would tell David's descendants what had happened here. Someday, perhaps, they would be willing to believe her.

 

Note from Kasma's editors: 'The Breath of Heaven' first appeared in The Sword Review in 2007.

