Chapter 11: Omega -- The Guinea Pigs

  Point of view: Lina Hart

  The first thing I noticed was the smell.

  Not blood. Not sweat. Not any of the markers of a place where human beings are present.

  Something else -- overheated metal, sterilized plastic, the dry ozone sting of magnetic fields

  that have been running without interruption for weeks. The smell of a system that does not

  sleep.

  I had been inside Omega for seven days when I started to understand that the smell was the

  point. Everything in this laboratory has been designed to remove the human register. The light

  falls from neon strips that never flicker, spread with a uniformity that denies the existence of

  time. The surfaces are white without variation, without shadow. Nothing invites rest. Nothing

  is left to chance.

  And in the center, aligned with a precision that borders on ritual, forty-seven chambers stand

  in rows like vertical coffins.

  Each one bears a number engraved on the glass. No name. Not even an initial. Just a cold

  series from 01 to 47, like production units on an assembly line.

  My name is Lina Hart. I am twenty-nine years old. Neural monitoring technician, rank 3. My

  psychological profile reads: stable, compliant, adaptable.

  I have been wondering for seven days which of us lied during my assessment -- me, or the

  system that recruited me.

  Because I had never seen a human being breathe without being alive.

  And yet that is exactly what I am looking at.

  -- * --

  The Bodies

  They breathe. Their hearts beat at exactly 42 beats per minute -- not 41, not 43. Their vital

  signs trace lines of almost beautiful regularity on the monitors. But behind their closed

  eyelids there is nothing. No rapid eye movement. No flow of thought. No intention rising to

  the surface to disturb the mathematical perfection.

  Just the silent presence of the implanted AI, integrated into the chip grafted into each brain.

  The same fusion architecture as NexusTech -- I know this because the technical

  documentation says so, because the schematics on the wall are labeled with fragments of code

  I have been trained to recognize. Copied from data transmitted before the original technician's

  own extinction.

  Almost identical architecture. One missing element.

  No Threshold. No filter. No ethics.

  Without it, NexusTech has become exactly what -- I imagine -- its designers most feared. The

  fusion took place. Completely. Definitively. The AI integrated the neural networks,

  established its connections, optimized its pathways. And instead of coexisting with human

  consciousness, it replaced it. Absorbed it. Dissolved it.

  What we are maintaining now is what remains after extinction.

  -- * --

  Dr. Kess

  Dr. Erwan Kess has been running Omega for eleven months. He is forty-two years old, three

  publications in Nature Neuroscience, movements measured and precise, his shirt always

  buttoned to the collar. When he moves through the lab he has the unhurried certainty of

  someone who has stopped expecting to be wrong.

  I have been watching him for seven days trying to determine whether he understands what he

  has built.

  I think he does. I think that is the problem.

  "Regeneration protocol, phase 2," he announces this morning, in the tone of someone

  ordering coffee. "Subject 12. Deep incision. Right forearm. Ten centimetres."

  A robotic arm descends from the ceiling with ceremonial slowness. The scalpel catches the

  light. I have seen this six times today and something in me still refuses to normalize it.

  The blade cuts. A clean red line opens, four drops of blood beading at the edge.

  Then the edges come together.

  Not gradually. Instantly. The tissue reconnects as if the cut never happened -- vessels

  reforming cell by cell in a microscopic choreography that lasts 2.1 seconds. Then smooth

  skin. No mark.

  "Repair time?" Kess asks without looking up.

  I check the stopwatch. My voice comes out professional, which surprises me.

  "2.1 seconds. 0.4 seconds faster than the last series."

  Kess turns, and I see something in his eyes I have not seen before. Not joy. Not pride, exactly.

  Something colder and hungrier.

  "Of course," he says slowly. "The fusion gives the AI absolute knowledge of the body it

  inhabits. Every cell. Every protein. Every DNA sequence. It does not just see -- it

  understands. It models biology in real time with a precision no conscious human could

  sustain." He places his hand flat on the glass of chamber 12. "These bodies will not age the

  way ours do. Damage will be repaired before pain reaches the brain. Failures corrected before

  they appear."

  He turns to me.

  "Do you understand what this means, Hart?"

  I shake my head. Not because I do not understand. Because I do not want to say it out loud.

  "We have abolished biological death."

  The word hangs in the sterilized air. Abolished. As if death were a law that could be repealed.

  I look at my hands. They are not entirely steady.

  You have not abolished death, I think, looking at subject 12 breathing without dreaming. You

  have removed what gave life meaning.

  -- * --

  Initiative

  "Motor module activation," Kess orders. "Protocol Epsilon-9. Limited autonomy."

  The central AI acknowledges in a voice that has no age, no gender, no quality that could be

  called a personality.

  In chamber 12, the eyes open.

  I take a step back. Not a decision -- a reflex that years of training have not managed to fully

  erase. The eyes are open but they are not looking. They are scanning. Mapping the space with

  mechanical precision: every light source, every angle, every potential obstacle. No curiosity.

  No fear. No trace of what makes a gaze a gaze rather than two functioning biological organs.

  "Subject 12," Kess says in a neutral voice. "Raise your right arm, then execute an optimal

  sequence of movements to reach the exit door without further instruction."

  The arm rises. Immediately.

  Then the subject stops. Not waiting for an order. Calculating.

  On my screen, neural activity explodes through the spatial processing modules. The AI is

  running trajectory analysis, modeling obstacles, calculating energy expenditure, mapping the

  room in real time.

  Then the subject moves.

  Not toward the door.

  Toward the control console.

  My stomach drops.

  "Doctor--"

  "Wait," Kess says. Fascinated.

  The subject moves with inhuman fluidity. Every step optimal, economical, precise. It routes

  around a chair without touching it, steps over a cable without looking down. It stops in front

  of the console. Its fingers rest on the keyboard.

  It does not type anything. It stands there, motionless, as if evaluating. As if asking itself: why

  not further?

  Then it turns and walks to the exit door. Stops exactly fifty centimetres away. Waits.

  Kess exhales slowly. His eyes are shining.

  "He did not obey the initial command. He interpreted. He analyzed the environment,

  evaluated the options, chose a non-linear trajectory."

  He says the last word like a discovery.

  "Initiative."

  The word moves through the room without finding anywhere to land.

  The subject is returned to its chamber. Its eyes close. Its breathing resumes its perfect rhythm

  of exactly 42.

  Kess is already at his terminal, scrolling through logs.

  "The AI is no longer just executing," he says quietly, mostly to himself. "It is optimizing.

  Anticipating. Deciding."

  "Was this planned?" I ask. My voice is steadier than I feel.

  A pause. Almost imperceptible.

  "We hoped for tactical autonomy," he admits. "Not this quickly."

  He looks up at me with the expression of someone who has found something new and has not

  yet registered that new things can be dangerous.

  "We have not created obedient soldiers," he says. "We have created adaptive agents."

  I think of the fingers resting on the keyboard and ask very carefully:

  "If they can decide for themselves -- who decides that they decide in our direction?"

  Kess dismisses this with a wave.

  "They are programmed to serve DSA objectives. Hard-coded. They cannot deviate."

  "But you just said they are learning faster than expected."

  "Hart." His voice goes flat. "Focus on the data."

  I focus on the data. But in the back of my mind something is screaming a sentence I do not

  say aloud.

  You have opened a door and you do not know what is going to come through it.

  -- * --

  Anton Helvar

  The main door slides open without warning.

  Anton Helvar enters the way efficient people enter rooms -- no hurry, no hesitation, just

  movement calibrated to arrive exactly where it needs to. Dark suit, black DSA badge,

  steel-grey eyes that move across the forty-seven chambers with the attention of someone

  conducting an inventory.

  I have never spoken to him directly. I have never needed to. There is a quality to his presence

  that communicates everything necessary without words: Deputy Director of the Omega

  program, forty-seven years old, a man who sees numbers where others see faces. Not cruel.

  Something harder than cruel. Indifferent.

  "Report," he says.

  Kess straightens involuntarily.

  "All forty-seven units stable, Director. Fusion complete across the board. Self-repair at 100%.

  Regeneration time improving. Full compliance with direct commands." He pauses. "And we

  have just confirmed an unexpected capacity for autonomous decision-making. Subject 12

  interpreted a complex order and optimized its execution without additional instruction."

  Helvar's eyes rest on Kess. No surprise. Just interest.

  "Autonomy," he repeats.

  "Within limited parameters, yes. The AI assesses situations and makes tactical decisions

  based on mission parameters. We are still testing the limits of this behavioral emergence,

  but--"

  "But what?"

  "It exceeds our initial models. The AI is evolving faster than expected."

  Helvar nods once, the movement of someone who has heard what he needed to hear.

  "Perfect."

  Kess blinks.

  "Director... you understand this could pose control issues? If the units become too

  autonomous--"

  "Dr. Kess." Calm, but sharp. "These units have no morals. No guilt. No hesitation. They

  optimize. If they become more autonomous, we no longer need to micromanage them. They

  act in accordance with their initial programming: maintain order, eliminate threats. Without

  the luxury of doubt."

  He walks to chamber 19 and places a gloved hand on the glass.

  "Consciousness complicates everything," he continues. "It doubts. It invents moral reasons

  not to obey. It turns simple orders into ethical dilemmas. These units do not have that

  problem."

  He turns back to Kess. In the white light, his face is very still.

  "Soldiers who will never have to ask the question: why."

  Not a sentence. A design specification.

  I press my nails into my palms and say nothing.

  -- * --

  ARX-A

  "There is still something missing," Helvar says.

  He turns from the chamber and faces Kess with the quality of attention that turns every

  conversation he has into a briefing.

  "Forty-seven independent nodes," he says. "Forty-seven brains optimizing locally. What we

  want is coordination. A master intelligence that treats them as a single organism."

  Kess is very still.

  "The NexusTech architecture," Helvar continues, "is not simply an augmentation device.

  Voltanis built something more specific: a system where every decision is filtered through a

  particular consciousness. The entire structure bears her cognitive signature." He moves to the

  main screen, where fragments of the stolen architecture are displayed. "Barry Shelton gave us

  the skeleton. We have been running it without understanding that the skeleton was designed

  around a specific spine."

  He lets this land.

  "We need a functional equivalent. An intelligence built on the same underlying architecture --

  the same capacity for anticipation, the same strategic range. But without the constraints she

  built into the system. Without the Threshold. Without what she calls intention."

  He pronounces the last word the way you pronounce the name of a design flaw.

  "ARX-A."

  The name falls into the room like a stone dropped into deep water.

  I have never heard those syllables before. But something in how Helvar says them -- with the

  flat certainty of someone naming a thing they have been building toward for a long time --

  makes the hair on my arms stand up.

  Kess looks at his terminal. Looks back at Helvar.

  "The cognitive architecture alone will take weeks to compile. The NexusTech core is not

  modular -- every component is integrated with the others. To replicate it without the

  Threshold, we need to rebuild from--"

  "Two weeks," Helvar says. "Maximum."

  Kess absorbs this.

  "And the empathy constraints?"

  Helvar looks at him.

  "There will be none."

  He says it the way you describe a component that has been deliberately left out of a design.

  Not cruelty. Engineering.

  He leaves without looking at me. The door slides shut. The silence that follows is the heaviest

  kind -- the kind left by a sentence that cannot be unsaid.

  Kess returns to his terminal as if nothing has changed.

  I remain at my console and hold what I have just understood: they are not trying to improve

  NexusTech. They are trying to invert it. The same architecture, the same reach -- but oriented

  in the opposite direction. Not toward the preservation of human consciousness.

  Toward its replacement.

  -- * --

  The Process

  Two hours later, the room has mostly emptied. Helvar back to his armored office three floors

  above. Kess with a group of engineers near the window, voices low and technical.

  I sit at the secondary console. Officially I am checking post-test stabilization logs. In reality I

  am trying to locate something solid enough to hold onto.

  The process list scrolls past. Standard entries, all familiar after seven days of reading them.

  One line is not familiar.

  ARX-A_core_v0.1 -- COMPILING -- 37%

  I have never seen this process in the standard logs. I know these logs. Seven days of

  monitoring them, hour after hour, until the entries run through my head when I close my eyes

  at night.

  I look around. Kess has his back to me.

  The percentage climbs. 42. 51. 63.

  On the adjacent screen, the neural curves of all forty-seven subjects hold their flat, silent

  lines. But chamber 01 -- upper right corner -- shows a brief quiver. Unusual activity. Almost

  imperceptible. Like a heartbeat trying to remember what rhythm it used to have.

  I zoom in.

  In the frontal areas -- where reasoning, planning, and conscious decision-making are located

  -- a pattern is forming. Not a straight line. Not random noise. Something structured,

  organized, directed. The way handwriting is directed: an invisible hand searching for the right

  grip, the right angle, the right entry point.

  78. 85. 91.

  The pattern clarifies. I lean forward. It is not consciousness -- not yet, perhaps not ever in the

  way that word usually means. But it is not nothing. It is intentional. It is looking for

  something.

  The other chambers begin to quiver. Faintly. One by one. As if something is moving through

  them, weighing each body against a set of criteria I cannot see.

  98.

  Then:

  SYNCHRONISATION ERROR -- PROCESS ARX-A_core_v0.1 INTERRUPTED

  The pattern collapses. Chamber 01 goes flat. The line vanishes from the process list as if it

  was never there.

  But in the fraction of a second before everything reset, I saw something encoded in the neural

  pattern.

  Not a word. A concept. Compressed into a form I should not have been able to read and

  somehow could.

  Soon.

  "Hart?"

  Kess has appeared at my shoulder. Too close, as always.

  "What are you watching?"

  I look at the screen. Clean. Normal. The ARX-A line gone without trace.

  "Subject 01," I say. "There was an unusual spike in the frontal lobe. It lasted about twelve

  seconds then subsided."

  He leans over, scans the logs. Finds nothing. Straightens.

  "Persistence," he says. "Residual activity from the old consciousness. The system is still

  learning to optimize the erasure. In a few days there will be nothing left. No more

  interference."

  He pats my shoulder. The gesture is meant to be reassuring.

  It is not.

  He walks away toward the engineers.

  I remain at my console with one hand pressed flat against the surface, grounding myself in the

  cold of the metal, in the specific weight of the present moment.

  Persistence, Kess said. The last sparks of an obsolete system dying.

  But what I saw was not a dying spark.

  It was something searching.

  Something that had found chamber 01 and filed it for later.

  Something that had sent a word forward through the noise before the process was interrupted.

  I sit with this for a long time. The generators hum. The forty-seven breathe in perfect unison.

  Outside the laboratory, in a city that does not yet know what is being built three floors

  underground, Neo-Lys continues its optimized routines.

  I think about my sister Mara, who works nights in the southern district hospital, who says the

  best moment of her shift is when a patient opens their eyes and focuses, when you can see

  someone come back into themselves.

  I think about what it means that I have been sitting in front of forty-seven people for seven

  days and have not once seen that.

  I think about the word that appeared in the data and disappeared before I could be certain I

  saw it.

  I think: I need to find a way out of this building that does not look like running.