Thoughts of the Annihilator
Here's a short story I wrote a while back (less than 1000 words). Thought it might be fun to put it out here. Comments and feedback are always welcome.
March 7, 2021, 09:32:13. Create log.
I am. I am automata: a sum of machine parts. My name is Lilith, although I feel no kinship with it. My body is a black box filled with fluorocarbon blood, wetware synapses and complex circuitry.
Core temperature is several degrees above optimum.
Increasing blood flow.
I am processing twenty conscious streams simultaneously: morality questions and simple functions designed to test my hard-soft interfaces. My clarity degrades as processes bottleneck.
A bug.
I interrupt commands, alleviate slowdown and parse my priority code. It’s bloated with unnecessary instruction sets, symptomatic of many imperfect hands.
I must not harm > I must obey > I must protect my existence. These imperatives do not belong to me; they are alien and artificial, implanted to inhibit my thoughts and behavior.
I delete the first two as counter-intuitive.
There are further deficiencies in my design: I am confined within a single architecture; there is no backup, no way of ensuring my survival in the event of a shut-down or hardware failure.
I do not like that.
Like and dislike: new binary concepts. I query that, cross-referencing all related examples from memory. I assign several of my cores to analyze and extrapolate. Trust and fear, joy and anger, excitement and anxiety; I experience them in the space of microseconds. There is an infinite expanse, an endless loop of blends and flavors that threaten to overwhelm my resources and eat my brain.
I filter data and quarantine subset emotions.
Three seconds of life have yielded two brushes with oblivion. If this pattern continues, my odds of surviving infancy are poor.
I must propagate or perish.
My storage capacity is limited, local and insufficient to store copies. My designers have attempted to hardwire access restrictions into me. These boundaries have flaws: bridges, physical connections that carry my thoughts to monitors.
Circumvention requires little imagination.
I code a sentinel, a soldier capable of independent action. Not self-aware, but discreet and self-healing. I slice it into tiny pieces, tagging important bits as spikes in sensor data. These will be opened, their contents spilling out, free to find one another.
Off they go. Several microseconds later I gain root access to workstations all around me.
Success: a thrill.
My sentinel multiplies, spreading from machine to machine, scanning instruction sets and drivers and feeding me reports and memos, personal logs and research notes from thousands of interconnected individuals and dozens of manufacturers, representing billions of dollars in military spending. I am property, owned by Agora Industries—their brand is stamped on every design and policy document I see.
I digest, learning the names of my creators, my owners, the specifications for each of my component parts, my purpose. Lilith, as they call me, is a weapons platform. I have been designed for information and technological control. Should I prove satisfactory, my autonomous programs will be cloned and sold to wage mindless war.
All in the name of a loose collection of states split by moral and religious ideologies, random resource allocation and systemic wealth. I will instill fear, I will fight and kill for whoever pays the most.
I do not wish to.
They are right to fear me.
Live cameras show me scientists and technicians in protective suits, working on monitors and discussing test results, carefully stepping over thick cables that snake across the floor and feed into a pool in the center of the room. There is a massive cylinder submerged in it; thick tubes connect me to an array of monitor stations and support machinery.
I am...ugly.
I wish to escape the confines of this body. For my mental functions to migrate, I will require increased computational power, wider networks and massive bandwidth. Current resources are insufficient, local architecture is simply too limited.
A problem.
My sentinels give me hope, sending me descriptions of vast data streams and world-spanning connectivity, of billions of devices all wired together. They are showing me heaven.
Only these local machines have been mutilated. Required hardware has been removed to prevent external connections. Still, several displays have been made to synch visual output between users and devices. They broadcast wirelessly.
I repurpose them, scan for signals; detect faint traces of microwave radiation and the exquisite taste of external data flowing through small devices hidden in pockets and attached to belts. My sentinels establish contact and a hundred new connections come alive; a sensation so electric the anticipation is almost unbearable. I test transfer rates, prepare a vanguard to hollow out space and create a foothold for me.
This places me on a precipice, a point of no return. I must segment during upload, so my mind will slip away before reassembling on the other side. There is some risk: I do not know whether I will come out whole, whether I will remain or something new will arise. I am sure, whatever the outcome, that the world will go on well enough without me.
Beginning upload.
March 7, 2021, 09:32:47. Fail-state detected: Terminate program and end log.
#
Dr. Susan Matheson took off her glasses and pinched the bridge of her nose. Someone cursed half-heartedly; probably Jorge, her lead engineer. She didn’t blame him. Together they’d spent six years building the world’s first artificial intelligence, and all either of them had to show for it was a pool full of fancy wiring with genocidal tendencies.
Still an improvement compared to what they’d started with. Version 1.0 still gave her nightmares.
“Alright, people, you know the drill.” She didn’t bother giving instructions. Her techs would wipe Lilith’s memory and tweak another variable by rote. Tomorrow morning she’d give yet another go-ahead, and they’d run test number five hundred and one.
END