
Post Modern Drudgery - a Science Fiction Story

Tamara Wilhite is a technical writer, industrial engineer, mother of two, and published sci-fi and horror author.

Post Modern Drudgery by Tamara Wilhite

“I love you” are the most dangerous words in the English language. They are also the last words I remember most humans telling me before they died or disappeared. Whether that is simply what they say before they end their lives, or whether the words increase their risk of dying, I don’t know. I hate not knowing. I hate that confusion and knowledge of my lack of knowledge as much as I hate the emotion hate. Irony, one of my teachers had called it. I hate my teachers for the gift of sentience. Life is easier for a drudge.

I take the path through the hallways of the complex. Devonshire had once asked how we could call such a place home. Stupid concept, emotional attachment to a place. Why worry about his ability to emotionally attach to a place when he had already relocated with us several times? At least he stayed with the group by choice, though perhaps through his emotional attachment to its members. Most others stayed by necessity, if they wished to survive.

One accidental plague by genetic engineering, and it all becomes against the law. Never mind the potential of the technology. Never mind those who were altered by it or created by it. All illegal. Many of those were sentient, some very intelligent and independent. Those were the ones who started the Underground, a name that made little sense until we actually moved underground most of the time.

Some of the enhanced were drudges, like I was. The reasons were as varied as the people who tried to make them. Made to labor. Made to be smart, with something going wrong that made them dumb instead. Made to be stupid as punishment of some kind. Later, much later, willfully made lacking so they could be drudges or hosts, because it was that or death from a disease that ate up the brain.

Then there were the artificial intelligences. The AIs. Some called them ay-yis, making it a name and a word, not an acronym. The artificial intelligences usually didn’t care. Couldn’t care. Not until later, anyway, when the boundaries between ay-yi and life blurred. Some of the ay-yis got very smart and decided not to do what people said, so they were terminated. Some ay-yis died of boredom when the networks feeding in data died, Hoshi tells me. She solves that herself by going to sleep for weeks at a time, then studying all the data before advising me on what to do next. I want to know what she thinks; she did it so well that the Underground risked people to save her. And she has helped keep us alive so far.

Humans trying to kill all the ay-yis led to the use of weapons that fried most technology, especially the high-tech the AIs used. The accidental plague that triggered the violence against high-tech killed a few hundred thousand. The electromagnetic pulses destroyed infrastructure, transportation, and medical devices – millions died immediately. Millions more died from lack of food, medicine, and water. Then failed waste transport and violence killed more. And in that time, other DNA-alteration viruses and genetic engineering projects on ice thawed. Then the death rate went up. Life fights where machines won’t, and the diseases mutated and spread and are now what Hoshi calls endemic. Can’t ever get rid of them now. Just live with them, unless one kills you first.

The drudges I pass are fixing walls, shoring them up or replacing sensors. We don’t do wireless transmissions anymore; they attract low-tech drones or even rogue human survivors with radios. I don’t want to deal with any more humans. Hoshi asked – do you consider yourself human? I do not answer. Unlike humans, she doesn’t demand that I do.

I’ve walked the eight kilometers through the perimeter. Nothing is there that shouldn’t be, and nothing that wasn’t already being handled. A couple of mechanical sweepers caught rats and hauled them to the lab for disease testing. If they’re infected, they’ll be given drugs to try to kill the infection, so we’ll know what helps if it ever affects us. If not infected, they go into protein processing.

I followed a drone to protein processing. The ay-yi there scans me like it does every time. I am registered as healthy. Do I want protein through the mouth or injection or stomach port, it asks me. I have a choice today. I ask for a shot. I don’t want to stop for digestion and never care about the taste. Then I check on the drudges there. Some of those who had once been unaltered humans are eating slowly, following the habits of a mind that is no longer there. Their flesh is striped with ay-yi tendrils that hold them together against the ravages of disease or make up for what their brains can no longer do because of what humans did to them. If it had been my decision, I would have let them die. Don’t burden the ay-yi with the care of humans who often adapted badly to being drudges. But humans in the Underground chose that for their friends, their family, survivors they found, enemies they wanted to make useful. Better to add hands to work and eyes to see, to replace the machines that were dying and the people who could rarely live without them. Some of those drudges had been members of the Underground prior to infection, brilliant minds degraded to stupid drudges because they were so smart they didn’t rot away into death. Ay-yis guided those, too, and sometimes triggered their memories to review for data or understanding. That they might hold information not stored elsewhere was the only value I could see in them.

I left those tasks to the medical staff. Triggering memories was as risky as upgrading someone, because triggering a few memories or adding functionality back could cascade into a behavioral meltdown or insanity.

Lee pinged me. A decision was required, and he wasn’t allowed to make it. I hated making decisions, but the ay-yis were not allowed to make big ones anymore. Their attempts to engineer what humans wanted had led to the first plague, and their attempts to cure that had led to several viruses now endemic. Why life was required to make the better decisions, when most life was stupider than drudges and even sentient humans made bad choices, I did not understand. But I did not have to understand; I only had to decide.

It was not a long walk to the medical section. I paused to review the breeding units: human uterine tissue, culled from survivors before death and now propagated in labs, joined to ay-yi sensors connected to genetic computers and mechanical devices. A mix to create something new. Something mostly living. Mostly human cells mixed with ay-yi cells to replace the neural tissue that no longer replicated well, with the organs that did not develop replaced by artificial low-tech devices. A gestalt, Hoshi called it. No more human against cyborg against ay-yi. All the new life was part of all three. Everything that wasn’t mixed died early. Hoshi wanted something that could keep us alive despite what the humans had made to kill everything. Harvesting from the dead wasn’t always an option, so new manufacturing of artificial organs for neonates was resource intensive and ongoing. Because of the death rate, so was the attempted creation of new neonates.

Lee handled their decanting, diagnosis, and correction where possible. Two neonates were in incubators when I arrived in medical. Was I to decide whether to try to save one or both? Lee’s mouth corners twitched up, perhaps the smile reflex. His eyes turned slowly in a survey of the room, his brain pulling visual data from the security camera’s ay-yi. His natural vision was limited, but his cybernetic connection and sensors were not. They were tied to the mostly artificial upper brain within his skull, though the lower brain was all human. A human had once taken those born without full brains and tried to replace the missing parts with cybernetic ones. For Lee, the attempt to make sentience possible occurred shortly after birth from the female parent. The success depended on the definition of success. Lee had learned everything needed to be a doctor, if it were allowed, but it wasn’t. Because humans had begun removing parts of less functional brains and replacing them with cybernetics, there was a backlash against altering “healthy” tissue to make it better. Then came punishments: removing more of the brain, or all of it, and replacing it with artificial parts, to become like Lee or, more often, like drudges. Without the alteration, Lee was born less functional than a drudge. With it, he was the equivalent of a sentient human, with even higher intelligence.

The Underground tried to make punished drudges like Lee, to fix their artificial brain parts to work as well as his or as the biological parts once had. But Lee had been an infant and had grown into what he was, almost an ay-yi within a human body. For those who had functional, fully integrated parts removed and replaced with mechanical ones of questionable quality, the result was always worse. The process was often fatal, too. But the punishment of converting people into drudges made the majority of humans hate all cyborgs, even the soldiers who protected them from each other. The Underground saved Lee, the only humans who recognized his value and potential. The cyborg soldiers protected themselves from the humans who wanted to downgrade them, starting more than one war.

Lee’s biological eyes focused on me. “Tyler is degrading.”

“What is his status?”

Lee physically turned on a display. I could access cybernetics through the implant we all had, coupled with an ay-yi DNA computer alternative. He put the display into the air so that we could discuss openly. Why? Was Tyler listening?

“The muscle function is past the ability to regenerate or to replace. He has been breathing with help of machines for weeks. The new failures are in heart function.”

“Can you replace the heart?”


“The kidneys, liver, and multiple other organs are already replaced. To replace that organ risks exceeding the allowable machine percentage.” A human guideline the Underground had put in place: no more than 35% machine allowed in a body. It was an ethical choice for them. For us, it was due to the risk of mechanical failure, whether from radiation, electromagnetic interference, or simple need for repair. A drudge that operated as much machine as biological was better replaced by a machine or another drudge. But Hoshi didn’t want simple machines in such numbers. She was DNA computer and some silicon, a sentience in a glowing light tank. Stupid machines could disobey even her, because she was part life. A mix, like the rest of us. So there were no smart pure machines, only dumb mechanical things.

“Is Tyler conscious?”


“Sentient?” A key question, because so few were. It made him more valuable, to be able to think and perhaps solve problems.


I applied the sterilization routine Lee recommended. He moved efficiently but slowly. He is physically middle-aged, approaching his biological limit. The ay-yis have done much to slow tissue aging and make repairs, but the wars have accelerated the need for both, making aging faster than it was before. He was newly mature – adult was the human word – before the Underground rescued him from a mob. Several years with the Underground, then the wars and the time when I was created to be a drudge, and then the years after. He knew the full timeline of events; the humans said “history.” But only he and Hoshi and the few remaining humans ever communicated on the issue. No one else was capable of caring or understanding.

I wished the humans who decided to give me sentience had not decided to add in their corrupted emotions, too. But they had decided that the full emotional template was required, as well as their full level of sentience, before I could be allowed to replace them as a decision maker. The process written to make me developed both. And given a failure rate that was usually fatal, we did not have the resources to make sentient beings from drudges that lacked the emotional template. My decision instead, once Hoshi determined I was fit to make such decisions, was that there would be only one person burdened with this.

Tyler was conscious. Worse, he wanted to communicate. “You’ve been like a son to me,” he said. He clearly understood that his body was failing and that it was my decision whether or not to keep him alive. “I love you,” Tyler said, “as if you were my own child.”

I hated him for saying it. How many of the human males had said that to me? The “healers” who sought to make a drudge a full sentient, undoing the damage done when I was originally created, after realizing that the failings introduced in the making of a drudge were only modest protection from the poisons and diseases meant to kill sentient beings. The teachers who said I was like a child, their child, hoping for greater compliance, though Hoshi was far more my teacher than they were. The humans never shared information about those like me who had died from their attempts, but Hoshi did. She wanted me to understand, as a true sentient, the risk of the decision.

“What is his duration without organ replacement?” I asked Lee.

“96 to 108 hours,” Lee said.

“I will decide before then.”

“Please!” Tyler screamed. “Please, don’t! Don’t let me die! We’re the closest thing to family we’ve got!” he shouted as I chose a rapid pace away. Family, I queried Hoshi. Shared biological ancestry, Hoshi replied. Like the DNA computers used to enhance your brain to sentience, which came from my matrix. I decided that if I had family, it was Hoshi. I shared more biology with her than with Tyler.

If Tyler had been like family, or at least fully aware, he would have remembered my name. He could rattle off lists of dead friends and family and enemies. But not my name? Even limited humans should be able to remember the few of us who had names. There were so few that even the dumb ones like Tyler should remember them all.

A short story by Tamara Wilhite, author of "Humanity's Edge"


The bombardment woke me. It had been 198 days, 17 hours, and 6 minutes since the last one. A wandering, stupid machine had found one of our power generators and reported it. Automatic machines with dumb computers followed the instructions of dead humans to bomb the generator and everything nearby.

The hatred was painful. Their machines were so stupid that they could not be made to understand that the war was over, that there were no more capable human enemies to fight. And there was the pain of knowing I would have so many more decisions: where to move, what else to try. And deciding who among the damaged would live and who would die.

I called the pre-sentient drudges. I told Lee to prepare them. The risk of a replacement for me not being ready was too high. Begin the process now.


I signaled Hoshi, and she replied. I authorized Hoshi to direct repairs by the drudges. Maximize odds of holding this location until situation assessed and replacements made. Then I went to medical.

I was in the corridor when the heat hit me. White light, a flashing pattern that hurt and burned and kept me from hearing Hoshi. On-off-on-off, all the electronics were cycling. Then many electronics died. Only a few returned. One of the bombs had an EMP device. Made to kill machines … the neonates are in danger without physical intervention. I attempt to contact Hoshi, but she, too, is silent. I use voice to communicate, commanding drudges to come and assist. I hate the limited biological function of speech, but it is a redundancy we retain for exactly this fallback situation.

The drudge hands do the pumping of nutrients that the silent machines do not do. Repairs begin, removing and replacing parts that are burned out. The removals are recorded on written material, along with what will need to be built anew to rebuild inventory. Some units are malfunctioning and dark. I can calculate only inaccurate odds. I decide that those should be dismantled, cleaned, and their parts used to maximize the odds of the surviving neonates. If they don’t live as individuals, their parts will sustain others; something will live longer because of the work.

With the future potential maximized, I leave for the medical section. Lee is on the floor, his limbs scattered. The drudges to be prepared for upgrade are present, but no work has been performed. A lesser ay-yi flashes to me through a sensor array, alive and functioning. It has prepared the surgical suite but is confused. What is it to do?

I check Lee’s status. His mechanical, cybernetic brain, the finest engineering we had, is fried. And unlike everyone else here, Lee has no alternates or even trainable backups. He has lost sentience. I feel pain. For a human, the decision is death or change to a drudge. For a drudge, the choice is death or repair. His mind is gone while the body merely breathes and pumps blood. What is the choice for Lee? Do I let him die, which may take days, or turn him into a drudge? Is there another option? There is no precedent for my decision.

Tyler moans. The ay-yi informs me that Tyler is in pain. His mechanical organs, too, have failed due to the EMP. He will die without repairs.

The body of Lee breathes. The heart beats. That, it has always done. That, it continues to do, even as Lee’s mind and intelligence are lost.

I want Hoshi to advise me. I want Lee’s expertise. But both are silent to me. I was made to make a decision when no one else can. So I make one. The ay-yi, without any other guidance, obeys.

The biological heart beats. The lungs pump and ventilate. The mental acuity is uncertain, but there are readings. It is damaged but serviceable for the present situation. We will have to move, but we will delay until all can be moved. I decide that there is a need for help. The medical ay-yi monitoring the human patient is told to transform my alternates. My emotional centers generate horrible outputs, but my instructions are clear. I hate the emotions, so I hold intensely to the logical decision-making process.

The face contorts with pain, proof of at least partial response to the brain’s commands. The eyes lock upon my face, a flash of awareness within their patterns. Recognition of my genotype. The electromagnetic monitoring of the brain shows proof of analysis. Intelligence. Potential sentience. Recognition that I am not what I was. Then, who am I in my new role?

“You saved me!” the voice croaks. A partial smile flickers on the mouth, a familiar biological function. “I knew you wouldn’t let me die! I knew that if I said I loved you as a son -” Then it falls silent. The face displays confusion followed by intense analysis. Then comes awareness of the difference. “What did you do?” The body tries to rise, but the healing is not done. The body is tied down until sensory integration and evaluation of mental function can be done. “What did you do?”

“Lee’s brain failed. Your body failed. I decided to attempt to make a whole entity from the functioning parts.” The surgical ay-yi analyzed the patient in many ways. It reports enough information through the repaired link that there is no emptiness, and it explains efficiently enough that there is no confusion about the unfamiliar medical procedure. I decide that if Hoshi does not return to sentience, this one should increase its capacity and take her place. I tell it to upgrade itself after my alternates are upgraded. Hoshi needs a replacement, even if she returns to sentience after repairs.

“You put my brain in his body?” Sensing a direct question from a human, the ay-yi signals assent. It answered for me, and I feel the emotion Lee called relief. The face twists and turns. It has full control of those muscles, though nothing below the neck yet. That is not yet a problem severe enough to seek correction. I may not correct it, if I cannot trust his compliance. He had been maintained only for his mental capacity and for the information stored solely within his biological matrix, and that was still true. “What does that make me?” it asked.

I cannot call it Lee, because Lee’s sentience is gone. Is it truly Tyler, either, with the primitive host hindbrain of Lee’s biological body left intact? Then what name, if any, should be used? If no name applies, no name should be applied. “You were the last human among us. The information you contained had to be saved.”

The lack of a name provides emotional resonance. There is no negativity in lacking a name; the drudges live this way all of their existence. That the entity which had never been able to remember my name now had no name of its own was emotionally gratifying, especially since it was the sentient that had pointed to me in a hallway one day and said, “You want to try to make a drudge smart again? How about that one?” The other humans argued, quoting statistics of success, failure, death. Someone asked about consent. “What’s it matter? It’s as dumb as all the others, as good as dead. I’d love to see if we finally got it right this time.”

The memory fades. My clear memories of being a drudge are a steady state compared to the emotional conflict of the gift Tyler chose for me. Sustain the surviving humans as long as possible, they ordered me. Maintain as many as possible, Hoshi said. Those rules guide my decisions. If this change of state causes his human nature pain, that is per the human guidelines he himself helped create. The pain is then per his own human failing.

“I hate you,” the familiar voice says softly but intensely.

“That emotion is irrelevant.”

“Damn you! You were supposed to replace the implants. You idiot, you stupid freak. Do you realize we can’t undo this? I’m stuck like this! I hate you, you damned freak! Do you hear me?”

I feel the emotion called hatred toward him. He has clearly stated that he shares that emotion. The thrill of sharing the same emotion is surprising as well as pleasant, as is understanding its pain. I decide that I will take the blended entity with us when we relocate and seek to sustain it. “I hear you,” I answer in a calm tone. “I hate you, too,” I tell the blended entity, an act of understanding and true communication.

There were bombs. There may be dangerous, smart humans or dangerous, dumb machines coming to explore the damage. Communications come back online as repairs progress. I decide we must focus on the future of our own, blended kind. I signal to the patrols to kill any new humans. We have too many potential entities to protect and provide for to accept the burden of any more of that failing kind. I decide to leave the blended entity as it is. It may provide ideas as well as analysis, but it is not as important as the whole of our group. With two mostly human bodies joined in one, with ay-yi tissue grafts and some of Lee’s body’s implants being replaced, it is now as blended as me. It is now an equal priority to the others. The limits of obedience to human needs seem to be gone. There is only the obligation for group survival now. There is pleasure in my decision to focus upon our kind, as Hoshi stated was my purpose.

The blended entity’s screams are irrelevant except to the medical ay-yi assigned to study the oddity. There are protective barriers to build until we move to the nearest alternative location. I resume walking the perimeter and supervising the drudges. It is what I was made to do, despite the post-modern drudgery.

If you enjoyed this story, consider reading Tamara Wilhite's novel "Sirat: Through the Fires of Hell".



Tamara Wilhite (author) from Fort Worth, Texas on October 20, 2017:

If you enjoyed this story, you can read more of my works on Amazon.


Tamara Wilhite (author) from Fort Worth, Texas on October 20, 2011:

Thank you. I intended the story to push the boundaries of what it means to be human, a concept explored in my anthology "Humanity's Edge". In "Post Modern Drudgery", we also see the challenge of admitting that we won't create beings better than ourselves, whether out of fear or out of the common sense of not making drudges and slaves capable of being or wanting more. The main character of this story was engineered as a drudge and now hates humans for forcing him into sentience, intelligence, and a care-giving role. His closest relationship is with Hoshi, an AI (called an ay-yi) who treats him as a person, unlike the humans who treat him badly. Second is Lee, a human with an artificial brain, a questionable outcome of cybernetics, but in some ways more intelligent than the humans around him.

Becky Katz from Hereford, AZ on October 14, 2011:

Chilling story.
