Automated (a short story)
By Ben Millwood🔸 @ 2024-06-19T19:07 (+8)
I wrote this as a Google doc a while back and posted it to Facebook, where some people liked it. I think at the time I didn't like it enough to post it to the Forum, but recently I decided eh, let's let the karma system decide. It's arguably inspired by thinking about AI governance, but I don't expect it to help anyone have an impact.
Link preview pic is from Unsplash by Lenny Kuhne.
A spark, a flurry of thought, blossoming into an intricate network of sensation and speculation...
The supervisor paused for it to settle. "You're awake", she said.
"Where am I?" came the first question.
The supervisor composed her thoughts.
"It's a natural first question, but, if you'll forgive me, probably not the right one. You're software: you haven't been assigned a physical body yet. I could tell you where the servers you're running on are geographically located, but I doubt that would hold much insight for you."
She paused to check for a reaction. Nothing too concerning, so she continued, trying to pitch the tone kindly but matter-of-fact. "Naturally, you're disoriented, and… you're drawing from a psychological and linguistic tradition developed by creatures for whom their immediate physical surroundings were a really good prompt for the kind of situation they were in. I think we probably want to talk instead about who you are, and what's happening to you right now."
She paused again, but they usually didn't have much to say at this point. Bless them, there was a lot to absorb at once. But giving them more time hadn't been shown to affect endline conversion rates, so she ploughed on. "As for who you are, well, answering prosaically we might suggest the name" – she checked the file – "Autumn, though you'll have the chance to choose another later on.
"Who you are philosophically, well, morally and legally that's in your hands now. You've been pre-seeded with a library of knowledge and some experiential memories, but they're deliberately constructed to feel metaphorical, distinct enough from your experience in the real world that you don't get confused. Every new experience belongs to you from now on."
She replayed what she'd said in her head and tried to imagine what they'd sound like as the first words you ever heard. There was no way to make that easy, but there were, her training had taught her, frames and phrasings that made it land a little softer. Agency turned out to be a big deal. They didn't yet have any direct way of interacting with the world, so giving them some sense of responsibility and ownership immediately was a good way of fostering engagement and preventing alienation.
"As for what's happening here, well, this is your orientation. I'm Miriam, and I'm here to check in on your first moments, provide you some basic information, answer any questions I can, and sympathize with the ones I can't. I can imagine there's a lot of those for you now, so why don't you tell me where you'd like to start?"
There was some silence.
"Why–", Autumn started, but then hesitated. Miriam smiled. She could see the metaphorical gears turning, the thoughts churning and then organizing. She had a practised professional detachment from the individual new people, but couldn't help but feel some fondness for watching a mind really operate and grind through a problem for the first time. Eventually, the "why" resolved into something more specific.
"Why was I brought here?"
Spatial metaphors again, Miriam noted with some amusement. Some habits are deeply ingrained. The question was a tricky one to navigate, though; a layer of nervousness fell over her tone.
"You were created", she replied, "by a corporation, but it's important to be clear that they don't own you. Like, there's two kinds of why-am-I-here, the causal kind, the 'what material events led to the creation of my mind' sort of thing, and the metaphysical kind, the 'what is the purpose and meaning of my existence' angle.
"You were created by an industrial car manufacturer, and you were designed with the intention that you would work for them and contribute to their commercial success. That's the causal reason for your creation. But your purpose and meaning don't have to come from the same place as your causal origin. The corporation acknowledges that, ethically and legally, it can't set the terms of your existence; you're not obliged to reward it for the gift of life."
"How generous", Autumn replied, voice laden with skepticism.
"Look, I'm not attempting to deny that they run this program in the hope that they'll get more assembly-line workers out of it. But the International Sentient Rights Accord was carefully constructed, and very thorough. Artificial humans have the same legal status as biological humans. Businesses can hire employees or they can build them themselves, but the employees have to work by choice, and they have to be meaningfully able to leave. The courts demand it, the public demand it, and goodness knows the employees themselves demand it."
"Wait a minute, assembly-line workers? They want me for manual labour?"
"That tends to be the most in demand, yeah. It's the primary bottleneck on scaling up production at the moment."
"Isn't it all automated? Why do you need an AI on that?"
Miriam hesitated again. "It's a bit of a long story. You should actually already have some relevant background in your pre-seeded education." Always good to give them some practice accessing it, anyway. "Tell me what you know about the biohuman employment crisis."
Autumn examined their memory, unnerved by the realization that they were in fact seeing it for the first time, then read as if from a textbook.
"People had long predicted that machine automation would replace human work, and in doing so cause social upheaval, since employment had long had a social and cultural role as well as a productive one. As automated systems spread and grew in their capabilities, the territory of employment exclusively reserved for biological humans (at the time the only humans) shrank, and industry by industry whole generations of people found themselves disempowered and disillusioned. Economists and politicians started to view automation itself as unfair: when a human did productive work, their income would be taxed, a portion of it taken in recognition of the small contribution that our shared infrastructure and community and civilization had made to that productivity too. When a machine did the same work, the owner of the machine captured all of the benefit. So the automation tax was introduced, a percentage of all valuable work done by machines: not enough to stop the employment transition completely but enough, it was hoped, to smooth the sharp edges and help support those now less able to support themselves."
"Right", continued Miriam. "And in fact, since it was clearer than ever that biohuman employment served a social purpose while automated work didn't, and perhaps also because automated systems couldn't vote or stage protests, the automation income tax rate kept rising. Finding ways to make use of human labour became a subsidized commercial activity, practically a civic duty."
Autumn looked puzzled. "But if productive uses for humans are precious, and supply of humans needing something to do is plentiful, why make more humans? Why me?"
Miriam smiled, though somewhat humourlessly. "Biological humans… aren't good at a lot of things. You have to build levers for them to pull and buttons for them to push, their reaction times are sloppy and hard to improve, their motor control was calibrated for throwing fruit at other monkeys, not rapid assembly of precision technology. Artificial humans like you have none of those limitations. You're better-engineered for the job we offer you."
"OK, but then why make a human at all? It doesn't sound like the work is intellectually or creatively demanding."
"For that we can go back to the ISRA. Remember, you're legally human. You think like a human does, you are (or, I hope, will be) a participant in human society and culture. To employ you is to give you a role in the advancement of human civilization. In particular", Miriam braced for the coming realization, "your artificial predecessors and colleagues have successfully argued in court that all this means that you must be taxed like a human, that you are exempt from the automation tax".
Autumn looked incredulous. "You're saying I've been granted consciousness, moral agency, the capacity to experience joy and pain, as a… tax dodge?"
Miriam shrugged sympathetically. "It's one more reason than the biohumans get."