Sorry, You are not Human

Jean De Meyere
Doctoral Researcher at KU Leuven's Center for IT & IP Law

Jean De Meyere is a Doctoral Researcher at KU Leuven's Center for IT & IP Law, where he focuses on questions related to Artificial Intelligence. He participates in the MANOLO project, which is concerned with the development of efficient, trustworthy AI for Cloud-Edge Computing. Jean began his PhD thesis at UCLouvain in 2022 under the supervision of Prof. Alain Strowel, entitled "Co-regulation and Transparency as Key Principles for Fighting Online Disinformation under EU Law." His research focuses on disinformation, AI, privacy, platform governance, and the distribution of normative power within the digital realm. Before entering academia, Jean worked for two years as a Data Privacy Consultant for Computer Task Group Belgium, serving various clients, including the DG IT of the European Commission.


In his cozy city apartment, John’s alarm clock commences its daily ritual of ringing. John stirs from a light slumber, gently roused as his smart mattress calculates the optimal moment to extricate him from dreams. The curtains, synced to his sleep pattern analysis, have already retracted, flooding the room with natural light to facilitate the awakening process.

John rises, feeling foreseeably well-rested, and makes his way to the kitchen. There, a cup of coffee awaits, having finished brewing precisely as he’s ready for his morning dose of caffeine. Inside the smart oven, a breakfast tailored to his day’s requirements sits ready. Its nutritional content has been meticulously determined based on John’s agenda, ensuring he receives the exact nutrients needed for the day’s tasks.

Following his routine, John engages in a morning workout meticulously orchestrated by his e-training assistant. After completing his exercise, he takes a shower, puts on the clothes his artificial assistant selected for him, and steps onto the street. A smart car is already waiting to convey him to work. The journey, lasting twenty-five minutes, is shared with colleagues who reside nearby and have also just completed their morning routines. Conversations during the ride revolve around mundane topics, like the algorithmically elected Miss France and the outcomes of the previous day’s football match.
Upon arrival at the Global AI local branch, John joins his numerous colleagues in refining and fine-tuning the myriad decisions constantly made by the Global AI algorithm. As the largest employer worldwide, Global AI oversees the Big Data Society it has created, ensuring smooth operations. Its robust algorithms, the vast amount of processed data, and the contributions of millions of human ‘refiners’ have enabled Global AI to replace traditional decision-making mechanisms, ostensibly liberating humanity from the burdens of governance and fostering a society where individuals purportedly thrive and feel fulfilled.

At 5:00 PM, John exits the building. Another car awaits him while his colleague, Mark, prepares to mount a motorcycle.

“Wow, nice bicycle,” John remarks.

Mark turns, his expression a mix of confusion and unease, perhaps even a trace of fear. John quickly realizes his error, feeling a surge of awkwardness — confusing such a magnificent motorcycle for a mere bicycle could only be seen as foolish.

“Uh, yeah, I mean… nice ride, man!” John attempts to recover, seeking to regain his poise.

Mark nods, indicating that the mistake is of no great consequence. Relieved, John enters the vehicle and glances at the tablet screen in front of him, displaying his schedule for the evening. A sushi restaurant is on the agenda. It strikes him that he has been craving sashimi all day, a desire he hadn’t consciously recognized until now.

While eating his nigiri, John’s gaze drifts to the next table, where a couple is also enjoying sashimi. He muses about his own love life, wondering when he will find someone. Perhaps he’ll go clubbing later? The decision, after all, isn’t entirely his to make — the algorithm knows best, and he, as one of its custodians, should trust it implicitly.

11:00 PM. The car has returned him home, and John lies in bed. Having watched a movie and completed his evening routine, he feels relaxed and ready for another day at work. Tomorrow night he might go out, he thinks. If that’s what he desires, the algorithm will undoubtedly know.

In his compact city apartment, John awakens, not to the usual symphony of technological precision, but to an enveloping darkness. A dream, vague and unsettling, where he sailed with a childhood friend whose name now escapes him, clings to the edges of his consciousness. He’s still cloaked in drowsiness, but a glance at the clock ignites a spark of panic — he’s perilously close to being late for work.

Frantically, John dresses in haste, not realizing his clothes weren’t laid out by the assistant as per routine. Skipping breakfast and his exercise regimen, he hurries down the stairs, his footsteps echoing in the empty corridor, a stark contrast to the usual orchestrated morning ballet.

Upon reaching the street, John is met with an eerie silence — the automated car that invariably awaited him is conspicuously absent. Confusion mounts as he retrieves his phone, discovering the ride-sharing app’s automated mode disabled, likely a glitch, he assumes. His attempts to reactivate it prove futile. Typing in the address of his workplace, a wave of shame washes over him — he cannot recall it.

Pausing, John is struck by the gravity of the situation. It’s not just a minor oversight; he spends as much time at work as at home. His morning’s upheavals now seem like auguries of a deeper malaise. Yet, duty calls. A hasty web search for his employer’s address and a six-minute wait later, he finds himself alone in an automated car, heading to work.

Upon arrival, his day spirals further into disarray. The smart security system, which should have greeted him like an old friend, stares back blankly, unrecognizing. He watches the minutes tick by in the lobby, each one amplifying his growing unease. It’s only by sheer chance, spotting a colleague amid the early morning crowd, that he gains entry.

Before he can even grasp a moment of respite, John finds himself before Angela, his boss. His attempts to explain the morning’s chaos fall short. His words, intended to be coherent explanations, devolve into a jumbled stream, tinged with frustration and a rising voice. His cheeks flush with embarrassment as he attempts a hasty apology, only to see Angela’s mild irritation morph into outright exasperation.

“Clearly, something’s amiss with you today, John,” Angela says, her tone firm yet not unkind. “You’re a valued member of our team, but such behavior is unacceptable. Take the rest of the week off. I’ll arrange for you to see one of our company’s personal therapy assistants.”

As he stammers an apology and stands to leave, Angela stops him. “Your professional devices, John. And your badge. It’s protocol.”

“But my personal phone isn’t with me. How will I get home?” John’s voice is a mix of disbelief and desperation.

Angela’s response is curt, almost robotic. “You’re off duty, John. It’s not a concern for the company.”

Shocked by her cold demeanor, John relinquishes his devices and badge. Clutching his old-fashioned home keys — a vestige of his father’s era — he steps outside, feeling more disconnected than ever. He spots Brad, a colleague, in the smoking area.

“Hey Brad,” John calls out, a note of hesitancy in his voice. “Could I borrow your phone? I’ve just been…” He pauses, the word ‘suspended’ feeling alien on his tongue, “…suspended, and I don’t have my phone to call for a ride.”

Brad looks at him with a mix of suspicion and curiosity. In their world, suspensions are a rarity, almost a myth. Nevertheless, Brad hands over his phone, along with an offer of a cigarette. John declines; he’s always disliked the smell of tobacco, a preference clearly reflected in his automated shopping lists, in stark contrast to Brad’s evident habits.

John opens the ride-sharing app, only to be confronted with another unexpected hurdle. His home address, a detail he should know by heart, escapes him. He knows the way to the park, to the city center, but the exact address? It’s a blank.

Attempting to log in, he’s faced with an antiquated captcha challenge — a relic from a past era. It asks him to identify traffic lights in a grid of images. John chuckles dryly; it’s been years since he’s seen such a captcha. His life, so intertwined with advanced AI, has never required him to pay attention to such mundane details. He carefully selects the images, guesswork for someone who, like most citizens, has never driven a car due to prohibitive insurance costs.

The screen flashes an apologetic message: “We are deeply sorry, but we could not assess your human character in a satisfactory manner. Please retry in five minutes.” John stares at the message, feeling a growing sense of despair and disorientation. He hands the phone back to Brad, noticing the wariness in his colleague’s eyes.

“You know what, Brad? It’s a nice day. I’ll walk,” John says with a forced smile. “I’ll see you next week. Say hi to the wife for me!”

Back at his home, the afternoon slowly slips away as John navigates through a series of detours, still unfamiliar with the direct path to his own apartment. When he finally arrives, the absence of a prepared meal greets him, but this no longer surprises him. He hastily eats a breakfast bar, the only quick sustenance available.

In the shower, John struggles with temperature control, a task made challenging without the automated settings he’s accustomed to. The water is either too hot or too cold, a minor annoyance that nonetheless adds to his growing sense of frustration.

Feeling worn out, John rummages through his belongings for his old phone. This device, long unused, serves as his immediate link to the intelligent systems of his apartment. He discovers that a crucial update, which should have been installed automatically around 3 AM, failed to do so, leading to the day’s series of technological mishaps.

With a sense of relief, John manually initiates the update, addressing the problem that has caused so much chaos. It strikes him as odd that the automated security systems hadn’t flagged this issue earlier. He makes a mental note to discuss the anomaly with Angela, curious about the oversight. As night falls on the city, he drifts off to sleep.

The night still cloaks the city in darkness, and the curtains in John’s apartment remain drawn. Abruptly, his phone rings, jolting John from his sleep. Groggily, he reaches for the device and is surprised to see his boss’s name on the screen. He answers with a confused greeting.

“Angela? Hi, good morning. What’s the ma…”

John’s words are cut short as Angela interrupts him. Her tone is unusually tense, her words rushing out faster than normal. John senses a mix of unease and embarrassment in her usually composed voice.

“Listen, John… I don’t know how to say this, but… there’s been an update to your human index. It’s likely just a technical issue, something that’ll be fixed soon. However, until then, your work clearances with automated tools and social supervision are revoked. I’m sorry, but this violates your employment agreement with Global AI. We have to let you go, effective immediately. Since the human index is your responsibility, you don’t qualify for compensation. As a goodwill gesture, you can use the apartment for ten more days to get back on your feet… You’ll also hear from the AI-human evaluation authority soon… And if things get resolved, we can discuss reinstating you.”

John is stunned. Yesterday was rough, and he knew his office behavior was erratic, but fired? He hadn’t seen this coming, especially not for a technical glitch. As he tries to respond, Angela hastily concludes.

“I have to go now. I’ll connect you with HR for the details. Take care, John.”

Dumbfounded, John half-listens as the call is transferred to a monotonous HR representative. Midway through the HR spiel, another call comes in, this time from an unfamiliar number with a ‘000’ prefix – the Global AI and Society Council. John cuts off the HR rep, seeking answers about the “technical error.” The voice on the other end, smooth yet slightly robotic, is unmistakably from an automated Council agent.

“Citizen John McCormick. Your recent behavior review has led to an update on your human status. Please report to the Global Council Court at 9 AM for a decision review. For details and potential outcomes, refer to our Terms and Conditions. Have an excellent day.”

John glances at the clock; it’s almost 8 AM. The Court, likely in the city center, is a seven-minute drive – if he can get a car. Otherwise, it’s a half-hour walk. Rushing, John forgoes breakfast and heads to the shower.

John arrives at the Court, a sweat-drenched figure dwarfed by the neo-futuristic edifice towering above. The building, a stark blend of white and glass, asserts its dominance over the neighborhood with an almost aggressive architectural language. His awe at its grandeur is short-lived, overshadowed by an unsettling discovery: every Global AI application on his phone has been rendered inoperable, leaving cryptic error messages in place of familiar interfaces. Bereft of technological guidance, John is forced to navigate on foot, relying on the looming tower in the skyline as his beacon.

Upon entering the building, he traverses the expansive lobby to approach the reception. The receptionist, a tall man with penetrating brown eyes and an expression etched in stoicism, greets him in a voice devoid of warmth or inflection.

“Good morning, Sir. How may I assist you?”

John, breathless and disoriented, quickly recounts his ordeal – the unexpected phone call, the unresponsive AI services, his unfamiliarity with the route – all culminating in his slightly delayed arrival.

The receptionist’s face contorts into a disapproving sneer as he produces a badge from beneath the desk. He gestures dismissively towards the bank of elevators.

“I’m not the one you need to explain your tardiness to,” the clerk says, his emphasis on ‘tardiness’ adding to John’s discomfiture. “The judge is in Room 40-234. Take the first elevator on your left, fourth floor, turn right. You’ll find the room at the end of the corridor.”

As John turns towards the elevators, the receptionist’s voice softens unexpectedly, prompting him to look back. The man’s eyes, previously cold, now hold a flicker of empathy.

“Sir,” he says gently, handing over the badge. “Don’t fret about being a bit late. It might even humanize you in front of the judge.”

The phone at the reception desk rings, cutting short the exchange. Left alone with these parting words, John feels as though a dam has burst within his mind. The day’s challenges, initially mere frustrations, now coalesce into a horrifying realization: society no longer perceives him as a human but as an algorithm inhabiting a human shell.

Rooted to the spot, John grapples with the enormity of this revelation. The receptionist, engrossed in a new call, snaps John out of his stupor with a wave and a glance at his watch. Aware of his lingering, John musters his composure and proceeds towards the elevator. Inside, he hits the button for the fourth floor, his mind racing with questions. Why has the infallible system deemed him non-human? What could have possibly led to this existential misclassification?

John sits nervously in front of the judge. He expected to be in some sort of courtroom, but he finds himself in a grey, lifeless office. The four walls are empty, with no sight of any decoration nor personal touch. Everything in the room, from the white acrylic furniture to the stern black outfit of the judge, emanates the striking impartiality of Justice. The judge, a middle-aged woman, scrutinizes John under her thick glasses.

“Well, good morning. It is… nine thirteen…”

The judge’s disapproving tone makes John think that the receptionist’s intuition, that a slight delay might work to his advantage, was plainly wrong. John starts to mumble an excuse, but the judge raises her hand, leaving him speechless.

“No, Sir, I do not need to hear any excuses. After all, I do not even need to hear you at all. You have worked with Global AI for your whole career, so you probably already know this, but I would like to take a moment to stress it: the Global AI ecosystem does not make mistakes. If it has regressed your status, it is for a good reason. You are not here today to discuss these results, but merely to hear how the decision was taken and what the consequences will be for you. So, please, relax and hear what I ha…”

This is too much for John. Just as in Angela’s office yesterday, he feels his frustration and fear reach a boiling point, and he erupts, standing up to take a firm stance against what he considers a fundamentally unjust situation:

“What? What do you mean, I cannot speak? I have been living through total hell and received little to no information about my situation! No mistakes? Of course there can be mistakes. Well, of course, they are fixed before consequences happen; at least, that’s what I have always seen on the job. But clearly, here, that’s not the case! Can’t you try to understand me? Can’t you try to be human?”

A mechanical noise suddenly fills the room, cutting John off in the middle of his passionate speech. In the right-hand corner of the room, a panel of the wall has slid open, revealing something John has only seen in training sessions at the office: a military-grade peace robot, the CRX-34, one of the most brilliant innovations in automated peacekeeping. The judge looks him right in the eyes, and she does not need words to convey her thoughts: should John keep challenging her authority, she will make sure that legitimate force is used to subdue his vociferations. John gulps and sits back. Silently, the judge presses a button, causing the sliding panel in the wall to close again.

“Sir, I urge you to sit down and stop these displays of resistance. If not, I shall have no choice but to take harsh, yet necessary, measures.”

As she says this, John notices her eyes locking on the button she just pressed. He feels a shiver running down his spine as he sits back in his chair, showing his open hands as a sign of acceptance and submission.  The judge smiles, satisfied, and opens a folder on her computer. John cannot see the screen, but he assumes all the explanations surrounding those past few days must be shown right there, on the other side of the black monitor.

“So, let’s see… Two days ago, at thirteen past five, our global surveillance services notified the Global AI human assessment system of a first incident, which prompted a deeper investigation. This incident consisted of the non-recognition of elementary humanly identifiable elements; in the matter at hand, the difference between a motorcycle and a bicycle. As I would like this meeting to be crystal clear, I would like to remind you that the difference between these two means of locomotion lies in the energy used by the vehicle: while motorcycles rely on electrical energy only, bikes usually rely on a mix of electrical and manual energy, requiring the user to perform the physical activity of ‘biking’…”

The judge goes on about the differences and characteristics of bikes and motorcycles, leaving John baffled. When the judge told him that he’d hear more about his case, he had believed it would concern the actual facts of the decision and the reasoning behind it, not a lecture on the differences between two of the most widely used vehicles in the city. As he opens his mouth to voice his concerns, the judge suddenly stops and coldly locks her eyes onto John’s before nodding towards the revolving panel on the nearby wall. John gets the message: he shuts up and waits for her litany to start again.

“This ‘bike incident’ prompted our investigative teams to open a file on you. Errors by humans on such ‘real-life captchas’ are quite rare, justifying the intervention of the Global AI Research Team on Non-Human Fraud. Here, it seems there is something positive in your file: based on the aggregated number of actions you have performed in your lifetime, the investigation calculated that you have a seventy-eight percent chance of being human. This is quite a reasonable score, after all!”

John is still sitting silently. He does not share the judge’s optimism. After all, she said right away that his case was settled. Furthermore, a little music has been playing in his mind for the past hours, growing stronger and stronger. A little music that questions the very fabric of his humanity. A little music that goes like this: “Are you sure you’re human?” At first, John thought it was absurd. Of course he’s a human. He knows artificial intelligence systems very well, after all, and he’s quite sure he’s not one of them. He was one hundred percent sure of his human character. The judge has just lowered his certainty by a significant margin, and John starts to consider that the little music at the back of his mind might be telling a grim, but nonetheless unavoidable, truth.

“So, now, let’s focus on what happened after the ‘bike incident.’ The system tried to initiate a restart, which caused you to be late for your job… Of course, they had to suspend you; you cannot employ someone as a human worker if he cannot provide the credentials vouching for his humanity… I see there has been a report on some aggressive tendencies, consistent with what we just saw, actually, but I don’t see how that would have prompted… I mean, this seems really human to me… But the system is never…”

At this point, a little ray of hope shines through the dark clouds surrounding John’s mind. If the judge, who seemed so certain at the beginning of the talk, starts having doubts, maybe… maybe there is still hope? The more the incomprehension on the judge’s face grows, the more John’s hope starts to brighten his soul. But suddenly, the judge points a finger at the monitor, and a triumphant sound erupts from her throat:

“Aaaaaah! Here it is. I knew the system was right. Well, it’s pretty clear, Sir: just the day after you missed the first captcha test, you failed another one, an even simpler one. Within twenty-four hours! And of course, with just a seventy-three percent result… Yes, that makes perfect sense: you are indeed an automated system. Well, this means that you’re not so much my responsibility after all. You may leave your personal belongings, phone and keys, here. You won’t need them after all. Technical services will contact you shortly for your reconfiguration, so just wait for their message; they should be with you by the end of the day and give you a new mission, more suited to your artificial nature.”

The small hope that had been growing inside John’s mind suddenly explodes into a thousand pieces. Vanquished, he does not even try to protest further – what’s the point anyway – and leaves the room, not really knowing where to go next.

It’s been thirty minutes since John was awakened by nearby noises, but he’s still lying on his mattress. The weather is quite cool today, but John can feel the sun heating up his little tent; he sure is glad the winter is ending. He opens the zipper of his makeshift home and steps outside. The sun is high; it has just cleared the huge overpass under which John sleeps, where thousands of self-driving cars transport people from their apartments to their jobs. Maybe one of his former colleagues is riding above his head right now, John thinks.

John never underwent reconfiguration. He did have a meeting with “technical services,” but the engineer quickly determined that he was not an artificial intelligence or, if he was, not one up to current Global AI standards. Therefore, John was dismissed. Devoid of any status, machine or human, he was left to wander the streets with no aim and no recognition of any kind. In the end, John had no choice but to leave the city he’d known his whole life and roam beyond the borders of the civilized world run by the Global AI corporation.

“’Morning, John! Can I interest you in a cup of coffee?”

John smiles as he hears the trolley of one of his neighbors, Brittany, rolling towards him. Brittany’s coffee cart is famous around the tent neighborhood. She is always on the lookout, jumping on you to offer a cup of dark, flavorless coffee she scrounges from God knows where. He grabs a cup from the bag inside his tent. As he smells the liquid pouring into the cup, he looks at the sun above his head. He can still hear the little voice in his head wondering whether he’s a human or a machine; how could he ever know?

“Here you are, Johnny! The sun’s well out today; we’re gonna have a good day, I think!”

“Yes, yes, you’re right. It looks like it will be,” answers John, still smiling.

John takes a sip of coffee. Terrible, yet it fills him with a sense of true happiness. If he can feel happy, he thinks, maybe he’s not a machine after all?
