Programmable Emotion: What Sci-Fi Gets Right (and Wrong) About Human Code

We’re used to the idea that technology can shape our behaviour. That much is undeniable. UX designers talk about “nudges,” “dark patterns,” or “frictionless flows” as if they were neutral mechanics. Programmers write functions that filter, sort, and prioritise. Designers choose colours and shapes that invite one click over another. Human-Computer Interaction (HCI) has always been about more than buttons and screens; it’s about guiding attention, anticipation, and trust.

But what happens when the logic of UX, design, and code collides directly with the most unprogrammable thing of all: human emotion?

This is the question I’ve been circling in my work as a designer and coder — and the obsession that fuelled my novel The Siren’s Code. In it, I explore a near-future world where emotion itself is treated as a weakness, something that can be “upgraded out” of the brain. Neural code becomes just another system to patch, tweak, and recompile.

Fiction gives us the freedom to exaggerate. But here’s the strange part: when you peel back the layers, sci-fi’s imaginings about programmable emotion aren’t all that far from the everyday UX choices we already make. Some predictions are spot on. Others fall flat. And a few are quietly terrifying in how plausible they feel.

In this article I unpack what sci-fi gets right, where it misses, and what it means for those of us designing, coding, and living inside digital systems right now.

What Sci-Fi Gets Right: The Fragility of Choice

Pick up almost any speculative novel with tech at its core, and you’ll find one recurring theme: the fragility of human choice.

In Black Mirror’s “Nosedive,” social ratings determine access to homes and flights. In Brave New World, chemical mood regulators ensure social order. In my own novel, an augmentation called QR2 — Question Rationale Removal — cuts out the mental hesitation before making decisions. No second-guessing. Just action.

What these stories get right is how tiny nudges compound into tectonic shifts. That’s something UX designers know instinctively.

  • The difference between a red and a green button can change click-through rates by double-digit percentages.

  • A notification ping at the right (or wrong) moment can derail hours of attention.

  • An algorithm adjusting your feed by 0.01% shifts your worldview without you noticing.

Designers and coders already wield this influence. Sci-fi just dials it up to eleven. When we imagine neural implants or emotion-suppressing drugs, we’re not starting from scratch; we’re extending a truth we live every day: our emotions are entangled with the systems we use.
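How tiny nudges compound can be made concrete with a toy model. This is a sketch, not data from any real platform; the initial share, daily bias, and horizon are illustrative numbers I’ve chosen:

```python
# Toy model: a feed that favours one topic by a tiny fraction each day.
# All numbers are illustrative, not measurements of any real system.

def compound_bias(initial_share: float, daily_bias: float, days: int) -> float:
    """Return the topic's share of the feed after `days` of small nudges."""
    share = initial_share
    for _ in range(days):
        share *= (1 + daily_bias)   # each day the algorithm favours it slightly more
    return min(share, 1.0)          # a share can't exceed the whole feed

# A 0.1% daily tilt moves a topic from ~30% of the feed to ~43% in a year.
after_year = compound_bias(initial_share=0.30, daily_bias=0.001, days=365)
```

The point isn’t the exact figures; it’s that the per-day change is imperceptible while the cumulative drift is not.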

Where Sci-Fi Goes Too Far: The Binary Illusion

If sci-fi is right about fragility, it often stumbles in how it depicts control. Too often, writers imagine emotions as switches: on/off, love/hate, fear/calm. Insert a chip, flip a toggle, and you’ve “solved” a character’s humanity.

But anyone who has built even the simplest interaction knows human response is never binary.

Think about colour palettes. In one project I worked on, Chorus Families, I began with muted, gentle tones meant to feel safe and calm. But feedback revealed it felt flat, even clinical, ‘like a waiting room wall colour’. So I moved to brighter primary colours with clearer accents. The shift made the site feel warmer, more accessible, and more hopeful, and user engagement rose as a result.

This is what sci-fi often misses: you can’t program emotions the way you’d set fear = false in code. You can only mediate them. Shape their expression. Create conditions where certain reactions are more likely than others. Emotions are not bugs in the code. They are the code: recursive, messy, adaptive. If you try to flatten them into binaries, you end up with cardboard people, not characters (or users).
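The gap between flipping a switch and mediating a state can be sketched in a few lines. The function names and the 0-to-1 scale are my own illustrative choices, not a real model of affect:

```python
# Two toy models of "programming" fear.

def binary_fear(chip_installed: bool) -> bool:
    # The sci-fi fantasy: install the chip, fear = false.
    return not chip_installed

def mediated_fear(baseline: float, design_nudges: list[float]) -> float:
    """Fear as a continuous state (0..1) that design can shift but never delete."""
    level = baseline
    for nudge in design_nudges:
        level += nudge                 # calm colours, a reassuring message, a pause...
    return max(0.0, min(1.0, level))   # clamp: the state persists, only its level moves

# Several calming nudges attenuate the emotion; they don't remove it.
residual = mediated_fear(baseline=0.7, design_nudges=[-0.2, -0.1, -0.15])
```

The binary version deletes the variable’s meaning; the mediated version keeps the state in play, which is closer to what design actually does.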

Emotional States in Design

Where speculative fiction is at its best is in the messy middle ground: not total control, not total chaos, but emotion as a design element.

In my book, the protagonist Cassie accidentally triggers an electromagnetic pulse with her pendant, scrambling the neural upgrades of a mercenary. What interested me wasn’t just the sci-fi spectacle, but the emotional fallout. He’s supposed to be a perfect, emotionless operative, and now he finds himself leaking feeling in ways that impact his abilities.

That’s fiction, but the principle is real. As UX designers, understanding our users’ emotional state when they arrive at a product is essential. Are they exhausted after a long day? Have they just received bad news? Are they distracted, stressed, or perhaps ecstatically happy? Each of these states changes how they perceive colour, wording, timing, and interaction. Designing well means walking in their shoes, not assuming a neutral baseline. Think of the small interventions we already build:

  • The pause before you can re-send an angry email.

  • The “are you sure?” prompt before deleting all your files.

  • The one-click buy that removes pause altogether, leading to regret.

These moments matter not because of friction alone, but because they intersect with a user’s state of mind. Sci-fi at its best recognises this truth: emotion isn’t eradicated by tech; it’s rerouted through it.
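Designing for that pause doesn’t need neural hardware; an undo-send window is a few lines of state. A minimal sketch, with a hypothetical class and delay, not any real mail client’s API:

```python
import time
from typing import Optional

class DelayedSend:
    """Holds an outgoing message for a short window so the sender can reconsider."""

    def __init__(self, delay_seconds: float = 10.0):
        self.delay = delay_seconds
        self.queued_at: Optional[float] = None
        self.cancelled = False

    def send(self) -> None:
        # Queue the message rather than dispatching it immediately.
        self.queued_at = time.monotonic()
        self.cancelled = False

    def undo(self) -> bool:
        """Cancel the send if we're still inside the hesitation window."""
        if self.queued_at is not None and time.monotonic() - self.queued_at < self.delay:
            self.cancelled = True
            self.queued_at = None
            return True
        return False
```

The design choice is the window itself: a deliberate few seconds of friction inserted between impulse and consequence.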

Programmers, Designers, Anthropologists

One of the most compelling ideas I’ve carried into both my fiction and design practice is the notion of the Future Anthropologist — a figure in my novel tasked with preventing humanity’s extinction, enslavement, or radical alteration by technology.

It sounds lofty, but I’d argue that’s what we’re all doing in UX and HCI. We are, whether we admit it or not, anthropologists of the present, shaping the future.

  • When I choose whether a signup flow defaults to “share my data,” I’m not just coding; I’m deciding what privacy means in practice.

  • When I design a community platform, I’m not just arranging buttons; I’m defining how belonging feels online.

  • When I test colours, typography, or motion, I’m not just choosing aesthetics; I’m building associations that will echo years later in someone’s memory.

Sci-fi exaggerates this into neural chips and dystopias. But the core is the same: we are already programming emotional landscapes.

What We Risk Getting Wrong Now

If sci-fi’s mistake is treating emotion as binary, ours might be treating it as secondary. Too often in tech, emotion is relegated to “UI polish” or “delight.” The serious work is considered logic, function, performance.

But emotion is not frosting on the cake. It’s the flour in the dough. Strip it out, and the structure collapses.

Consider how many startups talk about “optimising productivity” while their users quietly burn out. Or how AI companies herald “human-like conversation” without asking what it means to feel heard. Or how endless scroll designs insist they are “neutral,” even as they drive loneliness and envy.

The sci-fi stories that linger — from Blade Runner to Her to The Siren’s Code — are the ones that take emotion seriously. They don’t ask, “what if we could remove it?” They ask, “what happens when we forget it’s always there?”

Lessons for UX and HCI

So what do we do with all this as designers, coders, researchers, storytellers? A few thoughts that have emerged from my own work:

  1. Design for hesitation. Build in moments of pause. Not everything should be one-click. Give space for emotion to catch up with action.

  2. Respect ambiguity. Don’t assume emotions map neatly to colours, icons, or words. Test, listen, adapt. What reads as calming in one culture reads as cold in another.

  3. Assume entanglement. There is no “neutral” system. Every function, every nudge, every flow is also emotional design.

  4. Tell the stories. Fiction isn’t escapism here. It’s rehearsal. It helps us ask, “what if?” before we ship features into the world.

  5. Be the anthropologists. Think not just about today’s user metrics, but tomorrow’s cultural impact. What will this design mean when someone looks back in ten years?

Why This Matters

At its heart, the question of programmable emotion isn’t about whether sci-fi is accurate. It’s about whether we are paying attention. The line between speculative fiction and design practice is thinner than we admit. Sci-fi writers borrow from tech’s trajectory; designers borrow from sci-fi’s warnings. In between, users live with the consequences.

When I write about Cassie and Jay — one chaotic, emotional, instinct-driven; the other a controlled, augmented mercenary — I’m not just spinning a thriller. I’m asking the same questions I ask at the whiteboard or in code:

  • What is gained when we suppress hesitation?

  • What is lost when we strip friction away?

  • What kind of humans are we building towards?

Sci-fi’s answer is often bleak. Ours doesn’t have to be. But only if we recognise that emotion isn’t the soft, irrational part of the system. It is the system.

Closing Thought

Sci-fi gets it half-right. It’s right to fear the manipulation of human code. It’s wrong to think we can ever fully tame emotion with tech. For those of us working at the intersection of design, code, and story, the task is not to fantasise about perfect control; it’s to stay humble in the face of complexity. To remember that a lavender button can change someone’s day. That a pause can save a life. That a story can shift how we see ourselves. Because if we don’t think like anthropologists now, we may find ourselves living in the dystopias we once dismissed as fiction. And no amount of toggles, chips, or upgrades will code our way back.

Abi Fawcus. Designer, coder, author — I help people navigate technology today, and imagine where it might take us tomorrow. Say Hi