The Science (and Fiction) of Mind Cloning
- Mike Lamb

- Jul 30
Black Mirror’s “USS Callister” first beamed onto screens in late 2017, wrapped in the familiar aesthetics of a Star Trek tribute – high-necked uniforms, retro sets, and a Shatner-esque captain at the helm. But beneath the camp was something far darker: a lonely, resentful tech genius – Robert Daly (Jesse Plemons) – trapping digital clones of his co-workers in a sadistic simulation of his own design. A classic descent into techno-horror from the brilliant mind of Charlie Brooker.
Seven years on, fans of the anthology series have been treated to a sequel – USS Callister: Into Infinity – an equally dark and unsettling meditation on power, consent, and what it means to be alive inside a machine. And while the show remains fiction, the science behind it creeps closer to reality every year.

From Mimicry to Emulation
Digital DNA cloning might sound like pure sci-fi – and for now, it is. But the ethical questions raised by USS Callister feel increasingly relevant. Daly’s crew aren’t just virtual lookalikes – they’re self-aware. They think, feel, and suffer. So, if a digital replica can experience everything you do, when does it stop being a copy and start being you?
When we talk about digital cloning, we’re usually referring to one of two things:
- Surface-level mimicry: Chatbots, avatars, and virtual assistants that sound like us, act like us, maybe even flub punchlines like us. They’re already here – and getting eerily good at mimicking our every beat.
- True mental emulation: A digital brain that thinks and feels as you do. That remembers your childhood, makes new memories, gets bored, loves Back to the Future. That’s the dream – or the nightmare – at the heart of USS Callister.
In the episode, Robert Daly generates fully sentient digital doppelgängers using a strand of hair or a swab of saliva. Thankfully, in the real world, the science is a touch more complicated.

DNA is just a blueprint for your body, not your mind. It doesn’t store memories, emotions, or personality. It can’t capture the pain of losing your childhood pet, your caffeine addiction, or your penchant for Sudoku-solving.
To copy a mind, you’d need a lot more than an outlawed digital cloning machine:
- A full brain scan at microscopic resolution – every neuron, every synapse, every connection.
- A way to decode your brain’s neural “language” – how it converts experience into thoughts and feelings.
- A computer powerful enough to simulate your mind in real time – essentially, to run you digitally.
If the notion sounds oddly familiar, you might have Friends to thank. Over two decades ago, Ross Geller made a prediction:
By the year 2030, there’ll be computers that can carry out the same amount of functions as an actual human brain, so theoretically you could download your thoughts and memories into this computer and live forever as a machine!
It’s 2025 now… so how close are we?
The Hard Problem of Consciousness
The concept of scanning a human brain in such microscopic detail that a computer could replicate it neuron by neuron is known as Whole Brain Emulation (WBE). For now, though, it remains entirely theoretical.
WBE is the Mount Everest of neuroscience – and we’re still at base camp. The human brain is the most complex object in the known universe, with an estimated 86 billion neurons and trillions of synaptic connections. The first step to emulation is building a map of the brain. But after decades of research, even the most advanced initiatives have barely scratched the surface: a complete map of a fruit fly’s brain and tiny fragments of a mouse’s are the best scientists have managed so far.
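To get a feel for the scale, here’s a rough back-of-envelope sketch. The neuron count is the widely cited figure mentioned above; the synapses-per-neuron average and the bytes needed to describe each synapse are loose assumptions for illustration, not established requirements:

```python
# Back-of-envelope estimate of the raw data in a static human connectome map.
# Only the neuron count is well established; the other figures are assumptions.

NEURONS = 86e9             # ~86 billion neurons (widely cited estimate)
SYNAPSES_PER_NEURON = 1e3  # assumed average; real estimates vary widely
BYTES_PER_SYNAPSE = 10     # assumed: connectivity, weight, a little state

synapses = NEURONS * SYNAPSES_PER_NEURON   # ~8.6e13 -- the "trillions"
raw_bytes = synapses * BYTES_PER_SYNAPSE

print(f"Synapses: {synapses:.1e}")                       # Synapses: 8.6e+13
print(f"Raw map size: {raw_bytes / 1e15:.2f} petabytes") # Raw map size: 0.86 petabytes
```

Even under these deliberately modest assumptions, the static wiring diagram alone approaches a petabyte – and that’s before modelling how every neuron constantly adjusts its behaviour, the dynamic part Rahnev flags below.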

Dobromir Rahnev, an associate professor at the Georgia Institute of Technology, explains:
In a few decades, a complete map of the human brain may be possible … [yet] uploading this information by itself into a computer won’t accomplish much. That’s because each neuron constantly adjusts its functioning, and that has to be modeled, too.
He concludes:
As a brain scientist who studies perception, I fully expect mind uploading to one day be a reality. But as of today, we’re nowhere close.
Even if a simulation ran flawlessly, we’d still face the biggest unknown of all: what, if anything, would make it “conscious”?
This leads us to the so-called “hard problem of consciousness” – a term coined by philosopher David Chalmers. It describes the mystery of how physical processes in the brain produce subjective experience: what it feels like to be you.
We can trace how light hits the retina, how neurons fire, how the brain processes the colour red. But we still can’t explain why that leads to the experience of redness – or the taste of chocolate, or the feeling of joy.
Chalmers explains:
Consciousness poses the most baffling problems in the science of the mind. There is nothing that we know more intimately than conscious experience, but there is nothing that is harder to explain... Many have tried to explain it, but the explanations always seem to fall short of the target.
We can’t even agree on what consciousness is, let alone recreate it. And that’s more than just a philosophical puzzle. It’s a massive roadblock for mind uploading. Because even if we recreate a brain perfectly, simulate every synapse, and run it on a supercomputer… would that digital mind actually feel anything? Or would it just act like it does?

What Is Life?
Let’s say science cracks the code. A conscious digital copy of you becomes possible. Then what? Is it still you? Does it have rights? Can it be deleted?
These are the same dilemmas explored in USS Callister: Into Infinity, where the digital crew must navigate leadership, identity, and ethics in a world where their creator has long since lost control.
One of the most affecting moments in the sequel comes when a character faces a choice: escape the simulation and return to embodied life – or stay behind to protect the others. It’s a dilemma that mirrors the Season 2 finale of the Ben Stiller-directed hit show Severance.
In Severance, a biotech company severs employees’ work memories from their home lives, essentially creating two versions of the same person. In the climax, protagonist Mark has the chance to leave the severed world – and chooses not to. Why?
Because for Mark S – the version of Mark who exists only in the severed world – those endless white corridors are his whole reality. His friendships, memories, and identity live entirely within it. Walking away wouldn’t mean freedom – it would mean annihilation.
These stories ask the same essential question: what do we owe to minds that exist in confinement – whether they’re split by surgery or written in code? And in a world of AI companions, digital twins, and neural data collection, we may one day be forced to answer.
And that’s what makes shows like Black Mirror and Severance so impactful. They don’t just speculate about tomorrow – they reflect the world we’re already building today. Not just asking what we’re creating, but who we’re becoming – and whether we’ll still recognise ourselves when we get there.
