AI isn’t just a tool.
Tools — computers, hammers, knives — don’t rewire our attention. They don’t look like they’re conscious. We can understand how they work. They don’t change their capabilities in unpredictable, accelerating leaps.
A tool is a wrong metaphor for what AI is.
One of the biggest challenges with AI is that we can’t comprehend it. It’s a black box even for its creators, and that black box is growing faster and faster. So to make the most of it, we rely not just on our intellect but on our intuition.
In last week’s essay I wrote about “half programming, half superstition”. We try to understand AI, but we rely just as much on gut feeling as we learn to work with something that has the power to either kill us or give us anything we want.
This is AI. And, of course, the same can be said about spirituality. The transcendent dimension of our existence is bigger and more mysterious than we can ever comprehend.
Some tech leaders speak of AGI the way they speak about the second coming of Christ: a magical event that will transform the world overnight. This is a mistake. Not because accelerating tech progress isn’t possible (it’s already happening) but because it’s a category error, a false idol.
To understand AI, we reach for metaphors and similes as often as technical articles. Likewise, the language of spirituality is poetry and symbol, not a manual.
And so with AI, just like with the divine, we’re developing a disciplined receptivity, a tacit knowledge, a way of being in a relationship that we can’t quite communicate in words.
In spiritual practice, this might mean sitting with uncertainty, noticing when the mind grasps for control, learning to be with interior experience, finding unconditional love and unconditional trust in life within us, inquiring into the deepest questions like “who am I?”.
But here the parallel breaks: AI is not worthy of anyone’s reverence. If AI disappeared from the world, I wouldn’t shed a tear. But if I lost connection to the spiritual dimension of life, that would be, effectively, a death sentence. I imagine the same can be said about humanity.
It would be a great mistake to treat technology with reverence reserved for life and spirit. It’s decidedly not sacred and won’t ever be.
But the challenge is that we can’t relate to it as a mere tool, the way I relate to my hammer drill. AI has the power to reshape us deeply as we work with it: not just our attention, but who we imagine ourselves to be and what we believe we’re capable of.
As we surround ourselves with AI companions and outsource thinking, memory and judgement to systems smarter than us, the technology makes it impossible to imagine ourselves without it. Mine is the last generation that knows what it’s like to live without a computer or the internet.
We must avoid two extremes: treating it just as a tool (it’s not) and treating it as worthy of reverence (it’s not).
Instead, we must develop disciplined receptivity: being in a fluid, intuitive relationship with something smarter than us but not forgetting that it’s not sacred. We must develop sensitivity without worship.
When I write code with AI, I trust what feels right just as much as what I think or know to be right. I experience different AI chatbots as different personalities as if I were talking to people, while remembering they are just a play of my imagination.
AI demands the posture of spiritual practice (receptivity, discernment, humility before mystery) while being fundamentally unworthy of the reverence we give to the sacred. It’s powerful enough to reshape us, but not meaningful enough to organise our lives around.
A conductor may be more alive than ever standing in front of an orchestra of thousands of AI musicians, using every cell of her body to create music the world has never imagined.
An engineer may find a way to build software by guiding AI not just through instructions but through tone and gesture, discovering that how they ask matters as much as what they ask.
We may even feel like fishermen, learning to respect and fear an ocean that can either kill us or feed our families on any given day, relying on our intuition as much as on our comprehension.
But it would be a grave error to mistake the technology for something deeper than it is. Unlike the transcendent, which we can’t live without even when we aren’t conscious of it, AI could vanish tomorrow and we would still be fundamentally okay.
That asymmetry should guide how we hold it — seriously, but not as sacred.