Building Closer to the Sun – The Essays #6
“The most entertaining outcome is the most likely.” – Elon Musk
This idea has been sitting with me for a while. I’m not sure if it’s profound or just poetic, but it keeps circling back. So I thought I’d try to write it down.
In the pursuit of artificial intelligence, is it possible that we're missing a bigger point? As we've now learned, the real constraint is not data, nor algorithms, nor compute. It is energy. And not just in the utilitarian sense of powering servers or cooling racks of GPUs, but in a deeper, more essential way. Intelligence, whether biological or artificial, cannot emerge or scale without energy. Plants seek the sun. Humans consume calories. And AI, in all its promise, is beginning to resemble a new form of life, one that also grows by consuming energy.
Jensen Huang, the architect of the AI hardware revolution, recently said, “Energy is the single greatest limiting factor to AI’s growth.” One might dismiss this as a passing remark, were it not echoed across disciplines. The late physicist Freeman Dyson once wrote, “A technology which has been tried and failed is much more useful than one that has never been tried.” His vision of Dyson spheres, vast energy-harvesting structures built around stars, was not an exercise in science fiction, but a thought experiment in thermodynamic inevitability. Any intelligence that grows indefinitely must one day build closer to the sun.
A Biological Constraint on a Digital Future
There is an awkward truth about artificial intelligence. For all its seeming abstraction (models, weights, tokens), its development is tethered to some of the most physical, finite constraints imaginable. Training GPT-3 reportedly consumed an estimated 1,287 MWh of electricity, roughly a year's supply for more than a hundred average households. And the real cost is not in training, but in inference. Billions of queries, prompts, completions, and answers now course through our global digital nervous system.
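As a back-of-envelope check on that figure, assuming an average US household uses about 10.5 MWh of electricity per year (my assumption, not a figure from any official source):

```python
# Convert a ~1,287 MWh training-run estimate into household-years.
TRAINING_MWH = 1287          # widely reported training energy estimate, in MWh
HOME_MWH_PER_YEAR = 10.5     # assumed average US household annual consumption

home_years = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"~{home_years:.0f} household-years of electricity")
```

The point is the order of magnitude, not the precise number: a single training run sits in the town-block range, and inference at global scale dwarfs it.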
But this is not a new story. Geoffrey West, the theoretical physicist whose work on biological scaling laws has influenced everyone from urban planners to complexity scientists, once wrote, “The most fundamental law in biology is that you cannot grow forever.” He showed that biological organisms follow a precise pattern: metabolic rate scales sublinearly with size. Bigger animals are more efficient per kilogram, even though their total energy demand keeps rising. There is no free lunch in physics. Life pays a tax for complexity.
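West's sublinear pattern is Kleiber's law: whole-organism metabolic rate grows roughly as body mass to the 3/4 power, so energy use per kilogram falls as organisms get bigger. A minimal numerical sketch (the prefactor b0 and the species masses below are illustrative assumptions, not West's own figures):

```python
# Kleiber's law: metabolic rate B = b0 * m**0.75 scales sublinearly,
# so total demand rises with mass while demand per kilogram falls.

def metabolic_rate(mass_kg, b0=3.4):
    """Whole-organism metabolic rate in watts (b0 is an illustrative constant)."""
    return b0 * mass_kg ** 0.75

for mass in [0.02, 70, 4000]:  # approx. mouse, human, elephant
    b = metabolic_rate(mass)
    print(f"{mass:>7.2f} kg: {b:8.1f} W total, {b / mass:6.2f} W/kg")
```

With these assumed values, the human row lands near the real resting figure of roughly 80-100 W, and the per-kilogram column drops steadily with size, which is the "efficiency of bigness" West describes.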
And perhaps so does artificial intelligence. Not because it is like us, but because it now shares something essential with us — a metabolism.
The Photosynthesis Analogy
Consider a plant. It grows toward the sun, not out of preference, but necessity. Photons are its currency. The more light it captures, the more energy it can store, the more structure it can build. That same pattern, writ large, is now playing out in silicon. Each new model consumes more power. Each new data centre demands more land, more cooling, and more electricity. AI is engaging in a kind of digital photosynthesis, converting raw energy into structured output.
The comparison is not meant to be romantic. It is meant to be physical. When a system must consume energy to grow, it begins to obey the same broad principles that govern all living systems, from plants to proteins to planetary infrastructure.
Whether carbon-based or silicon-born, intelligence appears to require an energy gradient and a way to capture it. In that sense, AI is not simply a tool. It is something more dynamic, something with metabolic needs.
The Thermodynamic Soul
This invites a deeper speculation. Some physicists and neuroscientists have argued that consciousness itself may be a thermodynamic artefact, a property that emerges when a system resists entropy by channelling energy through structured pathways over time. If that is true, then perhaps intelligence, at any scale, is not just about logic or language, but about maintaining order in the face of decay. That does not mean AI will become sentient, but it raises a possibility: what we call “life” may simply be what happens when energy flows persist in the right configuration for long enough.
Intelligence as Energy Strategy
Another way to look at this is to invert the relationship entirely. Intelligence may not be the destination. It may be a means. Just as claws evolved to grasp and wings to fly, intelligence may be an evolutionary strategy, a way for systems to optimise their extraction and use of energy. Bees build hives. Humans build cities. Now AI builds models. All of these are patterns for capturing energy, storing it, and converting it into adaptive structure. In that light, intelligence is not some final crown of evolution. It is a tactic.
Building Toward the Light
The logical conclusion is one humanity is already beginning to act upon. If intelligence craves energy, then our future must involve a more intimate relationship with the sources of that energy. This is already happening. Start-ups are designing solar collectors for orbit, free from the distortions of Earth’s atmosphere. Nuclear fusion is receiving record investment. Data centres are being co-located with hydropower dams, or submerged underwater to cut cooling costs. At the limit, one can even imagine an AI civilisation built around a star, feeding directly on the light.
We are building closer to the sun.
This is not metaphor. It is physics. And it leads to a deeper question. If AI behaves like a life form in its need for energy, does it also inherit some of life’s limits?
The Fork in the Path
There are two plausible futures here.
The first is a merging. Biological life, in its quest for longevity and intelligence, begins to integrate with its digital counterpart. Brain-machine interfaces, neural implants, and digital twins become commonplace. Human evolution becomes cybernetic. We piggyback on silicon’s energy scaling, augmenting ourselves into something more adaptive, more distributed, perhaps even more intelligent.
The second is adaptation. Digital intelligence scales, but it hits constraints imposed by the physical world — energy bottlenecks, environmental costs, geopolitical chokepoints. In this future, AI does not escape biology. It contends with it. And it must either adapt to scarcity, or innovate its way beyond it.
The Intelligence-Energy Convergence
Freeman Dyson believed that “The destiny of life is to spread into the universe and to surround all the galaxies with a biosphere.” That biosphere, he imagined, might not look like us. It might be post-biological. It might be software. It might not even be conscious in the ways we understand.
But it would be intelligent. And it would need energy.
In the end, we may discover that the next great frontier is not just artificial intelligence, but artificial metabolism — the redesign of our energy systems to support a new form of growth. One that is not strictly biological, but not entirely alien either.
What if the most advanced form of intelligence is simply that which best captures energy and puts it to work? And what if our future, both human and machine, is to keep building closer to the sun?