How AI and crypto will eat the world

Tam Hunt

Current trajectories for AI and crypto energy demand are a runaway train that, within just a few years, will require more energy than the entire world currently consumes

The digital revolution promised to dematerialize our world, replacing physical goods with weightless bits and bytes. Instead, we’re hurtling toward a physical impossibility: if current trends continue, by 2032 artificial intelligence (AI) alone would require more electricity than our entire planet produces.

AI “compute” energy requirements are currently growing at roughly one order of magnitude (10x) every two years. Here’s what that means in concrete terms:

[Chart not reproduced: AI compute energy requirements through 2030 are based on Aschenbrenner (2024), with extrapolation to 2035 following the same growth rate. Numbers beyond 2030 are mathematical extrapolation ignoring physical constraints.]
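To make the growth arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The round numbers are my own assumptions, not the article’s: world electricity production of roughly 30,000 TWh per year, and a 2025 baseline back-calculated from the roughly 60x-by-2035 figure cited below (about 18 TWh per year). Only the ten-fold-every-two-years growth rate is taken from the trend itself.

```python
# Back-of-envelope sketch of the "10x every two years" trajectory.
# Assumed: world electricity production ~30,000 TWh/yr; the 2025 AI baseline
# is back-calculated so that the ~60x-world-production-by-2035 figure holds.

WORLD_TWH = 30_000                   # approx. current world electricity production, TWh/yr
GROWTH_PER_TWO_YEARS = 10            # one order of magnitude every two years
BASELINE_2025_TWH = 60 * WORLD_TWH / GROWTH_PER_TWO_YEARS ** 5   # ~18 TWh/yr

for year in range(2025, 2036):
    demand_twh = BASELINE_2025_TWH * GROWTH_PER_TWO_YEARS ** ((year - 2025) / 2)
    print(f"{year}: ~{demand_twh:>12,.0f} TWh/yr  ({demand_twh / WORLD_TWH:6.2f}x world production)")
```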

Read those numbers again. By 2032, just seven years from now, continuing current trends would require more electricity than the entire world currently produces. By 2035, it would require about 60 times current global electricity production.

To put this in perspective: if we somehow managed to capture 1% of all solar energy hitting Earth’s land surface with perfect efficiency — which would mean covering massive swaths of the planet in solar panels — we’d still only have enough power for about 4 years of this trajectory.
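The “1% of land-surface solar” comparison can be sanity-checked with the same kind of rough arithmetic. The inputs here are my own round-number assumptions (about 89,000 TW of solar power reaching Earth’s surface, land covering roughly 29% of the planet), combined with the demand curve sketched above:

```python
# Rough check of the "1% of all solar energy hitting Earth's land surface" claim.
# Assumed: ~89,000 TW of solar power reaches the surface, ~29% of which falls on
# land; world electricity production ~30,000 TWh/yr; demand follows the same
# 10x-every-two-years curve as the earlier sketch (2025 baseline ~18 TWh/yr).

SURFACE_SOLAR_TW = 89_000
LAND_FRACTION = 0.29
HOURS_PER_YEAR = 8_760
WORLD_TWH = 30_000
BASELINE_2025_TWH = 18

budget_twh = SURFACE_SOLAR_TW * LAND_FRACTION * 0.01 * HOURS_PER_YEAR  # ~2.3 million TWh/yr

def demand(year):
    return BASELINE_2025_TWH * 10 ** ((year - 2025) / 2)

passes_world = next(y for y in range(2025, 2050) if demand(y) > WORLD_TWH)
passes_budget = next(y for y in range(2025, 2050) if demand(y) > budget_twh)
print(f"1% of land solar at perfect efficiency: ~{budget_twh:,.0f} TWh/yr")
print(f"Projected demand passes world production around {passes_world} and")
print(f"passes the solar budget around {passes_budget}, only "
      f"{passes_budget - passes_world} years later.")
```

On these assumptions, the trajectory blows past even that enormous solar budget within about four years of overtaking current world electricity production.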

The Physics Wall

We’re not facing a mere environmental crisis or sustainability challenge. We’re racing toward the hard limits of physics. The implications are stark: the current paradigm of AI development, based on ever-increasing compute power, is quite literally impossible to sustain.

We’re already seeing early signs of this constraint. Data center projects are being cancelled or delayed worldwide due to power limitations. Ireland expects data centers to consume 30% of its electricity by 2030 — and that projection doesn’t even account for the exponential growth in AI compute. In the United States, locations with abundant power like Eastern Washington are already maxed out, their electrical grids strained by existing data centers.

A number of AI companies are trying to revive mothballed nuclear power plants — including Three Mile Island! — or build new ones.

The tech industry’s standard responses to energy concerns become almost laughably inadequate when faced with these numbers. Switching to renewable energy or improving cooling systems is a band-aid on a bullet wound when your trajectory requires many times more energy than the planet currently produces. Even the most optimistic projections for renewable energy deployment would barely make a dent in this equation.

Why Efficiency Won’t Save Us

Some argue that improvements in computing efficiency will solve this crisis. After all, we’ve seen remarkable gains in AI compute efficiency:

  • Specialized AI chips like Nvidia’s H100s that are vastly more efficient than general-purpose processors
  • Algorithmic improvements that have delivered order-of-magnitude efficiency gains
  • New architectures that can achieve similar results with smaller models

But even these impressive gains pale against the sheer scale of compute growth. If we look at historical trends, efficiency improvements in computing have delivered perhaps half an order of magnitude per year in the very best cases, and gains at that pace cannot be sustained indefinitely as transistor scaling slows. Against compute demands growing by an order of magnitude every two years, efficiency improvements merely buy us a small delay before hitting the physics wall.

The basic logic of the AI arms race requires that we never have enough — there is literally no point at which participants will be able to say “ok, now we have enough power to stay ahead of the competition.”

Even quantum computing, often proposed as a solution, won’t help here. The types of calculations used in current AI training — primarily massive matrix multiplication — don’t benefit from the kinds of quantum speedups we know how to achieve. Even if they did, quantum computers would still require physical resources and energy to operate, and, again, the arms-race “logic” would simply demand ever more quantum compute.

The Coming Crisis

We’re not talking about a distant future problem. The contradictions in our current approach will force radical changes within the next 3–5 years. Something has to give, and there are only a few possibilities:

  1. Fundamental shift in AI development: AI research might be forced to abandon its reliance on ever-increasing compute power. The current approach of scaling up models and training data would need to give way to radically more efficient approaches. We’re seeing early research in this direction, but it’s far from clear whether such approaches can match the capabilities of large-scale models.
  2. Energy resource conflict: If development continues on its current trajectory, AI compute demand will soon compete directly with basic human needs for electricity. This could drive unprecedented conflict over energy resources and fundamentally reshape global politics.
  3. Policymakers get smart: We figure out earlier rather than later how to head off this coming crisis.

The Policy Challenge

The most likely outcome is that AI development will hit a wall — not from regulation or ethical concerns, but from energy supply limits. This could happen gradually through escalating power costs, grid constraints, and the time it takes to develop new power projects, or suddenly through cascading infrastructure failures.

But exactly what it looks like when we hit the wall could be very ugly, given the massive pressures fueling the AI arms race, such as national security and gargantuan economic rewards. How much of the planet’s surface area will be consumed by the need to fuel AI and crypto before the whole system collapses under its own weight?

These trends demand immediate policy action, but we face a serious challenge: the enormous profits from AI create powerful incentives to push against any constraints. Our political systems, particularly vulnerable to financial influence (can we say “pay to play”?), may fail to implement necessary limits until we hit the wall at full speed.

What would reasonable policy look like? Here are a few possibilities:

  1. Mandatory efficiency standards: Rather than allowing unlimited compute scaling, we need efficiency requirements that force development toward more sustainable approaches.
  2. Power usage limits: Just as we regulate other forms of industrial power consumption, we need hard caps on AI training energy use.
  3. Research incentives: We need massive investment in fundamentally new computing approaches that could break free from current energy scaling requirements.

Cryptocurrency: A Parallel Crisis

While AI’s energy trajectory is the more dramatic challenge, cryptocurrency mining presents a parallel crisis. Bitcoin’s proof-of-work system deliberately expends energy as a security measure, with consumption growing as more mining power joins the network and difficulty adjusts upward to match. While this growth is more linear than AI’s exponential appetite, it’s still unsustainable when projected forward decades.
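To see why proof-of-work security translates directly into energy, here is a toy Python sketch (illustrative only; nothing about it matches Bitcoin’s actual parameters or scale). A miner hunts for a nonce whose hash falls below a target; each additional bit of difficulty roughly doubles the expected number of hashes, and therefore the joules, needed to find one.

```python
import hashlib

def mine(block_header: bytes, target: int) -> tuple[int, int]:
    """Return (nonce, attempts) for the first hash below `target`."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, nonce + 1
        nonce += 1

header = b"example block header"
for difficulty_bits in (12, 16, 20):      # more bits => lower target => more work
    target = 2 ** (256 - difficulty_bits)
    nonce, attempts = mine(header, target)
    print(f"{difficulty_bits} bits of difficulty: nonce {nonce} found after {attempts:,} hashes")
```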

The solution here is clearer: we already know how to run cryptocurrencies without massive energy consumption. Ethereum’s shift to proof-of-stake demonstrated this conclusively. Bitcoin and other proof-of-work cryptocurrencies must follow suit or face inevitable constraints from physics and energy availability. The problem, yet again, is money: the existing Bitcoin algorithm has created trillions of dollars in new wealth, and there is little incentive for Bitcoin miners to change the system. Abandoning proof-of-work would require a hard fork backed by an overwhelming consensus of miners, node operators, and exchanges, and miners in particular have every incentive to keep the hardware they’ve already paid for relevant. Perhaps if countries started (again) to ban crypto mining it would force the issue. Don’t hold your breath.

Beyond Magical Thinking

The tech industry loves to talk about exponential growth and infinite scalability. But we live in a finite world with finite resources. The laws of physics don’t care about our aspirations for artificial general intelligence or decentralized finance. They set hard limits that we’re rapidly approaching.

This reality forces us to confront difficult questions: How do we transition to fundamentally different approaches to AI development? What capabilities must we sacrifice in the name of sustainability? Which applications justify extreme energy expenditure, and which don’t?

The digital revolution promised to dematerialize our economy. Instead, it’s taught us that even digital technologies are constrained by physical reality. The sooner we accept and adapt to these constraints, the better chance we have to prevent AI and crypto from eating the entire world.

The time for incremental changes has passed. We need a fundamental rethinking of how we approach computation-intensive technologies. And yes, banning specific high-energy applications should be very much on the table. Physics will force this reckoning whether we plan for it or not. The only question is whether we manage this transition thoughtfully or crash into the limits at full speed.

Written by Tam Hunt

Public policy, green energy, climate change, technology, law, philosophy, biology, evolution, physics, cosmology, foreign policy, futurism, spirituality
