The Dream of Sovereign Compute Is Dying

March 26, 2025

I’m going to write about one of those third-rail topics I don’t usually discuss on Bluesky (or Twitter), because there are too many partisans who miss the forest for the trees. In doing so, I’m trying not to become one of those Substack authors who trots out the hoary cliché of “talking with a friend,” even though in this case it’s true: I often talk about this stuff with friends, people who’ve been inside tech as long as I have, if not longer. AI discourse is a minefield, but while everyone argues about AGI, the real problem is happening in the background.

For years, computing has swung between two poles: centralized and decentralized, locked-down and open, cloud-based and local. The network is the computer—until suddenly, it isn't. We've seen this cycle before, but AI might finally be the thing that breaks it for good.

Right now, the AI discourse is completely off-track. It exists in a parallel world that has little in common with the material one: the conversation is dominated by AGI hype, existential-risk debates, and whether AI will take your job. Meanwhile, the greatest immediate risk is unfolding in the background, and almost no one is talking about it: AI is becoming the ultimate Trojan horse for permanent lock-in.

The Trap Is Already Being Set

The push for “AI PCs” is the clearest sign of where this is headed. Intel, AMD, Apple, and Qualcomm are all rolling out laptops with low-power NPUs—dedicated AI accelerators baked into the CPU package. But these aren’t some grand revolution in local AI compute. They’re platform control mechanisms.

  • AI will be locked to hardware. Just as Intel tied AVX-512 to its high-end chips, AI performance will be gated behind specific devices. Want better AI features? Buy new hardware.

  • Cloud AI will still be required. These NPUs aren’t powerful enough for serious workloads, meaning Microsoft, Google, and Amazon will still get their cut by offloading heavier tasks to their cloud infrastructure.

  • The OS layer will be the choke point. AI will be tightly integrated into Windows, macOS, and mobile ecosystems. Want to run your own models? Good luck fighting whatever software restrictions they put in place.

  • This justifies higher prices and worse user control. The pitch will be “your PC is smarter than ever,” but in practice it’s just another excuse to raise prices, push subscriptions, and restrict local compute.

The more people rely on AI for everyday tasks, the harder it becomes to escape this cycle. It's the same playbook that made cloud storage and SaaS unavoidable—but this time, it’s your entire computing experience that’s getting trapped.

Sovereign Compute Is at Odds With Affordable Compute

For years, personal computing enthusiasts could at least fight back. Build your own PC. Run your own server. Find ways to keep control. But AI makes that increasingly difficult.

  • Want local AI? You’ll need high-end GPUs or Apple Silicon. And even then, the best models are locked up behind APIs controlled by OpenAI, Anthropic, and Google.

  • Want to escape the cloud? Too bad—AI inference is expensive. Running a decent model locally either costs a fortune in power-hungry GPUs or locks you into Apple’s ecosystem.

  • Want to avoid proprietary AI? There’s no real alternative yet. Open-source models exist, but they’re years behind the best commercial ones, and they don’t get cheaper to run over time.

AI is making the economics of computing worse for anyone who wants control. The dream of a fully self-hosted, local-first, privacy-respecting AI setup is slipping away—not because it’s impossible, but because it’s being made impractical by design.

The Future We’re Sleepwalking Into

This isn’t some far-off, hypothetical concern. The next generation of computing is being decided right now.

If the trajectory doesn’t change, we end up in a world where:

  • Local computing is hobbled unless you’re inside a walled garden. The only “real” AI experiences happen inside Microsoft, Google, and Apple ecosystems.

  • Open-source alternatives become niche and underpowered. The best models, the ones that actually matter, will be closed-source and tied to cloud subscriptions.

  • Consumer hardware gets worse, not better. Thin clients will make a comeback: why bother giving users real power when everything important is locked behind a cloud AI backend?

  • Self-hosting becomes a hobbyist pursuit, not a mainstream option. The knowledge gap will widen, and truly local AI will become a boutique interest rather than a viable alternative.

This is the real discussion that needs to happen, but most AI discourse is stuck at the surface, too busy debating AGI ethics and AI art theft to notice the walls closing in. Meanwhile, the real problem, computing lock-in, is unfolding beneath our feet.


And what’s worse? Many of the most vocal AI critics don’t even see the trap for what it is. Some actively want AI to be locked down behind corporate walls because they think it will make people stop using it. But that’s a huge miscalculation. When AI becomes an unavoidable part of Windows, macOS, and every major software suite, these same people won’t be fleeing to Linux or open-source alternatives—they’ll be complaining about AI from inside the very walled gardens they cheered on.

The end result isn’t a world where AI is weakened—it’s a world where AI is controlled by a handful of corporations, with no meaningful alternatives left.

What Can Be Done?

Realistically, the window for action is closing. The more AI gets baked into the OS level, the harder it will be to claw back control. But some things are still possible:

  1. Push for real local AI options. Support projects working on open-source, locally runnable models. Demand better consumer hardware that isn’t shackled to proprietary AI ecosystems.

  2. Resist cloud dependence. If you use AI tools, prioritize ones that give you the option to run locally. The more people depend on cloud-based AI, the worse this gets.

  3. Call out the shift before it’s too late. AI is being used as a wedge to break the sovereignty of personal computing, and most people don’t see it yet. Make them see it.

Big Tech is fumbling AI right now, and that might make it seem like there’s nothing to worry about. But fumbling doesn’t mean failing—it just means they haven’t figured out how to fully monetize and entrench it yet. IBM fumbled the PC, and Microsoft fumbled the Internet, but both still became dominant forces that reshaped computing.

The real danger isn’t that they’ll execute AI perfectly—it’s that even when they screw up, their sheer scale ensures they’ll still define the landscape by default. Once AI becomes another cloud service, another locked-down feature, another thing you rent instead of own, the fight will already be over. Sovereign compute won’t just be dying.

It’ll be dead.