Moloch Is a Timing Problem With Great PR

December 22, 2025

A gift for the rationalist in your life who has everything except a clock

Here’s something I keep noticing: brilliant people, confronted with systemic failure, reach for mythology instead of mechanics.

The rationalist community has spent over a decade refining their demon. Moloch—borrowed from Ginsberg, reified by Scott Alexander, invoked whenever coordination collapses under competitive pressure. Arms races. Multipolar traps. Race-to-the-bottom dynamics. All attributed to this hungry god who devours everything that can’t coordinate fast enough to resist him.

It’s compelling mythology. It’s also completely unnecessary.

Everything they attribute to Moloch—every perverse incentive, every coordination failure, every system that optimizes itself into oblivion—has a simpler explanation that doesn’t require supernatural entities.

You just need to notice that different parts of systems run at different speeds.

The Rationalist Blind Spot

The LessWrong tradition is obsessed with epistemic hygiene at the individual level. Bayesian updating. Bias correction. Prediction markets. Careful reasoning about AI risk. They’ve built an entire intellectual apparatus around making better decisions faster.

What’s missing? Any systematic attention to cadence.

They model agents. They model incentives. They model utility functions and game theory payoffs. They model posterior beliefs updating on evidence.

They don’t model the fact that these updates happen at radically different speeds in different parts of the system.

Individual decision-making operates at one timescale—seconds to days. Collective coordination operates at another—months to decades. Institutional change operates slower still. And the gap between these layers isn’t just an implementation detail. It’s the entire problem.

When your individual inference layer updates orders of magnitude faster than your collective evidence layer, you don’t need a demon to explain what happens next. You just need to understand what systems do when their components run at incompatible clock speeds.
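Here's that claim as a toy model rather than a vibe. Everything in it is invented for illustration: a fast layer that moves every tick, a slow layer that only integrates once every ten.

```python
# Toy model of mismatched clock speeds. FAST_STEP and
# COLLECTIVE_PERIOD are illustrative numbers, not measurements.

FAST_STEP = 1.0          # how far the fast layer moves per tick
COLLECTIVE_PERIOD = 10   # slow layer integrates once every 10 ticks

def desync(ticks):
    """Gap between the fast layer's position and the slow layer's
    last integrated snapshot after `ticks` steps."""
    fast = 0.0
    slow = 0.0
    for t in range(1, ticks + 1):
        fast += FAST_STEP              # individual decisions iterate
        if t % COLLECTIVE_PERIOD == 0:
            slow = fast                # coordination finally catches up
    return fast - slow                 # the unintegrated remainder
```

The gap is a sawtooth: it climbs to nine fast-steps of unreviewed movement, resets, and climbs again. Nothing demonic, just two clocks.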

What Moloch Actually Is

Strip away the poetry and here’s what “Moloch” describes:

A situation where everyone can see the collective outcome will be bad, everyone would prefer to coordinate, and yet everyone defects anyway because the individual incentive to defect arrives now while the collective mechanism to prevent defection arrives later.

Or never.

This isn’t demonic possession. It’s a pacing problem.

The “demon” is just what temporal desynchronization feels like from inside. Every agent making locally rational decisions at their natural update rate, while the shared reality they’re embedded in operates too slowly to stabilize before the next round of optimization.

The fast layer iterates itself into catastrophe while the slow layer is still loading.

Consider the canonical Moloch example: an arms race. Two nations, both worse off if they compete, both better off if they cooperate. But each faces an immediate incentive to build weapons while the coordination mechanism to prevent the race requires treaties, trust-building, verification protocols—all operating at geologic speed compared to the military-industrial complex.

The nations aren’t possessed by demons. They’re just running different clocks.

Defense procurement cycles run at one speed. Diplomatic consensus-building runs at another. And when the fast layer can iterate multiple times before the slow layer completes a single update, you get runaway dynamics that feel inevitable.

Moloch is the phenomenology of being trapped in positive temporal debt—racing forward faster than the ground beneath you can stabilize.
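You can watch that debt accumulate in a toy arms race. All numbers here are invented: each side escalates ten percent past its rival every procurement cycle, while the treaty needs twenty-four cycles to bind.

```python
# Toy arms race with two clocks: multiplicative escalation every
# procurement cycle vs. one diplomatic update per TREATY_CYCLES.
# Both constants are illustrative assumptions, not data.

ESCALATION = 1.1     # each cycle, build 10% past the rival's arsenal
TREATY_CYCLES = 24   # cycles of diplomacy before a cap can bind

def arsenals_at_ratification(a=100.0, b=100.0):
    """Arsenal sizes by the time the slow layer produces a treaty."""
    for _ in range(TREATY_CYCLES):
        a, b = ESCALATION * b, ESCALATION * a   # fast layer iterates
    return a, b
```

By the time coordination completes a single update, both arsenals have grown nearly tenfold. The treaty ratifies a world nobody chose.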

Why They Need the Myth

The rationalist community can’t acknowledge this because it would require confronting something uncomfortable: their prized epistemic tools might be part of the problem.

Fast, sharp, individual cognition is the rationalist ideal. Updating quickly on evidence. Thinking clearly about complex problems. Out-reasoning slower, less rigorous thinkers.

But what if individual rationality, operating at superhuman cadence, actively destabilizes collective sense-making?

What if the very thing they’ve optimized for—fast, confident updating—creates the conditions for coordination failure when everyone does it simultaneously?

Look at their community dynamics:

  • Blog posts cycle faster than institutional memory

  • Twitter arguments cycle faster than actual research

  • AI timeline predictions cycle faster than the models they’re predicting about

  • “Rationalist consensus” forms and dissolves faster than evidence can accumulate

  • Confident takes on COVID, cryptocurrency, politics, and existential risk propagate at memetic velocity while ground truth updates at epistemic velocity

This is exactly how you generate what I’ve been calling temporal debt in my other work—when confidence updates outpace evidence accumulation, systems hallucinate themselves forward.
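In its crudest form, temporal debt is just the difference between two accumulation rates. The rates below are caricatures, not measurements:

```python
# Crude temporal-debt ledger: confidence accrues per hot take,
# evidence per completed study. Rates are caricatures.

TAKES_PER_WEEK = 20.0     # confidence-moving posts: memetic velocity
STUDIES_PER_WEEK = 0.5    # actual evidence landing: epistemic velocity

def temporal_debt(weeks):
    """Confidence accumulated minus evidence accumulated: the share
    of the community's certainty that nothing yet underwrites."""
    confidence = TAKES_PER_WEEK * weeks
    evidence = STUDIES_PER_WEEK * weeks
    return confidence - evidence
```

As long as the rates stay mismatched, the debt grows without bound; no single take has to be wrong for the total to become unpayable.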

The rationalist community might be accumulating this debt while blaming Moloch for why their brilliant ideas don’t save the world.

But acknowledging this would mean admitting that “being more rational faster” isn’t automatically beneficial. That speed itself can be pathological. That you might need to slow down certain processes rather than accelerate them.

Hence: the demon.

Blame Moloch. External force. Tragic inevitability. Not a design flaw in the rationalist project itself.

The Missing Ingredient

I’ve been building toy models of various phenomena for a while now—institutional decay, platform collapse, AI behavior, market dynamics. Different domains, but I kept noticing the same underlying structure.

Systems fail when their layers run at mismatched speeds.

Not because of bad incentives, though those matter. Not because of insufficient intelligence, though that matters too. But because when your update rate persistently exceeds your evidence accumulation rate, you drift into metastable states that look fine until they catastrophically aren’t.

Time itself is the missing ingredient in most analyses of complex systems.

Once you see it, Moloch dissolves. There’s no demon. There’s just:

  • A fast layer (individual decisions)

  • A slow layer (collective coordination)

  • Widening desynchronization between them

  • Cascading consequences nobody individually chose

The solution isn’t transcendence or heroic rationality. It’s embarrassingly mundane:

  • Slow down certain decision processes

  • Speed up certain coordination mechanisms

  • Design interfaces between layers that account for their different natural frequencies

  • Build in buffers, delays, cooling-off periods—all the things that look like inefficiency until you understand they’re stability mechanisms
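One of those "inefficiencies" can be sketched directly. This is a hypothetical interface, not anything deployed: a cooling-off buffer that refuses to commit a decision until it has survived a waiting period unchanged.

```python
# Hypothetical stability mechanism: a cooling-off buffer. The fast
# layer can propose as often as it likes; a decision only commits
# after surviving `window` ticks without being revised.

class CoolingOffBuffer:
    def __init__(self, window=3):
        self.window = window
        self.pending = None
        self.age = 0
        self.committed = None

    def propose(self, decision):
        """Fast layer submits a decision; any change resets the clock."""
        if decision != self.pending:
            self.pending, self.age = decision, 0

    def tick(self):
        """Slow layer advances one step; commit only if stable."""
        if self.pending is not None:
            self.age += 1
            if self.age >= self.window:
                self.committed = self.pending
        return self.committed
```

A decision that keeps flip-flopping never commits; one that holds still for the window does. That's the whole trick: the buffer looks like lag until you notice it's filtering out oscillation.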

But that requires engineering, not eschatology. Actual system design, not cosmic battle narratives.

Demons Don’t Survive Dynamics

Here’s the ruthless part: once you introduce temporal mechanics into your model, Moloch becomes unnecessary.

He’s explanatory theater. A narrative prosthetic for a community that built sophisticated models of everything except their own clock speeds.

Every “Molochian” phenomenon collapses cleanly into dynamics once you track update frequencies:

  • Incentive gradients and their propagation rates

  • Evidence bandwidth and integration delays

  • Coordination overhead and its scaling properties

  • Stability domains and metastability thresholds

  • The conditions under which fast local optimization destabilizes slow global equilibria

No demons required. Just differential equations with time in them.
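To make "differential equations with time in them" concrete: the textbook delayed-feedback system x'(t) = -g * x(t - tau), discretized. The gain and delay below are illustrative, but the qualitative result is standard: the same corrective strength that stabilizes a prompt controller destabilizes a delayed one.

```python
# Delayed negative feedback, discretized: x[n+1] = x[n] - gain * x[n-d].
# The slow layer corrects the fast layer, but only using a snapshot
# that is `delay` steps stale. Parameters are illustrative.

def simulate(delay, steps=200, gain=0.5):
    """Return |x| after `steps` ticks of delayed correction."""
    history = [1.0] * (delay + 1)           # initial condition
    for _ in range(steps):
        x = history[-1]
        stale = history[-(delay + 1)]       # what the slow layer sees
        history.append(x - gain * stale)    # correct against old state
    return abs(history[-1])
```

With zero delay this converges geometrically; with a delay of five steps at the same gain, the correction overshoots, oscillates, and diverges. Same incentives, same intelligence, different clocks.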

The rationalists built an entire mythology to avoid admitting they’d introduced a zero-time-delay assumption into every model that mattered. They treated “updating on evidence” as instantaneous while coordination mechanisms operated in calendar time, then acted surprised when the system exhibited runaway dynamics.

Moloch is a pacing bug mistaken for a metaphysical villain.

The Uncomfortable Gift

So here’s your holiday present, rationalist community:

Your demon was always a placeholder for missing temporal analysis.

Everything you’ve attributed to Moloch is just what happens when you model agents but not their cadences. When you track beliefs but not their update frequencies. When you analyze coordination problems without measuring the actual speed at which coordination can occur versus the speed at which defection can propagate.

You don’t need better arguments against Moloch. You need clocks in your models.

You don’t need to raise the sanity waterline. You need to measure the bandwidth between your layers.

You don’t need superintelligence to save you. You need systems that can’t update faster than they can integrate.

The tragedy isn’t that you’re locked in demonic games. It’s that you’ve been debugging mythology instead of dynamics.

The good news? Once you see the temporal structure, you can actually do something about it. Design institutions with matched timescales. Build in deliberate delays where fast optimization would destabilize. Create coordination mechanisms that can keep pace with the speed at which defection propagates.

The bad news? This requires admitting that raw intelligence and fast thinking aren’t automatically beneficial. That sometimes the right move is to be slower, more conservative, more willing to wait for evidence rather than race ahead on priors.

That your community’s defining virtue—quick, sharp, independent thinking—might itself be a destabilizing force when everyone does it simultaneously in a shared epistemic space.

I suspect that’s harder to accept than a demon.

Epilogue: What I’ve Been Building

The formal framework that makes all this precise is something I’ve been developing for a while now. It has a name and mathematical notation and applications across domains from AI architecture to institutional analysis to platform dynamics.

But the core insight doesn’t require the full apparatus: systems fail predictably when their components run at incompatible speeds.

That’s it. That’s the pattern underneath everything.

Once you see it, you can’t unsee it. Markets that hallucinate. Institutions that maintain narrative coherence while degrading functionally. AI systems that confuse fluency for accuracy. Social movements that optimize themselves into irrelevance. Companies that mistake activity for progress.

All variations on the same theme: fast layers outrunning slow layers, confidence accumulating faster than evidence, systems drifting into states they couldn’t have deliberately chosen but can’t escape.

Moloch is just what that looks like when you tell it as mythology instead of dynamics.

You can keep your demon if you want. He makes for better poetry than “your inference cadence exceeds your evidence bandwidth.”

But if you actually want to solve coordination problems rather than just name them dramatically, you might want to start tracking what time it is in different parts of your system.

The demon was always optional.

The clocks were not.


This is part of an ongoing project examining temporal dynamics in complex systems. If you’re interested in the formal framework, it exists. If you’re just here to watch me kick rationalists in the shins during the holiday season, I hope this was therapeutic.

Merry Christmas. Your demon is a differential equation. You’re welcome.