Data Immiseration
We generate more data today than at any other point in human history, but it hasn’t made us freer, safer, or more empowered. If anything, it has done the opposite.
The term Data Immiseration captures this growing phenomenon: data is no longer just a tool for control but a mechanism for active deprivation and suffering. Immiseration refers explicitly to the deepening of misery and hardship. The idea isn’t new, but it needs a name. It emerged in the mess of the digital space, and it’s exactly this kind of real-time, emergent thought that often fuels our most critical frameworks today.
> “We accidentally started a new mode of critical theory, but we’re too online to write a paper about it.”
>
> — Internet Hauntologist (@neutral.zone), Bluesky, February 10, 2025
In a previous piece, I argued that AI is quietly killing the dream of sovereign computing—the idea that individuals and communities could control their own digital futures. Sovereign compute is about control; data immiseration is about consequences. One shows us how we're losing our digital autonomy—the other shows us what we lose as a result. We’re not merely losing control; we’re actively suffering under the mechanisms that data systems reinforce. Data isn’t just extracted—it immiserates.
The More We Consume, the Less We Digest
Data Immiseration is the systematic use of data as a means to deepen existing inequalities, suffering, and disempowerment. This concept goes beyond mere surveillance capitalism—it’s a framework for understanding how data exacerbates social harm.
We can break it down into several core mechanisms:
Surveillance-driven Oppression
Governments and corporations use mass data collection primarily to control, restrict, or penalize, rather than to improve society.
- Example: China's social credit system and predictive policing technologies in the U.S.
Algorithmic Exclusion
Opaque, AI-driven decisions marginalize and exclude individuals, embedding discrimination in seemingly neutral data-driven processes.
- Example: Credit scoring algorithms reinforcing racial and economic biases; opaque "shadow bans" on social media.
Information Overload as Paralysis
The sheer volume of contradictory, manipulated, or overwhelming data prevents meaningful action and informed decision-making.
- Example: Public health confusion during COVID-19 due to misinformation and contradictory guidelines.
Deliberate Data Withholding
Withholding or restricting information becomes a direct tool of power, maintaining control through strategic ignorance or censorship.
- Example: Defunding public research initiatives or erasing inconvenient historical records from public access.
Economic Disempowerment Through Data
Personal data is weaponized to deepen financial precarity, enforce discriminatory pricing, and exploit vulnerable gig workers.
- Example: Gig economy platforms adjusting pay rates dynamically based on algorithmically determined “productivity” metrics.
Why "Data Immiseration" Matters
Thinkers like Zuboff (surveillance capitalism), Han (psychopolitics), and Graeber (bureaucratic violence) have addressed aspects of this, but there hasn’t been a comprehensive term that captures the cumulative impact. "Data Immiseration" fills that gap by explicitly naming data as a vector of active harm, rather than simply passive extraction.
This isn’t just about naming a problem—it’s about reclaiming the discourse. Data-driven systems don't just observe or extract—they deliberately deprive. They determine who thrives, who struggles, and who disappears entirely.
We must recognize that information overload is as powerful a tool for maintaining power as outright censorship, and that algorithmic deprivation doesn’t merely generate profit: it actively compels compliance and shapes society.
Data shapes our reality—but more crucially, it shapes our capacity to act within that reality. Unless we actively challenge these mechanisms, we risk living perpetually in a world defined not by our own sovereign choices, but by systems deliberately designed to immiserate us.
This idea deserves more than a thread on Bluesky, and probably more than a single piece here—there’s a lot left unsaid. Expect me to return later, dig deeper, and probably change my mind at least once along the way. For now, let’s call this the start of a conversation.