I.

The numerical language of control is made of codes that mark access to information, or reject it. We no longer find ourselves dealing with the mass/individual pair. Individuals have become “dividuals,” and masses, samples, data, markets, or “banks.”

— Gilles Deleuze, Postscript on the Societies of Control

In the 1970s, Foucault posited that we live in a disciplinary society, in which every organizational structure (or enclosure) is essentially hierarchical: abstractly, panopticons stacked on top of each other. But Deleuze updated this thesis, arguing that the technologies developed in the decades since have allowed the mechanisms of discipline to move outside the enclosures. We leave a trail of data that allows those mechanisms to follow us everywhere; this is the society of control.

I bring these up because I think it's become obvious that even Deleuze's argument is now subtly outdated. We don't live in just one society of control; we live in multiple overlapping ones. And in many ways the path by which such discipline is imposed is no longer top-down but bottom-up, Molochian.

Not even in his wildest nightmares would George Orwell have dreamed of a society in which people voluntarily give themselves up to be closely observed. We are watched over not by a single omniscient Big Brother, but by a multitude of competing near-omniscient Big Brothers. And we don't have telescreens forcibly installed in our homes; we buy new ones with our own hard-earned money every year. It's not surveillance; it's sousveillance.

II.

All models are wrong, but some are useful.

— George Box

I sincerely think that the problem with most of Silicon Valley is that we believe we could solve every problem, if only we had more data. If only there were more tables and foreign keys to join against, if only there were more samples to train the model on, if only there were more sensors to feed the data pipeline...

In a way, I see this attitude, this "algorithmic imperative", as symptomatic of the kind of high modernism that James C. Scott (in Seeing Like a State) observes cropping up in history every now and then. It's the ideology that says, in short, that to make a better territory, you start by making a better map. And so you collect more data, sometimes even blindly, because you might be able to do something useful with it in the future.

The problem, as Scott observes, is that thus far we've never really been able to make maps that are good enough: granular enough, up-to-date enough. And when we try to actually use these maps, we end up hurting people.

III.

But maybe it's all justified in the name of progress. In some accelerationist circles there's a saying: the only way out is through. Remove the brakes from capitalism, and it'll crash itself. On this view, everything that capital touches needs to happen harder, better, faster, stronger.

If the accelerationists are right, then post-Fordist capitalism fits the "runaway train" model of collapse that Joseph Tainter describes in The Collapse of Complex Societies, in which a complex society has no choice but to grow, consume everything it touches, and become ever more complex. Even if some people realize what's happening, all the pieces of the positive feedback loop have already fallen into place.

But if post-Fordist capitalism is a runaway train, and I don't know where it's going, then what I want isn't to stay along for the ride, but to get off. Is it too late?

Arthur C. Clarke famously said that any sufficiently advanced technology is indistinguishable from magic. What happens when you floor the accelerator on the road to magic? Is the road straight or winding?

IV.