It never talks about AI, but it talks about something a bit more fundamental to it and to all ideas about futurism: that as technology, specifically the computer, develops, we increase the variety in our world system and are powerless (or at least confused) to regulate it. That situations like homeostasis are rare and unappreciated.

Of course, it also proposes the cybernetics approach: that we should be using computers Cybersyn-style to regulate our variety instead of producing more of it. How that's executed is not super clear in this piece, and not super clear to me in general. I don't think the cybernetic practices are particularly good at all, but the identification of the problem, treating things as a dynamic system which one is to regulate, is pretty good.

He’s just definitely a futurist. A frustrated one, but like, try not to appreciate this:

But the first and gravest problem is in the mind, screwed down by all those cultural constraints. You will not need a lot of learning to understand what I am saying; what you will need is intellectual freedom. It is a free gift for all who have the courage to accept it. Remember: our culture teaches us not intellectual courage, but intellectual conformity

I once told sama that the optimal time to build AI is definitely less than a hundred years from now, as I was worried about other things. I wish I could quote him accurately; his answer was a much more complete route to the conclusions I was touting. Something like "the amount of entropy building up is really scary, and we need something to regulate it". Damn! My precious topic of AGI development just happens to be the imminent piece in the big entropic timeline. It also happens to be one that can be the regulator.

We need to look for the people hiding behind all this mess; the people who are responsible for the system itself being the way it is, the people who don’t understand what the computer is really for, and the people who have turned computers into one of the biggest businesses of our age, regardless of the societary consequences. These are the people who make the mistakes, and they do not even know it. As to the ordinary citizen, he is in a fix—and this is why I was so furious. It is bad enough that folk should be misled into blaming their undoubted troubles onto machines that cannot answer back while the real culprits go scot free. Where the wickedness lies—and wickedness is not too strong a word—is that ordinary folk are led to think that the computer is an expensive and dangerous failure, a threat to their freedom and their individuality, whereas it is really their only hope

Surely, I project. But surely if I locked Beer in a room with Hamming in 1974, Beer's references to computers would become references to machine intelligence, and his fixation on organisations and governments would be replaced with grander, undefined systems.

For the first time in the history of man, science can do whatever can be exactly specified. Then, also for the first time, we do not have to be scientists to understand what can be done. It follows that we are no longer at the mercy of a technocracy which alone can tell us what to do. Our job is to start specifying

I read this while with family, who told me I needn't be worried about AI because they were once worried that IDEs would take their jobs and disrupt the economy. I had some trouble expressing my explanation without phrases like "process for automating scientific and technological advancement" and "existential risk". Beer has a reasonable low-principles point here: that we've hit a curve in technological development where we have a new problem, one of specification (alignment), that is uniquely difficult and quickly getting out of hand. Not one that I'd use at the dinner table, but certainly simpler on some dimensions.

Enslavement is a word that Beer uses to describe our relationship to computers today. Sometimes it's more about hedonic adaptation; usually it's used in more of the TikTok sense. "An electronic mafia lurks around the corner". We do not contain the variety to regulate the technology we've created. He gets political! Says that we could do better with computers than produce TVs. Who would pay for it? A beautiful, perfect cybernetic government, of course! Very endearing. Reminds me that this whole oppressive-use-of-technology shenanigan is quite old. Beer does use the word "oppressive".

Psychedelics, mysticism and psychosis are described as things our shared delusion filters out. The ideas about rates of change in the world, and how the (Chilean) government is too slow, apply to individuals as well. Maybe this variety overload we experience is a novel human experience, and like the institutions that can't keep up with variety, perhaps our neurons can't either. No idea what that means for us, but it's provocative?

The penultimate chapter is titled "The Future Can Be Demanded Now". Disappointing on the techno-optimist futurist front, but correct insofar as we haven't aligned our current technology. It is at least a pleasant idea. Again reminiscent of supposed present-age technological revolutions, there's talk of centralisation vs decentralisation! Beer is quite coherent on this: the dichotomy is false, because a fully centralised body would probably forget to beat its heart and die, and a fully decentralised one would be [extremely ADHD]. I really like the human body as an example of a good system.

Pyramid management systems, rather naturally, do not easily satisfy the law of requisite variety. There's this model of amplifiers and attenuators of variety, and they must be balanced in the equation. As you climb the ladder, the equation imbalances. He claims the managers are much less resistant to change, because they specialise as they try to attenuate with their skills. This jibes poorly with me: managers probably suffer via institutions just as much as ICs do, but ICs get more of their variety from outside the system whereas managers might get more variety from their reports. But then Beer complains that cybernetics is hard, because we never agree on the abstraction (recursions, he calls it) of the system.
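Beer's amplifier/attenuator bookkeeping is essentially Ashby's law of requisite variety: only variety can absorb variety. Here is a minimal sketch of the arithmetic; the function names and the balance condition are my own loose rendering, not anything from the book:

```python
def residual_variety(disturbances: int, responses: int) -> int:
    """Ashby's law of requisite variety: a regulator with `responses`
    distinct moves can, at best, collapse `disturbances` distinct
    situations into ceil(disturbances / responses) distinguishable
    outcomes. Perfect regulation (one outcome) needs
    responses >= disturbances."""
    return -(-disturbances // responses)  # ceiling division

def balanced(env_variety: int, attenuation: float,
             own_variety: int, amplification: float) -> bool:
    """Beer's balance condition, loosely: the environmental variety
    that survives attenuation must not exceed the regulator's
    amplified variety."""
    return env_variety * attenuation <= own_variety * amplification

# A manager with 5 stock responses facing 100 distinct situations:
print(residual_variety(100, 5))    # 20 outcomes still leak through
print(residual_variety(100, 100))  # 1: requisite variety achieved
```

Climbing the pyramid shrinks the responses available while the disturbances aggregate, which is the imbalance being described.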

People, considered as individuals, it seems to me, like change rather a lot. Don’t you get bored when nothing changes? I know I do. Then just why do we go around saying that there is a resistance to change? Of course, the answer is simple. It is not the living, breathing human who resists change in his very soul. The problem is that the institutions in which we humans have our stake resist change

Oscillation has always lived in my head as a great tool for calibrating. Beer introduces it as something a good system should not do when perturbed; it should be "ultrastable"? Institutions should be unperturbed by both expected and unexpected disruption. I mean, the homeostasis arc stops holding up here, because getting stabbed is pretty perturbing.
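Ultrastability is Ashby's coinage: when a perturbation pushes the essential variables out of bounds, the system does not merely oscillate back; it changes its own parameters until some configuration holds. A toy version, where the dynamics and names are my own illustration rather than anything from Beer:

```python
import random

def ultrastable(x: float, gain: float, steps: int = 200,
                bound: float = 10.0, rng: random.Random = None) -> float:
    """Iterate x <- gain * x. Whenever |x| escapes `bound` (an
    'essential variable' violation), blindly pick a new gain: a
    step-change in the system's own wiring, not just its state.
    Returns the gain the system settles on."""
    rng = rng or random.Random(0)  # fixed seed for a repeatable demo
    for _ in range(steps):
        x *= gain
        if abs(x) > bound:                  # essential variable violated
            x = max(min(x, bound), -bound)  # clamp the state...
            gain = rng.uniform(-2, 2)       # ...and re-wire at random
    return gain

# An explosive gain gets replaced until a contracting one (|g| < 1) sticks:
print(abs(ultrastable(1.0, gain=1.5)) < 1)  # True
```

The point of the toy: a merely stable system damps a perturbation; an ultrastable one survives perturbations that break its current configuration entirely, by searching for a new one.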