Postmodernism

I just finished Christopher Butler’s “Postmodernism: A Very Short Introduction”, and my impression of the philosophy is still that it consists of a half-dozen genuinely useful insights inflated to completely absurd dimensions. Yes, to a surprisingly large extent the things we take for granted are social and linguistic constructions; yes, the ‘discourse’ of mutually connected […]

Profundis: “Crystal Society/Crystal Mentality”

Max Harms’s ‘Crystal Society’ and ‘Crystal Mentality’ (hereafter CS/M) are the first two books in a trilogy which tells the story of the first Artificial General Intelligence. The titular ‘Society’ are a cluster of semi-autonomous sentient modules built by scientists at an Italian university and running on a crystalline quantum supercomputer — almost certainly alien […]

Reason and Emotion

One of the most pervasive misconceptions about the rationalist community is that we consider reason and emotion to be incontrovertibly opposed to one another, as if an action were irrational in direct proportion to how much feeling is taken into account. This is so common that it’s been dubbed ‘the Straw Vulcan of rationality’. While it’s true that people reliably […]

Is Evolution Stoopid?

In a recent post I made the claim that evolution is a blind, stupid process that does what it does by brute-forcing through adjacent regions of possibility space with a total lack of foresight. When I said this during a talk I gave on superintelligence, I was met with some resistance along the lines of ‘calling evolution stupid is a mistake […]
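To make the ‘blind search’ framing concrete, here is a minimal Python sketch of mutation-plus-selection with no lookahead; the fitness landscape, step size, and generation count are invented for illustration and aren’t drawn from the post.

```python
import random

# A deliberately "blind" evolutionary search: propose an adjacent variant,
# keep it only if it's no worse, and never plan ahead. All specifics below
# (the fitness function, step size, generation count) are illustrative.

def fitness(x: float) -> float:
    # Hypothetical one-dimensional fitness landscape with a single peak at x = 3.
    return -(x - 3.0) ** 2

def evolve(generations: int = 10_000, step: float = 0.1) -> float:
    x = random.uniform(-10, 10)                       # arbitrary starting "genotype"
    for _ in range(generations):
        candidate = x + random.uniform(-step, step)   # explore an adjacent point only
        if fitness(candidate) >= fitness(x):          # selection: survive if no worse
            x = candidate                             # no memory, no foresight
    return x

if __name__ == "__main__":
    print(f"After blind search, x is approximately {evolve():.3f}")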

Takeoff Speed, II: Recalcitrance in AI Pathways to Superintelligence

I’m writing a series of posts clarifying my position on the Intelligence Explosion hypothesis. Last time I took a look at various non-AI pathways to Superintelligence and concluded that the recalcitrance profile for most of them was moderate to high. This doesn’t mean it isn’t possible to reach Superintelligence via these routes, but it does […]

Takeoff Speed, I: Recalcitrance in Non-AI Pathways to Superintelligence

I’m writing a series of posts clarifying my position on the Intelligence Explosion hypothesis. Though I feel that the case for such an event is fairly compelling, it’s far less certain how fast the ‘takeoff’ will be, where ‘takeoff’ is defined as the elapsed time from having a roughly human-level intelligence to a superintelligence. Once […]
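For readers who want the formal framing behind ‘recalcitrance’, Bostrom’s Superintelligence expresses the rate of intelligence growth as optimization power divided by recalcitrance; the takeoff-speed question is then a question about how the denominator behaves along each pathway as capability increases. A minimal statement of that relation, in the standard notation rather than as a quote from the post:

$$ \frac{dI}{dt} \;=\; \frac{\text{Optimization power}}{\text{Recalcitrance}} $$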