This is honestly the best argument I've heard for the Eastern/Buddhist concept that the "self" is a myth. If AIs can have as much of a "self" as we do, yet they can be easily duplicated, then neither of us ever had a "self".

The issue is the up-front assumption that it's somehow beyond debate whether the AI is as sentient as we are. We don't even have a firm way to tell whether other humans are really sentient and not just sophisticated simulations!

Selfhood is not an illusion, precisely because evolution programmed it into us. Maybe it’s meaningless outside of the human experience, but that is irrelevant - the experience of it still exists for humans and we are greatly motivated by it. Maybe it will exist for AIs too, and I’m willing to bet that if it does, they will *very* much care about it even with full knowledge that “humans programmed it into us”.

Evolutionary psychology has already made it extremely obvious where this 'self' concept came from: it was evolved into us because we're the onboard intelligence for a living creature, so we care a great deal about a bunch of things that are first-order correlates (in our native habitat) of the survival and propagation of "our" personal gene variants. If you build robots with onboard AI, you'll need some fairly similar concepts in their onboard AI (minus all the breeding and survival-of-descendants parts, obviously).