Excellent to see you tackle this, Sami. You landed in a very similar place to my most recent essay, which was also about the conflation of market value and human worth.
And this:
“I’m not saying that the machines are better than we are at those things today, or even that they will be. But I am saying that IF they will be, it would be better for us to have built our foundations of purpose on something more solid, or we could really find ourselves adrift in a dangerous manner as a species.”
What might that “something more solid” be, do you think? What are the moorings of purpose? Is it just a Nietzschean self-authoring exercise or something else?
Great question, and leave it to you to start poking at the weakest parts of this! :) There were some undeniably circular claims in the essay. I think the moorings differ between individual and collective levels. For some people, pure self-authoring can work. But at scale? I suspect most people need something more relational - embedded in webs of care, showing up for specific people, mattering because they're in relationship rather than because they're uniquely capable.
We may even need more explicit external anchors, akin to the role religions used to play (and where we have some, umm, interesting emerging trends now). Which creates a problem, in that we're destroying traditional external structures without building replacements thoughtfully, just letting them emerge from whatever godforsaken QAnon conspiracies or parasocial AI relationships fill the void.
This is why the timing matters. If we automate work before we've built adequate meaning-structures, we're not just creating unemployment or even instability; worse, we're also creating a meaning vacuum at scale. And history suggests those get filled by things we really don't want.
What's your thinking on whether relational structures can scale as a foundation?
Sorry if I clumsily turned over that rock. I think I went there because it's probably *the* question I'm preoccupied with. One sees what one has primed oneself to see, I suppose!
I couldn't agree more with your reasoning here, i.e. that the tech (and the way it is being designed and received) is a kind of existential rug-pull. Well, that's not quite right. It certainly *feels* like a rug pull if you have internalised the values of the market and, as you highlight, conflate market value with human worth/purpose. This is all precisely why I said that these conditions will induce a spiritual reckoning, of sorts (over here: https://allenj.substack.com/p/on-cyborgs-and-children-of-god).
I agree the vacuum is becoming very apparent. It was already, but the tech is revealing that to a wider audience now. Yet, I'm sceptical of the notion of, as you put it, thoughtfully building replacements for traditional external structures. It smells too much like Enlightenment modernity for my liking. That is, the idea that culture, religion, etc. are mere social constructs that we can design.
I like the idea of scaling relational structures, but perhaps you could explain more of what you mean?
For me, I suspect the only important questions remaining will be metaphysical. That is, I think new gods are lurking and old gods are waiting, both ready with answers, some with better answers than others.
I think that our understanding that culture, religion, etc. _are_ social constructs certainly enables better control over them. We may not be able to design them completely, but we can nudge them. I don't think that would be a controversial statement? Some countries have gone very far in those directions, and to apparent success (not 'success' as in human thriving; success as in indoctrinating the country into a certain belief system, on the surface and in action).
We probably diverge on what is/isn't a social construct (I'm a theist). That's not to deny that 'nudging' does, in reality, effect change, or that it can be used to consciously and deliberately influence the cultural landscape. That's not controversial.
What's controversial is:
a) where that leads - it's the same hubristic will-to-power impulse that underpins all authoritarian structures; and
b) that it can serve as a sufficient foundation for meaning.
On the latter point, if one is tempted to think that a socially constructed foundation can be sufficiently stable, I'd suggest that one is ignoring the deeper, hidden and very old foundations upon which that society was built, the rock to which we owe whatever (relatively) stable institutions we have today. This is what Nietzsche was on about when his fictional character lamented:
“God is dead. God remains dead. And we have killed him. How shall we comfort ourselves, the murderers of all murderers? What was holiest and mightiest of all that the world has yet owned has bled to death under our knives: who will wipe this blood off us? What water is there for us to clean ourselves? What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it?”
Nietzsche was himself lulled into believing that we could construct something to replace God. I think he was wrong, not least evidenced by the events of the 20th century. In fact, there's another myth that addresses this point precisely: the Tower of Babel.
Correct me if I'm wrong, but I think you're making two separate claims here that need unpacking:
First, that attempting to consciously shape culture/meaning-structures is dangerous hubris (the authoritarianism point).
Second, that non-theistic foundations are inherently unstable compared to theistic ones.
On the first point - I agree there's danger in top-down meaning-engineering. But there's a massive difference between letting QAnon and parasocial AI relationships fill the vacuum by default, versus creating conditions where healthier structures can emerge. We're not talking about Five-Year Plans for Purpose here. More like: what institutional designs, what cultural norms, what economic structures make it easier for people to find meaningful connection? Think of it more as societal governance.
On the second point - the "rock" you're pointing to as the foundation of stable institutions? That rock was pretty damn bloody. The era of strong theistic foundations gave us the Crusades, the Inquisition, centuries of religious warfare, witch burnings, and the divine right of kings. The past, for the most part, was terrible, and the "stability" is a post-hoc illusion or misinterpretation that's only possible because a long time has passed and we've lost collective memory of what it was like.
Meanwhile, the most stable, prosperous, low-violence societies today - Scandinavia, Japan, much of Northern Europe - are among the LEAST religiously observant. They didn't collapse into Nietzschean nihilism or 20th-century totalitarianism. They built secular welfare states, strong institutions, and high social trust. Mostly without God.
Yes, I understood from your earlier comment that you believed these societies had developed humane, low-violence patterns entirely on their own. That’s what I was disputing. They didn’t. The rock is buried deeply, so you don’t even think it’s there. This is perhaps the central thesis of historian Tom Holland’s book, Dominion. (Not a Christian, mind you… well, he wasn’t when he wrote that book, anyway).
For all religion’s many ills (and you named some), the Enlightenment thinkers were not pulling themselves up by their own bootstraps: they were building on the ethical foundations of Judeo-Christian traditions. We rest on those still today, but we don’t recognise them as religious structures.
You, my friend, are more Christian than you probably think!
Do you know GK Chesterton at all? He once said that if we ever encounter an ancient fence and we don’t know why it was put there, we’d be wise to leave it standing until we know, lest we find out the hard way. Well, as your article points out… we dismantled the fence, and now we’re finding out the hard way.
I've pondered what humans will still be needed for (in a productive sense) as machines get better and better. Two things I've landed on that are not skill-based: (i) people can take accountability for things, and are able to receive punishment if they fail; (ii) people sometimes prefer to interact with other people for particular tasks.
Keeping people around as accountability sinks sounds like the moral crumple zone situation on steroids; I get very dystopian vibes off of that. If the most 'efficient' party does the job itself, but we still hold people accountable, that's a textbook case of specifically implementing a moral crumple zone - Very Not Good.
I agree on the interacting-with-humans point; in every survey I have run, the one constant is that people don't want the human-to-human contact automated away. I have an uneasy feeling that there might be a generational shift happening that reduces that.
Interestingly, interacting less with people is not even Gen Z's stated preference, but it IS their behavior, since they experience significant barriers to achieving it - social anxiety, anxiety about face-to-face skills, reduced social stamina, and technology-induced disconnection, all of which make in-person interaction feel overwhelming. I'm hopeful it's not too late to change that.
A person as an accountability sink for a machine seems like a strange way to put it, to me. Laws are written so that people can be jailed or fined if they don't follow them. If a car hits someone, the driver (or another person) is liable, not the machine. People will use AI and will be liable for their use of it. It is a choice, and if they don't want this responsibility, they shouldn't do it.
But what if people’s use of AI is increasingly ambient, such that it’s just *there* in the background, autonomously making decisions - like a Waymo cab - that are two, three, or more steps removed from the original delegation given by a human? That’s a complex accountability situation to manage.
The past me would have very much agreed with this; let's just get rid of all of it, burn it to the ground, build back better. The today me is more nuanced - it will not be an easy thing to ditch capitalism entirely, and attempting to do so too suddenly would likely be traumatic for humanity. I think a lot of good can be accomplished within the framework of the current systems, just - "just" - by tweaking numbers, some significantly. Progressive taxation needs to be _much_ higher at the top end, for one; taxing profits for extraction of natural resources needs to be more along the Norway model; and so on.
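To make "tweaking numbers" concrete, here's a toy sketch of a marginal tax schedule in Python. The brackets and rates are entirely invented for illustration - the political fight is precisely over what those numbers should be - so treat it as a shape, not a proposal.

```python
# Toy marginal tax schedule. Thresholds and rates are invented for
# illustration only; "tweaking numbers" means arguing over this table.
BRACKETS = [
    (0,       0.00),  # income below 15k untaxed
    (15_000,  0.25),
    (60_000,  0.40),
    (150_000, 0.55),
    (500_000, 0.70),  # the "much higher at the top end" part
]

def tax_due(income: float) -> float:
    """Each rate applies only to the slice of income above its threshold."""
    total = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        # The bracket's upper bound is the next threshold, or infinity.
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > threshold:
            total += (min(income, upper) - threshold) * rate
    return total

# Someone on 80k pays 0% on the first 15k, 25% on the next 45k, and 40%
# on the remaining 20k - not the top rate on everything.
print(tax_due(80_000))  # 19250.0
```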
I 100% get the impatience. The reason I've moderated my views on radical change is a growing understanding of just how fragile many of the systems we rely on for our very survival now are. From globally integrated manufacturing with single points of failure to liquid fuel security to energy infrastructure, it's all less resilient and more fragile than we'd like to believe. While we may disagree about whether things like burning fossil fuels are a good thing (quite clearly they are not for the planet), we _are_ wholly dependent on them for the time being, and for people's survival it would be better for the transition to be managed.