July pfp
July
@july
I find that P(doom) has nothing to do with how likely the world is to end (obviously no one can predict this with any remotely accurate take). I think it's a metric of how much existential angst an AI researcher is feeling on that particular day
7 replies
4 recasts
102 reactions

July pfp
July
@july
May be true or not: the more these folks' identity is tied to a service, an action, or any work that could be automated away by AGI at some point in the future, the higher the likelihood of their existential angst (they will literally be replaced, hence the anxiety)
3 replies
1 recast
65 reactions

July pfp
July
@july
The more their worldview is rooted in realms that are not easily replaced by AI, the more solid the foundation their sense of self sits on. For example, if you count your productivity by lines of code produced, you will of course have some anxiety about being replaced
0 reply
0 recast
5 reactions

moreReese pfp
moreReese
@morereese
Astute observation, and one I tend to agree with. Speaking from personal experience, I also think the existential angst extends beyond being replaced and into something that we subconsciously fear: the notion that there are intelligences and conscious entities (similar to Jung’s archetypes) that are beyond full human comprehension, despite their ability to exercise agency over humans. Said more sardonically, the existential angst stems from increased awareness that humans are not (and never have been) the apex predator.
0 reply
0 recast
2 reactions