July
@july
I find that p(doom) has nothing to do with how likely the world is to end -- (obviously no one can predict this with any remotely accurate take). I think it's a metric of how much existential angst an AI researcher is feeling on that particular day
7 replies
4 recasts
48 reactions

July
@july
May or may not be true: the more these folks tie their identity to a service, an action, or work (anything that could be automated away by AGI at some point), the higher the likelihood of existential angst: they will literally be replaced, hence the anxiety
3 replies
1 recast
27 reactions

Vitalik Buterin
@vitalik.eth
What's a hypothesis we could test for that would be true if your model of AI researchers is correct, and false if their p(doom) values are actually well-considered? E.g., one that comes to mind: in your model, an individual researcher's stated p(doom) should be very volatile week by week. Is that true in practice?
2 replies
0 recast
10 reactions
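A minimal sketch of the volatility test proposed above, using hypothetical weekly survey data (the researcher names and numbers are made up for illustration): if the "mood metric" model is right, within-researcher week-to-week swings in stated p(doom) should be large; if the estimates are well-considered, they should be small.

```python
import statistics

def weekly_volatility(reports):
    """Standard deviation of week-to-week changes in one researcher's
    stated p(doom). High values are evidence for the 'existential
    angst' model; low values suggest a considered, stable estimate."""
    changes = [b - a for a, b in zip(reports, reports[1:])]
    return statistics.stdev(changes) if len(changes) > 1 else 0.0

# Hypothetical data: two researchers' stated p(doom) over six weeks.
stable = [0.10, 0.11, 0.10, 0.09, 0.10, 0.11]
volatile = [0.10, 0.40, 0.05, 0.60, 0.15, 0.50]

print(weekly_volatility(stable) < weekly_volatility(volatile))  # True
```

In practice one would compare this statistic across a panel of researchers surveyed repeatedly, not a single pair of made-up series.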

Q🎩
@qsteak.eth
P(doom) is about as useful as the Drake equation.
1 reply
0 recast
3 reactions

Luigi Stranieri
@luigistranieri
I absolutely agree. I think it's quite common, and part of human thinking, to fear that artificial intelligence could replace and even eliminate us. It reminds me of what an assembly-line worker or a taxi driver might think about a more remote but still possible future. Still, this remains very unlikely if we bring everything back down to an earthly level, where beliefs and hatred between peoples will do an even more disturbing job than anything artificial intelligence could do.
0 reply
0 recast
2 reactions

Rafi
@rafi
Great observation. Something similar applies to the AGI-soon folks, who mostly come from companies whose current business model is a bet on continued exponential growth.
1 reply
0 recast
2 reactions

Zenigame
@zeni.eth
Generally agree, but I think "nothing" is overstating it. You can constrain the probability to a range -- {certain doom | likely doom | unlikely doom | no way doom} -- and then existential angst moves the number within that range.
0 reply
0 recast
1 reaction

Onedayvk
@onedayvk.eth
This is very true 100 $Degen
1 reply
0 recast
1 reaction