July
@july
I find that P(doom) has nothing to do with how likely the world is actually going to end -- (obv no one can predict that with any remote accuracy). I think it's a metric of how much existential angst an AI researcher is feeling on that particular day
7 replies
4 recasts
48 reactions
Zenigame
@zeni.eth
Generally agree, but I think "nothing" is overstating it. You can constrain the probabilities within a range of {certain doom | likely doom | unlikely doom | no way doom}, and then existential angst moves the numbers within that range.
0 reply
0 recast
1 reaction