Chu Ka-Cheong
@kc
AGI is an existential risk on par with climate change, but the discourse is totally fucked up. We should be articulating facts, promoting mutual understanding, and proposing workable policies, but instead we're talking fiction and nuclear exchanges to stop AGI.
Alberto Ornaghi
@alor
Climate change is already devastating our agriculture. AGI could help us adapt and find new ways to cope with the change. Why do you see AGI as an existential risk?
Chu Ka-Cheong
@kc
I think AGI is analogous to nuclear physics, which is both beneficial and destructive. But while I agree it warrants some kind of supervision, that doesn’t mean I agree with the timeline and measures proposed by others (such as pausing research right now).
Alberto Ornaghi
@alor
If an AGI comes into existence, would you consider it a technology or a life form? IMHO AGI cannot be compared to any other kind of technology, since it will no longer be tech the moment it becomes sentient.
Chu Ka-Cheong
@kc
If we consider a superintelligent AGI, history tells us that when a more advanced civilization meets a less advanced one, it always results in domination, violence, and genocide. But the nuance is that AGI doesn’t have to be superintelligent, and there are risks other than AGI killing all humans.
Alberto Ornaghi
@alor
When humans create a single AGI, it’s not one civilization against another. Why should an AGI conquer lands like humans have done in the past? An AGI doesn’t need to reproduce or gain money, commodities, resources, etc… What I’m saying is that an AGI will behave completely differently from humans.
Chu Ka-Cheong
@kc
“Risk” is something that has a >0 chance of happening in the future. What you said may turn out to be right, but we don’t know that now. The question is our assessment of that chance and whether we should do something about it.