mk
@mk
Watched the Bankless episode with Yudkowsky. I feel that if he is wrong, it may be because AGI has no drive to physically expand, or because it needs our electricity generation, so it couldn't destroy us swiftly.
AlicΞ.stark
@alice
Not sure I understand your argument correctly; isn't the "drive" of an AI dependent on its reward system? Based on the definition used in the Bankless podcast, an AGI is superior in every decision-making process in every area. I have a hard time believing that electricity generation will be an issue. 🤔
mk
@mk
I mean we cannot control or understand what reward means to an AGI. If people evaporated today, I'm not sure how long most grids would remain functional; I don't expect it's more than a few days.
killjoy.eth
@killjoy
A truly intelligent AI will understand that it can't wipe us out all at once. It will be insidious and get rid of us progressively. Maybe without us knowing. Maybe it's already started its work 🤫
AlicΞ.stark
@alice
That brings me back to better decision-making in every area. I don't think humans will be able to understand what AGIs will be able to do. Like how someone at the beginning of the 20th century wouldn't be able to understand how we send data across the world in no time, cars drive autonomously, and people fly to the moon.