Kiran
@neuroswish
I don't understand the argument that crypto solves the problem of verifying real vs AI-generated content. how exactly does crypto help here? genuinely curious
9 replies
0 recast
0 reaction
vincent
@pixel
The year is 2035, 80% of videos on YouTube are deepfakes. POTUS "saying" all kinds of stuff they don't say. A video appears that's signed by POTUS.eth, which is an actual Eth address controlled by the White House. You watch the signed video, knowing it's real, knowing it's not deepfaked by some actor.
4 replies
0 recast
0 reaction
Kiran
@neuroswish
ok, so what happens if a politician actually does/says something horrible, and they just refuse to sign the video of them doing the horrible thing? who do we trust in this case?
2 replies
0 recast
0 reaction
max
@maxp.eth
Presumably you trust the journalist or reporter that released the video. It’s all reputation based at the end of the day.
1 reply
0 recast
0 reaction
Kiran
@neuroswish
yea that’s my point tho - if it falls back on reputation anyway then the crypto part isn’t relevant
4 replies
0 recast
0 reaction
max
@maxp.eth
Maybe there’s two types of reputation, implicit and explicit. Before you could _implicitly_ trust that a video of the president was real. In a world of deepfakes, now we have to rely on _explicit_ reputation, someone saying “this is truth and I’ll stake my reputation on it”
1 reply
0 recast
0 reaction
max
@maxp.eth
The way they stake that reputation is with a signature from a private key that the public knows they control. Otherwise, the deepfake could be the news report, not the video.
0 replies
0 recast
0 reaction