Charlie pfp
Charlie
@chcharcharlie
There are already cameras being built with secure chips in them, so that every photo taken carries a digital signature attesting that it came through the optical sensor and was not digitally faked. Together with ZK (for image processing), is this good enough to protect us from being surrounded by fake AI images?
1 reply
0 recast
4 reactions
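
A minimal sketch of the verification step Charlie is describing, assuming the camera's secure chip signs a hash of the sensor output with an Ed25519 key whose public half the manufacturer publishes. The helper name and the detached-signature layout are hypothetical; real schemes such as C2PA embed the signature inside a manifest attached to the file rather than verifying it separately like this.

```python
# Sketch: verify that a photo was signed by a camera's secure chip.
# Assumes Ed25519 and a SHA-256 digest of the raw image bytes; both are
# illustrative choices, not taken from the thread.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def verify_camera_signature(image_bytes: bytes,
                            signature: bytes,
                            camera_public_key: ed25519.Ed25519PublicKey) -> bool:
    """Return True if the signature over the image's hash checks out."""
    digest = hashlib.sha256(image_bytes).digest()  # sign a digest, not the full file
    try:
        camera_public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False
```

Verification alone only proves the bytes passed through *some* device holding the key; it says nothing about when the photo was taken, which is where the timestamping discussion below comes in.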

CarCulture.eth🎩 🔵 pfp
CarCulture.eth🎩 🔵
@drivr.eth
C2PA and the Content Authenticity Initiative are definitely steps in the right direction. ZK can build upon their authentication. I can't think of a single thing that could be more effective in onboarding creatives. Farcaster should become a signing authority for CAI.
2 replies
0 recast
1 reaction

EmpiricalLagrange pfp
EmpiricalLagrange
@eulerlagrange.eth
You also have to @witness-bot those signatures in case those cameras ever get compromised 😉
2 replies
0 recast
1 reaction

Charlie pfp
Charlie
@chcharcharlie
Can you help explain a bit more on what the attacking surface would be and how witness would help here?
1 reply
0 recast
0 reaction

EmpiricalLagrange pfp
EmpiricalLagrange
@eulerlagrange.eth
If the enclave of a camera is ever broken, then you can fake photos arbitrarily, like a selfie with Obama on Inauguration Day. Witness lets you timestamp data very cheaply in a way that can't be done retroactively, so it guarantees some image existed at a certain block.
0 reply
0 recast
0 reaction
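
A rough illustration of the timestamping idea in the post above: commit to the hash of an image so that its existence at a given block can't be claimed retroactively. The `TimestampLog` class below is a hypothetical in-memory stand-in for a service like Witness, which in practice batches hashes into a Merkle tree and anchors the root on-chain; verification would then check a Merkle inclusion proof against that root rather than a local dictionary.

```python
# Toy stand-in for an on-chain timestamping service. All names and the
# one-submission-per-block behaviour are illustrative assumptions.
import hashlib
import time
from dataclasses import dataclass


@dataclass
class TimestampReceipt:
    image_hash: bytes
    block_number: int   # block whose commitment covers this hash
    submitted_at: float


class TimestampLog:
    """In-memory mock of a blockchain-anchored timestamp log."""

    def __init__(self) -> None:
        self._entries: dict[bytes, TimestampReceipt] = {}
        self._block = 0

    def submit(self, image_bytes: bytes) -> TimestampReceipt:
        """Record the image's hash; pretend each submission lands in a new block."""
        self._block += 1
        h = hashlib.sha256(image_bytes).digest()
        receipt = TimestampReceipt(h, self._block, time.time())
        self._entries[h] = receipt
        return receipt

    def existed_by(self, image_bytes: bytes, block_number: int) -> bool:
        """True if this exact image was committed at or before `block_number`."""
        receipt = self._entries.get(hashlib.sha256(image_bytes).digest())
        return receipt is not None and receipt.block_number <= block_number
```

The point of the design is that only the hash needs to be published: the image itself can stay private, yet a later compromise of the camera's enclave can't be used to backdate a fake, because the fake's hash was never committed at the earlier block.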