Vitalik Buterin
@vitalik.eth
One of the fascinating things about AI and cryptography is how similar the math sometimes is. Like, the way that the Poseidon hash works (matrix mult -> per-unit nonlinear layer -> matrix mult -> per-unit nonlinear layer) is *exactly* what deep neural networks do.
16 replies
4 recasts
19 reactions
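The structural parallel Vitalik describes can be sketched in a few lines. This is a hedged toy illustration, not real Poseidon: the prime, MDS matrix, and round constants below are made-up small values, and real Poseidon uses a ~255-bit prime field and carefully chosen parameters. The point is only the shared skeleton: a linear mix followed by a per-element nonlinearity.

```python
# Toy sketch of the shared structure: Poseidon-style round vs. dense NN layer.
# All parameters here are illustrative placeholders, not real Poseidon values.

P = 2**31 - 1  # toy prime field (real Poseidon uses a much larger prime)

def poseidon_round(state, mds, round_consts):
    """One full round: add round constants, per-element S-box (x^5), MDS matrix mult."""
    state = [(s + c) % P for s, c in zip(state, round_consts)]
    state = [pow(s, 5, P) for s in state]  # per-unit nonlinear layer (S-box)
    return [sum(m * s for m, s in zip(row, state)) % P for row in mds]  # matrix mult

def nn_layer(x, weights, bias):
    """One dense layer: matrix mult, add bias, per-element ReLU."""
    pre = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(weights, bias)]
    return [max(0.0, v) for v in pre]  # per-unit nonlinear layer (activation)

# Same skeleton, different algebra: linear mix -> elementwise nonlinearity.
state = poseidon_round([1, 2, 3], [[2, 1, 1], [1, 2, 1], [1, 1, 2]], [7, 11, 13])
acts = nn_layer([1.0, -2.0, 3.0], [[0.5, 0.1, -0.2], [1.0, 1.0, 1.0]], [0.0, -1.0])
```

The key difference is the algebra each runs over: Poseidon works in a prime field with an algebraic S-box (so it stays cheap inside arithmetic circuits and SNARKs), while the NN layer works over reals with a non-algebraic activation.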

tina
@howdai
Can you expand on what this means to you in terms of the interplay between crypto x AI?
2 replies
0 recasts
0 reactions

Cassie Heart
@cassie
There’s a lot of potential for overlap — secure MPC enables multiple dataset owners to combine their sets for private training of a neural network without revealing their inputs to the other parties, yet getting an end output of a network trained on all participants’ data.
3 replies
0 recasts
1 reaction
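The core primitive behind the secure-MPC setup Cassie describes is secret sharing: each data owner splits its private input into random shares that individually reveal nothing, and only the combination of all parties' shares reconstructs an aggregate. A minimal sketch of additive secret sharing for a secure sum (toy modulus, not a full training protocol):

```python
import random

# Hedged sketch: additive secret sharing, the building block of MPC-style
# secure aggregation. The modulus and party count are illustrative only.

Q = 2**61 - 1  # toy modulus

def share(secret, n_parties):
    """Split a secret into n additive shares mod Q; any n-1 shares look random."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

# Three data owners each hold a private value (e.g. one gradient component).
secrets = [42, 7, 100]
all_shares = [share(s, 3) for s in secrets]

# Party i sums the i-th share from every owner; no party ever sees a raw secret.
partial_sums = [sum(col) % Q for col in zip(*all_shares)]
combined = sum(partial_sums) % Q  # reconstructs only the sum: 42 + 7 + 100
```

Real MPC training protocols build multiplications and nonlinearities on top of this, but the privacy argument is the same: individual inputs stay hidden while the jointly computed result comes out in the clear.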

chandresh 🪴
@chandresh.eth
something smart going on in this conversation, no idea what but i am sure it's useful 🤞
1 reply
0 recasts
0 reactions

Mac Budkowski ᵏ
@macbudkowski
Wow, nice. Are there any projects doing that?
1 reply
0 recasts
0 reactions

Abhinav Vishwa
@vishwa
This is very interesting and something we are actively researching to enable. Federated Learning essentially reaches the end goal here, but has the potential to leak weights. MPC at the edge can remove that, but where (how?) do you aggregate the weights then? ...
1 reply
0 recasts
0 reactions
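One standard answer to the aggregation question raised above is pairwise masking, in the spirit of secure-aggregation schemes for federated learning: each pair of parties agrees on a random mask, one adds it and the other subtracts it, so the masks cancel in the aggregate while each individual update stays hidden from the aggregator. A hedged toy sketch (no dropout handling or key agreement, which real protocols need):

```python
import random

# Hedged sketch: pairwise-mask secure aggregation for model updates.
# Toy modulus and integer "weights"; real schemes add key agreement,
# dropout recovery, and fixed-point encoding of float updates.

Q = 2**61 - 1

def masked_updates(updates):
    """Return each party's masked update; the pairwise masks cancel in the sum."""
    n = len(updates)
    masked = [u % Q for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            m = random.randrange(Q)     # mask shared by the pair (i, j)
            masked[i] = (masked[i] + m) % Q
            masked[j] = (masked[j] - m) % Q
    return masked

weights = [5, 9, 4]                      # each party's private weight update
agg = sum(masked_updates(weights)) % Q   # aggregator only ever sees masked values
```

The aggregator learns the sum of the updates (here 5 + 9 + 4) but each `masked[i]` individually is uniformly random, which is exactly the property that plugs the weight-leakage hole in plain federated averaging.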