Vitalik Buterin
@vitalik.eth
One of the fascinating things about AI and cryptography is how similar the math sometimes is. Like, the way that the Poseidon hash works (matrix mult -> per-unit nonlinear layer -> matrix mult -> per-unit nonlinear layer) is *exactly* what deep neural networks do.
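A toy sketch of that parallel: the prime, matrix, and round structure below are illustrative stand-ins, not the real Poseidon parameters (full rounds also add round constants, which this omits), but the shape of the computation is the point.

```python
# One round of a Poseidon-style permutation next to one layer of a
# dense neural network: both are a linear mix followed by an
# elementwise nonlinearity. Toy parameters, NOT real Poseidon.

P = 2**31 - 1  # toy prime field (real instantiations use ~256-bit primes)

M = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]  # stand-in for the MDS matrix

def matvec_mod(mat, vec, p):
    """Matrix-vector product over the prime field."""
    return [sum(m * v for m, v in zip(row, vec)) % p for row in mat]

def poseidon_like_round(state):
    """Matrix mult -> per-unit nonlinearity (the x^5 S-box)."""
    state = matvec_mod(M, state, P)
    return [pow(x, 5, P) for x in state]

def dense_relu_layer(x, w):
    """Matrix mult -> per-unit nonlinearity (ReLU): same shape of computation."""
    y = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
    return [max(0.0, v) for v in y]

state = [1, 2, 3]
for _ in range(3):  # stacked rounds, like stacked layers
    state = poseidon_like_round(state)
print(state)
```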

tina
@howdai
Can you expand on what this means to you in terms of the interplay between crypto x AI?

Cassie Heart
@cassie
There’s a lot of potential for overlap — secure MPC enables multiple dataset owners to combine their sets for private training of a neural network without revealing their inputs to the other parties, yet getting an end output of a network trained on all participants’ data.
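A minimal sketch of the core primitive behind that, assuming plain additive secret sharing over a toy modulus; the values and party count here are illustrative, not a production MPC protocol (no networking, no malicious security, no fixed-point encoding of real model data).

```python
import random

# Each data owner splits its private value into random shares that sum
# to it mod Q. No single share reveals anything, yet summing the
# parties' local sums reconstructs the aggregate -- and only the
# aggregate -- which is the property private joint training builds on.

Q = 2**61 - 1  # toy modulus

def share(secret, n):
    """Split `secret` into n additive shares mod Q."""
    parts = [random.randrange(Q) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % Q)
    return parts

# Three data owners, each holding one private training statistic.
secrets = [42, 7, 100]
all_shares = [share(s, 3) for s in secrets]

# Compute party i holds the i-th share from every owner and sums locally.
local_sums = [sum(col) % Q for col in zip(*all_shares)]

# Publishing only the local sums reveals the total, never the inputs.
print(sum(local_sums) % Q)  # 149 == 42 + 7 + 100
```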

Abhinav Vishwa
@vishwa
This is very interesting and something we are actively researching how to enable. Federated Learning essentially reaches the end goal here, but has the potential to leak weights. MPC at the edge can remove that, but where (how?) do you aggregate the weights then? ...

Abhinav Vishwa
@vishwa
Does using a centralized aggregator (one of the main arguments against FL) become viable in this case?
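One possible answer to the aggregation question, sketched below: secure aggregation with pairwise cancelling masks, roughly in the spirit of Bonawitz et al. (2017). Everything here is a simplified assumption, not that paper's actual protocol (it omits dropout handling, key agreement, and fixed-point encoding of real float weights), but it shows why a centralized aggregator can become acceptable: it only ever sees masked updates and learns nothing but the sum.

```python
import random

# Each pair of clients agrees on a random mask vector. Client i adds
# the mask for every pair it comes first in and subtracts it for every
# pair it comes second in, so all masks cancel in the aggregate.

Q = 2**32   # toy modulus
N = 3       # clients

# Private integer-encoded weight updates, one per client (toy values).
updates = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
DIM = len(updates[0])

# Pairwise masks: each client pair (i, j), i < j, shares a random vector.
masks = {(i, j): [random.randrange(Q) for _ in range(DIM)]
         for i in range(N) for j in range(i + 1, N)}

def masked_update(i):
    """Client i's update, plus each shared mask with sign +1 if i < j else -1."""
    out = list(updates[i])
    for j in range(N):
        if j == i:
            continue
        m = masks[(min(i, j), max(i, j))]
        sign = 1 if i < j else -1
        out = [(o + sign * v) % Q for o, v in zip(out, m)]
    return out

# The aggregator sums the masked updates; every pairwise mask cancels,
# so it recovers the aggregate without seeing any individual update.
agg = [sum(col) % Q for col in zip(*(masked_update(i) for i in range(N)))]
print(agg)  # [15, 18, 21, 24] == elementwise sum of the three updates
```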