Cassie Heart
@cassie
Working on a test case to confirm that an idea would improve the performance of the protocol from the RPM paper and reduce its round complexity – it also has a neat side effect: it enables massively parallel gradient descent compute for secure ML in a fully MPC context
3 replies
0 recast
0 reaction

Cassie Heart
@cassie
The most immediate application would be social media apps running on Quilibrium, enabling efficient but fully private recommendation systems
1 reply
0 recast
0 reaction

Cassie Heart
@cassie
In the spirit of building in public, I'll share the approach I'm evaluating – the RPM paper references the SecureML paper, which suggested using Beaver triples to keep the polynomial degree of the shares bounded when computing dot products of shared matrices. This works great under 2PC but requires log(n) rounds... (a quick sketch of the Beaver trick is below)
1 reply
0 recast
0 reaction
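
For anyone following along, here is a minimal single-multiplication sketch of the Beaver-triple technique over additive shares, in the spirit of SecureML's trick. The field size, trusted-dealer setup, and function names are illustrative assumptions only, not the RPM or Quilibrium implementation.

```python
# Minimal 2-party sketch of Beaver-triple multiplication over additive
# secret shares mod a prime. Illustrative only: the modulus, dealer setup,
# and names are assumptions, not the actual RPM/Quilibrium protocol.
import secrets

P = 2**61 - 1  # prime modulus for the additive sharing (assumption)

def share(x):
    """Split x into two additive shares mod P."""
    r = secrets.randbelow(P)
    return r, (x - r) % P

def open_shares(s0, s1):
    """Reconstruct a shared value by adding the two shares."""
    return (s0 + s1) % P

def beaver_triple():
    """Trusted-dealer triple (a, b, c) with c = a*b, handed out as shares."""
    a, b = secrets.randbelow(P), secrets.randbelow(P)
    return share(a), share(b), share((a * b) % P)

def beaver_mul(x_shares, y_shares):
    """Multiply shared x and y using one triple.

    The parties open only the masked values d = x - a and e = y - b, so the
    degree of the shares never grows, and the one opening can be batched
    across many multiplications running in parallel.
    """
    (a0, a1), (b0, b1), (c0, c1) = beaver_triple()
    x0, x1 = x_shares
    y0, y1 = y_shares
    # d and e reveal nothing about x and y because a and b are uniform masks.
    d = open_shares((x0 - a0) % P, (x1 - a1) % P)
    e = open_shares((y0 - b0) % P, (y1 - b1) % P)
    # Local recombination: z = c + d*b + e*a + d*e, computed share-wise.
    z0 = (c0 + d * b0 + e * a0 + d * e) % P
    z1 = (c1 + d * b1 + e * a1) % P
    return z0, z1

# Quick check: shared 7 * 9 reconstructs to 63.
x, y = share(7), share(9)
assert open_shares(*beaver_mul(x, y)) == (7 * 9) % P
```

The same recombination works entry-wise for matrix products with a matrix-valued triple C = A·B, which is what makes the dot products in the shared-matrix setting cheap per round.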

Syed Shah🏴‍☠️🌊
@syed
https://i.imgur.com/PsxRQFU.jpg
1 reply
0 recast
0 reaction