Lefteris Karapetsas
@lefteris.eth
Ran an experiment with my blood panel: converted it to CSV and sent it to 2 doctors and 3 LLMs. Not only did the LLMs match the doctors' conclusions, they had more detail and were easier to digest than the doctors. We are entering a new era of accessible diagnostics. But how do we query LLMs with sensitive data like health records without compromising privacy? Anyone working on solutions? Local inference, encrypted computation, zero-knowledge pipelines, anything else??
4 replies
0 recast
9 reactions
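One low-tech piece of the privacy question above is simply not sending identifying fields to a hosted LLM in the first place. A minimal Python sketch of that idea, using only the standard library: the column names here are hypothetical (real lab exports differ), and dropping direct identifiers is not full anonymization, just a first line of defense.

```python
import csv
import io

# Hypothetical identifying columns for illustration; real lab exports vary.
SENSITIVE = {"patient_name", "date_of_birth", "ssn", "address"}

def redact_csv(csv_text: str) -> str:
    """Drop columns that directly identify the patient, keep the lab values."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header = rows[0]
    keep = [i for i, col in enumerate(header) if col.lower() not in SENSITIVE]
    out = io.StringIO()
    writer = csv.writer(out)
    for row in rows:
        writer.writerow([row[i] for i in keep])
    return out.getvalue()

sample = "patient_name,hemoglobin_g_dl,glucose_mg_dl\nJane Doe,13.8,92\n"
print(redact_csv(sample))  # lab values survive, the name column does not
```

This addresses direct identifiers only; rare lab values can still be quasi-identifying, which is why the local-inference and encrypted-computation options in the post remain the stronger answers.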

s5eeo
@s5eeo
For privacy purposes, it’s easy to download and set up a model for local inference with Ollama, for example. It’s especially easy as a developer and takes just a few lines of code. Open-source models small enough to run on a laptop (e.g. Gemma 3) have gotten very powerful. This is not the most user-friendly option, but it has been working well for me.
1 reply
0 recast
0 reaction
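The "few lines of code" above can be sketched against Ollama's local REST API (it listens on port 11434 by default, so the CSV never leaves the machine). The model name, prompt, and sample data here are illustrative assumptions:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_payload(model: str, csv_text: str) -> dict:
    """JSON body for Ollama's /api/chat endpoint (streaming disabled)."""
    return {
        "model": model,
        "stream": False,
        "messages": [
            {
                "role": "user",
                "content": "Summarize this blood panel CSV in plain language:\n"
                + csv_text,
            },
        ],
    }

sample = "marker,value,unit\nhemoglobin,13.8,g/dL\nglucose,92,mg/dL\n"
payload = build_chat_payload("gemma3", sample)  # model name is an assumption

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(json.loads(resp.read())["message"]["content"])
except OSError:
    # No local Ollama server found; run `ollama serve` and `ollama pull gemma3`.
    print("Ollama not reachable; nothing was sent off-machine.")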

Lefteris Karapetsas pfp
Lefteris Karapetsas
@lefteris.eth
I have done it but the results were not nearly as good as the big ones (chatgpt, claude, gemini). Perhaps in a few years time? I surely hope so as that would be the easiest solution
0 reply
0 recast
1 reaction