Channel: https://warpcast.com/~/channel/aichannel

tombornal
@tombornal
if you want to run a workflow with an LLM locally because it'll be handling sensitive data, what's the best way to set this up? bonus points if it's easy to set up for someone non-technical

Jason
@jachian
For someone non-technical I'd look into n8n. They have self-hosted workflows and it's one of the drag-and-drop interfaces.
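Self-hosting n8n is commonly done with Docker. A minimal sketch, assuming Docker is already installed locally (the volume name is arbitrary; the image name follows n8n's published Docker image):

```shell
# Create a persistent volume so workflows survive container restarts
docker volume create n8n_data

# Run n8n locally; the editor UI becomes available at http://localhost:5678
docker run -it --rm --name n8n \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

Everything runs on the local machine, so workflow data stays off third-party servers.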

Jason
@jachian
And for a self-hosted LLM, you might need Ollama depending on your needs.
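Once Ollama is serving a model locally, a workflow step (in n8n or anything else) can call it over its local HTTP API so the sensitive data never leaves the machine. A minimal Python sketch, assuming Ollama's default endpoint at `localhost:11434` and a model named `llama3` already pulled (the model name is an assumption):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Request shape for Ollama's /api/generate endpoint;
    # stream=False asks for a single complete JSON response.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    # Sends the prompt to the locally running Ollama server;
    # nothing is transmitted off the machine.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server):
# print(ask("llama3", "Summarize this contract clause: ..."))
```

In n8n this same request can be made from an HTTP Request node pointed at the localhost URL, which keeps the whole pipeline drag-and-drop for a non-technical user.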