kevin
@kevinoconnell
wanted to try out deepseek-r1 this weekend and couldn't find a good, simple, lightweight UI, so I made one. pretty impressed by it, I think local open source models are getting a lot better! some thoughts (repo here https://github.com/kevoconnell/deepseek-chat):
4 replies
4 recasts
27 reactions
kevin
@kevinoconnell
lightweight UI to me = being able to start talking instantly, assuming you have ollama installed. I saw https://github.com/open-webui/open-webui , but didn't like it because it took a minute to start up
1 reply
0 recast
3 reactions
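(For context on what "start talking instantly assuming you have ollama installed" means in practice: ollama serves a local HTTP API on port 11434 by default, so a chat UI just posts to it. A minimal sketch, assuming you've already run `ollama pull deepseek-r1` and the ollama server is running locally — the helper names here are illustrative, not from the linked repo:)

```python
import json
import urllib.request

# Ollama's default local chat endpoint
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_payload(model: str, prompt: str) -> dict:
    """Build a single-turn, non-streaming request body for /api/chat."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON response instead of a token stream
    }


def chat(model: str, prompt: str) -> str:
    """Send one chat turn to a locally running ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


if __name__ == "__main__":
    # Requires a running ollama server with deepseek-r1 pulled.
    print(chat("deepseek-r1", "Say hello in one sentence."))
```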
kevin
@kevinoconnell
The use cases for local models seem clearer to me: a.) as the rest of the world starts to pick up LLMs, most sensitive industries (e.g. lawyers, doctors) will probably use local models b.) they will perform simple tasks that take a couple minutes max
1 reply
0 recast
7 reactions
agusti
@bleu.eth
LM studio gud
0 reply
0 recast
1 reaction
Garrett
@garrett
based
0 reply
0 recast
1 reaction