christopher
@christopher
4:17-long summary of the DeepSeek drama, plus realistic speed tests. Good video to share. https://youtu.be/o1sN1lB76EA
2 replies
0 recasts
10 reactions

cyrus
@cyrus
Was sure this would be clickbait and was mostly right πŸ₯²
1 reply
0 recasts
2 reactions

Hector
@noctis
I watched it and the promise was kept?
1 reply
0 recasts
0 reactions

cyrus
@cyrus
"Assuming you have a few 3090s you could run it at home" Right πŸ˜‚
1 reply
0 recasts
0 reactions

Hector
@noctis
He actually ran the model, as the caption said, so no clickbait?
1 reply
0 recasts
0 reactions

cyrus
@cyrus
At no point did he run the model causing the 'nightmares' everyone is talking about on a Pi, 'just' the 14b, and even that barely runs.
1 reply
0 recasts
0 reactions

christopher
@christopher
Is it normal to run 14b models on a Pi? I don't have any context for what's impressive or not. I liked how the creator explained why it's possible, then walked through how to do it.
1 reply
0 recasts
1 reaction

Hector
@noctis
Not at all: 1) it's never been done before, and 2) it's impressive to do any kind of ML/LLM stuff on a computer the price of my phone and the size of my bank card.
1 reply
0 recasts
0 reactions
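
Editor's note, for anyone without context on the "14b on a Pi" part of this thread: the usual route is a heavily quantized build of the model served by a local runtime such as Ollama or llama.cpp. The thread doesn't say which setup the video used, so the sketch below is an illustration under assumptions, not the creator's method: it assumes an Ollama server is already running on the Pi at its default port and that the deepseek-r1:14b tag has already been pulled.

import json
import urllib.request

# Illustrative sketch only: query a local Ollama server for a quantized 14b model.
# Assumes Ollama is installed and listening on its default port (11434),
# and that the deepseek-r1:14b tag has already been pulled onto the Pi.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "deepseek-r1:14b",
    "prompt": "In one sentence, why is running a 14b model on a Raspberry Pi hard?",
    "stream": False,  # wait for the full completion instead of streaming tokens
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())

# The /api/generate endpoint returns the completion text under "response".
print(result.get("response", ""))

On a Pi-class board this kind of request can take a long time to finish; the point made in the thread is that it runs at all, not that it runs well.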