Dan Romero
@dwr.eth
Everyone is focusing on the $6M training number. The pertinent issue is the 90% cheaper / more efficient per-use cost. "Your margin is my opportunity."
20 replies
42 recasts
259 reactions
AoK
@mlevinson
Not me. I skimmed the paper already. It’s an application of an old technique in a new way: distillation. I love it. They didn’t reinvent the wheel. They showed that OpenAI built a moat instead of working with constraints. Now, I think if VCs are smart they will dump the mentality of not funding new models. To top it off: so many people have been laid off from tech over the last two years (myself included). Surely from that pool they can fund a whole slew of new companies. This can be a good wake-up call.
1 reply
0 recast
1 reaction
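For context on the distillation idea referenced above, here is a minimal sketch of the standard formulation: a smaller student model is trained to match a larger teacher's softened output distribution alongside the usual hard-label loss. The function name, temperature, and weighting below are illustrative defaults, not values from any particular paper or model.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with the hard-label loss.

    T softens both distributions; alpha weights the two terms. Both are
    illustrative hyperparameters.
    """
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # KL term is scaled by T^2 so its gradient magnitude matches the CE term
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1 - alpha) * ce_term
```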
marv 🎙️
@marvp
I love your point about constraints. @wanderloots.eth and I reached the same conclusion when chatting about this
1 reply
0 recast
2 reactions
AoK
@mlevinson
Why, thank you. When I started my career in AI back in 2014, I was part of a $1M raise in grants to work on medical NLP. I could have worked with big, powerful machines, but hospital networks were very protective, so everything I built needed to run encapsulated inside their network. So I built online models and worked within the constraints of my laptop's 16GB of memory. I processed 500k records with millions of pages for my paper on severe sepsis back in 2016. It ran using online logistic regression and global vector representations (GloVe) from Stanford. My entire career was built on solving complex problems under constraints like these, because it's rare to have an unlimited budget. And even when you do, a war chest for lean economic times is more valuable than a depreciating piece of technology.
0 reply
0 recast
1 reaction
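A rough sketch of the kind of memory-bounded pipeline described in that last cast: records are streamed in small batches, each document is reduced to an averaged pre-trained GloVe vector, and an online logistic regression is updated with partial_fit so only the current batch and the embedding table sit in RAM. The file path, batch size, labels, and sample records below are placeholders, not details from the original work.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def load_glove(path):
    """Load pre-trained GloVe vectors into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def embed(text, glove, dim=100):
    """Average the GloVe vectors of the tokens in a document."""
    vecs = [glove[w] for w in text.lower().split() if w in glove]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim, dtype=np.float32)

def stream_batches(records, glove, batch_size=1000):
    """Yield (X, y) batches so only one batch is ever held in memory."""
    X, y = [], []
    for text, label in records:  # records is any iterator of (text, label) pairs
        X.append(embed(text, glove))
        y.append(label)
        if len(X) == batch_size:
            yield np.vstack(X), np.asarray(y)
            X, y = [], []
    if X:
        yield np.vstack(X), np.asarray(y)

# loss="log_loss" makes SGDClassifier an online logistic regression
# (older scikit-learn versions spell it loss="log")
clf = SGDClassifier(loss="log_loss", alpha=1e-5)
glove = load_glove("glove.6B.100d.txt")  # placeholder path to GloVe vectors

# placeholder records standing in for a stream read from disk or a database
my_records = [
    ("patient presented with fever and hypotension", 1),
    ("routine follow-up visit, no acute findings", 0),
]

for X_batch, y_batch in stream_batches(my_records, glove):
    # classes is required on the first call; passing the same array later is harmless
    clf.partial_fit(X_batch, y_batch, classes=np.array([0, 1]))
```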