Alessandro pfp
Alessandro
@azeni
how can one avoid building a chatgpt wrapper?
5 replies
0 recast
1 reaction

Britt Kim pfp
Britt Kim
@brittkim.eth
For LLMs, grab a model from 🤗 and fine-tune the head. But I still think you should make a wrapper that's interoperable with ChatGPT, just in case they drop a v4.5 that blows everything away.
2 replies
0 recast
2 reactions
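
A minimal sketch of what "fine-tune the head" could look like with a 🤗 transformers model, assuming a text-classification task; the checkpoint name and toy batch are placeholders for illustration:

```python
# Sketch: freeze a pretrained 🤗 backbone and train only the classification head.
# The model name, task, and data below are assumptions, not from the thread.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased"  # hypothetical base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Freeze the backbone so only the newly added head receives gradient updates.
for param in model.base_model.parameters():
    param.requires_grad = False

optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

# One illustrative training step on a toy batch.
batch = tokenizer(["great product", "terrible product"], return_tensors="pt", padding=True)
labels = torch.tensor([1, 0])
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```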

Britt Kim pfp
Britt Kim
@brittkim.eth
I just mean abstracting away from the model in code, so you can plug-and-play different models, including one that wraps the ChatGPT API.
1 reply
0 recast
1 reaction
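
A rough sketch of that abstraction, assuming a small interface that application code targets; all class and function names here are made up for illustration, with one backend wrapping the ChatGPT API and one wrapping a local 🤗 pipeline:

```python
# Sketch: hide the model behind a small interface so backends are swappable.
# Chatbot, OpenAIChatbot, LocalChatbot, and answer are illustrative names.
from typing import Protocol

class Chatbot(Protocol):
    def reply(self, prompt: str) -> str: ...

class OpenAIChatbot:
    """Backend that wraps the ChatGPT API."""
    def __init__(self, client, model: str = "gpt-4"):
        self.client = client          # e.g. an openai.OpenAI() client instance
        self.model = model

    def reply(self, prompt: str) -> str:
        resp = self.client.chat.completions.create(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

class LocalChatbot:
    """Backend that wraps a locally hosted model, e.g. a 🤗 text-generation pipeline."""
    def __init__(self, pipeline):
        self.pipeline = pipeline      # e.g. transformers.pipeline("text-generation", ...)

    def reply(self, prompt: str) -> str:
        return self.pipeline(prompt, max_new_tokens=128)[0]["generated_text"]

def answer(bot: Chatbot, prompt: str) -> str:
    # Application code only talks to the interface, never to a specific vendor.
    return bot.reply(prompt)
```

With this shape, letting the user pick the model is just a matter of constructing a different `Chatbot` implementation at startup; the rest of the app stays unchanged.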

Alessandro pfp
Alessandro
@azeni
i see. yes, agree. also the notion of the user 'picking the model' and having control over which one to use seems like a nice track. although i wonder if in 10+ years it's gonna be one mega base model, or every human with their own local model. like a tinybox. https://tinygrad.org/
1 reply
0 recast
1 reaction