Dan Romero
@dwr.eth
Let's say you have a corpus of text — 10 million words — about a specific topic. 1. What's the best way to "train a model" on that text? 2. Is that even the right term? Or is it using an existing foundational model and then augmenting it? Fine-tuning it? Something else?
18 replies
2 recasts
117 reactions
wizard not parzival (shoni)
@alexpaden
training is good when you want it to reply in a certain style (e.g. with sarcasm) or with prepared, filtered responses. for surfacing new data in unique results you're talking about embeddings (retrieval over your corpus), not training. i think i am starting with the openai gpt api but will add others later, hence the name unbias
0 reply
0 recast
0 reaction
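The embedding/retrieval approach the reply points at can be sketched in a few lines: embed the query and every passage in the corpus, rank passages by similarity, and prepend the top matches to the prompt. The sketch below is illustrative only, assuming a toy bag-of-words embedding in place of a real embeddings API; the corpus strings and function names are hypothetical.

```python
from collections import Counter
import math

def embed(text):
    # Toy "embedding": bag-of-words term counts.
    # A real system would call a model's embeddings endpoint instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    # Rank corpus passages by similarity to the query; keep the top k.
    q = embed(query)
    return sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)[:k]

corpus = [
    "fine-tuning adjusts a model's weights on new examples",
    "embeddings map text to vectors for similarity search",
    "retrieval augments a prompt with the most relevant passages",
]

passages = retrieve("how do embeddings and similarity search work", corpus)
# Augment the prompt with retrieved context before sending it to the model.
prompt = "Answer using this context:\n" + "\n".join(passages)
```

This is the "augmenting an existing foundation model" option from the question: the base model is untouched, and the corpus is injected at query time rather than baked in by fine-tuning.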