blobs
@blobs
i'm figuring out how to train an LLM and documenting each step in this blog series. for anyone who is also curious, here is part 1: https://michaelhly.com/posts/train-llm-one
5 replies
5 recasts
29 reactions
vincent
@pixel
blobs, are you doing AI for blobs (the product) or is it exploration? also: what limit should i reach before i consider tuning Llama? what does it look like?
1 reply
0 recast
0 reaction
blobs
@blobs
a) just exploration. b) i'm not sure what you mean by limit, but hugging face has pretrained llamas you can grab off the shelf: https://huggingface.co/meta-llama ... my belief is that you should fine-tune a model when you need it to fit your own dataset, otherwise you probably don't need to ... (rough sketch of the off-the-shelf route below)
1 reply
0 recast
0 reaction
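
a minimal sketch of the "grab a pretrained llama off the shelf" route mentioned above, assuming the hugging face transformers and torch libraries are installed and you have access to the gated meta-llama repo; the model id below is just an example, not something named in the thread:

```python
# sketch: load a pretrained llama checkpoint from the hugging face hub
# and generate a few tokens with it -- no fine-tuning involved.
# assumption: the model id is illustrative; swap in whichever meta-llama
# checkpoint you actually have access to.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # example id; the repo is gated, request access on the hub first

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# quick smoke test: feed a prompt and decode the continuation
inputs = tokenizer("training an llm from scratch is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

if the off-the-shelf outputs aren't a good fit for your data, that's roughly the point where fine-tuning on your own dataset starts to make sense.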