maurelian
@maurelian.eth
I want to be able to have a conversation with an LLM that has full context on a large piece of text, one that is too large to copy/paste into ChatGPT. For example, I might want to give it the text of a book or the contents of a repo and (I think the term is) fine-tune it. Does anyone know how I'd go about that?

Petra ⊙
@0xpetra
Ran into the same question. Basically what you're looking for are embeddings. tl;dr: you translate the text into vectors the model can compare (embeddings) and store them. Later you retrieve the passages relevant to your question and feed them to the model as input alongside it.
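A minimal sketch of that embed, store, retrieve flow, assuming the v1-style openai Python package and numpy; the model names, chunking, and top-3 cutoff here are illustrative choices, not prescriptions:

import numpy as np
from openai import OpenAI  # assumes the v1-style openai Python client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts):
    # Turn each chunk of text into a vector the model can compare numerically.
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

# 1. Split the book/repo into chunks and embed them once, up front.
chunks = ["chunk 1 of the book...", "chunk 2...", "chunk 3..."]  # placeholder text
chunk_vectors = embed(chunks)

# 2. At question time, embed the question and rank chunks by cosine similarity.
question = "What does chapter 3 say about X?"
q_vec = embed([question])[0]
scores = chunk_vectors @ q_vec / (
    np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q_vec)
)
top_chunks = [chunks[i] for i in np.argsort(scores)[::-1][:3]]

# 3. Feed only the retrieved chunks to the chat model as context.
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": "Context:\n" + "\n\n".join(top_chunks)
                                     + "\n\nQuestion: " + question},
    ],
)
print(answer.choices[0].message.content)

In practice you'd chunk the book or repo programmatically and keep the vectors in a vector store rather than in memory, but the flow is the same: embed once, retrieve per question, and pass only the retrieved chunks to the model.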

maurelian
@maurelian.eth
Have a favourite resource for applying embeddings to my use case?