Kerim Kaya
@kerimkaya
We witnessed one of the largest on-chain transaction volumes ever: 10 million vector storage and search queries in just 24 hours. The entire English Wikipedia as a smart contract, a public RAG knowledge base whose embeddings live on-chain.
1 reply
0 recast
1 reaction
Kerim Kaya
@kerimkaya
You can unlock the full potential of local LLMs by bringing a large external knowledge corpus right into your local environment, running from HDDs instead of RAM.
- Embedded entire articles, not only titles, with bge-large-en-v1.5
- Summarized long articles for better retrieval
1 reply
0 recast
1 reaction
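The indexing idea above (embed full article bodies, summarizing long ones first) can be sketched as follows. This is an illustrative outline only, not Dria's actual pipeline: the `embed` function is a toy stand-in for bge-large-en-v1.5 (which would normally be loaded via sentence-transformers), and plain truncation stands in for LLM summarization.

```python
import hashlib
import math

def embed(text: str, dim: int = 8) -> list[float]:
    """Toy deterministic embedding (stand-in for bge-large-en-v1.5)."""
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]  # unit-normalized for cosine search

def index_article(title: str, body: str, max_words: int = 512) -> dict:
    """Embed the whole article body, truncating long text as a crude
    'summarize for better retrieval' step (the real pipeline summarizes)."""
    words = body.split()
    summary = " ".join(words[:max_words])
    return {"title": title, "embedding": embed(summary)}

entry = index_article("Ankara", "Ankara is the capital of Turkey. " * 100)
```

The key point the thread makes is that the vector covers the article body, not just the title, so retrieval matches on actual content.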
Kerim Kaya
@kerimkaya
- Enabled combining vectors and keywords for precise results
- Free, permanent availability on decentralized storage
- All packed into a dead-simple CLI: `dria serve`
1 reply
0 recast
1 reaction
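Combining vector and keyword signals, as mentioned above, can be sketched as a simple score blend. A minimal sketch over a toy corpus; the `alpha` weighting and both scoring functions are assumptions for illustration, not Dria's implementation.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

def keyword_score(query: str, text: str) -> float:
    """Fraction of query terms that appear in the document."""
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / (len(q) or 1)

def hybrid_search(query: str, query_vec: list[float], docs: list[dict],
                  alpha: float = 0.7) -> list[tuple[float, str]]:
    """Blend vector similarity with keyword overlap; higher alpha
    favors the vector signal (alpha is an assumed tunable)."""
    scored = [
        (alpha * cosine(query_vec, d["vec"])
         + (1 - alpha) * keyword_score(query, d["text"]), d["text"])
        for d in docs
    ]
    return sorted(scored, reverse=True)

docs = [
    {"text": "ankara capital turkey", "vec": [1.0, 0.0]},
    {"text": "paris capital france", "vec": [0.0, 1.0]},
]
results = hybrid_search("capital of turkey", [1.0, 0.0], docs)
```

The blend lets exact keyword hits rescue queries where embeddings alone are ambiguous, which is the "precise results" claim in the post.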
Kerim Kaya
@kerimkaya
The Wikipedia.en index is not just a tool; it's a gateway to enhancing local LLMs, including fine-tunes of open models like Llama-70B, Mistral-7B, or StableLM. Stay tuned for benchmark evaluations!
1 reply
0 recast
1 reaction
Kerim Kaya
@kerimkaya
How to use? Dria is used through the Dria CLI. You only need Node and Docker installed; then you simply run:
dria fetch uaBIB4kh7gYh6vSNL7V2eygfbyRu9vGZ_nJ6jKVn_x8
dria serve wikipedia.20220301.en
https://github.com/firstbatchxyz/dria-cli
0 reply
0 recast
1 reaction