typeof.eth
@typeof.eth
Update: Read a bunch of articles, watched a bunch of videos, used o1 and Cursor to make a project that uses a local llama model to autonomously work on your GitHub projects (RYO Devin, basically). The project implements a lot of the concepts I wanna learn about. Now I have to get this thing running and start reading through the code. I haven't tested it yet. Gonna do that tomorrow and report back. Repo below for the curious. https://warpcast.com/typeof.eth/0xf248c080 https://github.com/dgca/local-code-agent
3 replies
8 recasts
18 reactions
Phil Cockfield
@pjc
Have you been able to do this without a Python runtime being required in the stack, @typeof.eth?
1 reply
0 recast
1 reaction
typeof.eth
@typeof.eth
So far! Ollama can run in your terminal (so easy to set up, seriously takes like 5 minutes) and starts an HTTP server by default. I'm gonna need Python to run ChromaDB, but they have a Docker image, so that'll take care of that. I still don't understand how you're supposed to use Python packages on macOS haha
1 reply
0 recast
2 reactions
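For anyone following along, "starts an HTTP server by default" looks roughly like this in practice: Ollama listens on localhost:11434, so a plain fetch from TypeScript is enough to get a completion. This is only a minimal sketch, not code from the repo above; the model name ("llama3.1") is an assumption and would need to have been pulled with `ollama pull` first.

  // Minimal sketch: ask a locally running Ollama server for a completion.
  // Assumes Ollama's default port (11434); the model name is a guess.
  async function askLocalModel(prompt: string): Promise<string> {
    const res = await fetch("http://localhost:11434/api/generate", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "llama3.1", prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
    const data = await res.json();
    return data.response; // with stream: false, the full text lands here
  }

  askLocalModel("Explain what a vector database is in one sentence.")
    .then(console.log)
    .catch(console.error);

No Python anywhere in that path, which is the point of the question above.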
Phil Cockfield
@pjc
Is ChromaDB currently de rigueur? Thanks, good to know. Man, I'd love to keep Python out of my critical runtime dependencies, but I guess Docker IS the "don't-worry-about-it™️" abstraction. Wish it could all just be TS/WASM though.
1 reply
0 recast
0 reaction
typeof.eth
@typeof.eth
Chroma seems to be _the_ vector database for AI 🤷‍♂️ Yeah, at least we can Dockerify. I always feel like I'm gonna brick my laptop when I have to install Python deps
1 reply
0 recast
1 reaction
Phil Cockfield
@pjc
"ollama can run in your terminal (so easy to set up, seriously takes like 5 minutes)" π«‘ - truth! thanks @typeof.eth
1 reply
0 recast
1 reaction
typeof.eth
@typeof.eth
Hell yeah, have fun digging into this stuff!
0 reply
0 recast
1 reaction