typeof.eth
@typeof.eth
Update: Read a bunch of articles, watched a bunch of videos, used o1 and Cursor to make a project that uses a local llama model to autonomously work on your GitHub projects (RYO Devin basically). The project implements a lot of the concepts I wanna learn about. Now I have to get this thing running and start reading through the code. I haven't tested it yet. Gonna do that tomorrow and report back. Repo below for the curious. https://warpcast.com/typeof.eth/0xf248c080 https://github.com/dgca/local-code-agent
3 replies
8 recasts
21 reactions
Phil Cockfield
@pjc
Have you been able to do this without a Python runtime being required in the stack, @typeof.eth?
1 reply
0 recast
1 reaction
typeof.eth
@typeof.eth
So far! Ollama can run in your terminal (so easy to set up, seriously takes like 5 minutes) and starts an HTTP server by default.
I'm gonna need Python to run ChromaDB, but they have a Docker image so that'll take care of that.
I still don't understand how you're supposed to use Python packages on macOS haha
1 reply
0 recast
2 reactions
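
For context on the Ollama point above: the local Ollama server exposes an HTTP API on port 11434, so an agent can talk to the model with plain fetch calls and no Python in the stack. Here's a minimal TypeScript sketch of that; the model name and prompt are placeholders I'm assuming for illustration, not anything taken from the local-code-agent repo. (ChromaDB is the one piece that still needs Python, but running its official Docker image keeps that runtime inside the container.)

```ts
// Minimal sketch: calling a local Ollama server over HTTP (default port 11434).
// Assumes Ollama is already running and a model like "llama3.1" has been pulled;
// the model name and prompt below are placeholders, not from the repo.

type GenerateResponse = {
  response: string;
  done: boolean;
};

async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // whichever model you've pulled locally
      prompt,
      stream: false, // ask for a single JSON object instead of a token stream
    }),
  });

  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }

  const data = (await res.json()) as GenerateResponse;
  return data.response;
}

// Example usage:
askLocalModel("Summarize what a code agent's planning step should do.")
  .then(console.log)
  .catch(console.error);
```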