Adam pfp
Adam
@adam-
The days of navigating bad or convoluted documentation are over. Once I started pulling docs into a local LLM to simplify and interact with them, there was no going back. If anything, I've found this to be such an "a-ha" moment for people who aren't even passively interested in AI that they rethink what is possible. This is progress.
2 replies
0 recast
6 reactions

links 🏴 pfp
links 🏴
@links
What local LLM are you using? How are you pulling docs into it?
2 replies
0 recast
2 reactions

Adam pfp
Adam
@adam-
Ollama via Cursor. To pull docs in locally I'm using HTTrack (http://httrack.com/page/1/en/index.html). While I haven't benchmarked it, I've found this method noticeably faster (likely due to caching) than referencing the documentation's URL directly. However, I still reference live documentation or Git repos via their URLs depending on the situation. Tagging @pjc, who seemed interested in this as well.
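A minimal sketch of the two pieces described above (mirroring docs with HTTrack, then querying a local Ollama instance). The docs URL, output paths, and model name are placeholders, and this hits Ollama's REST API directly rather than going through Cursor as in the actual setup:

```shell
# Mirror the documentation site locally with HTTrack.
# docs.example.com is a placeholder; -O sets the output directory,
# the "+" filter keeps the crawl on the docs domain, -r3 caps the depth.
httrack "https://docs.example.com/" -O ./docs-mirror "+docs.example.com/*" -r3

# Ask a local Ollama model about one of the mirrored pages.
# Assumes Ollama is running on its default port (localhost:11434)
# with the llama3 model already pulled; jq builds safely-escaped JSON.
curl -s http://localhost:11434/api/generate -d "$(
  jq -n \
    --arg page "$(head -c 4000 ./docs-mirror/docs.example.com/index.html)" \
    '{model: "llama3", prompt: ("Summarize this documentation page:\n" + $page), stream: false}'
)"
```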
1 reply
0 recast
2 reactions

Phil Cockfield pfp
Phil Cockfield
@pjc
πŸ‘οΈπŸ‘οΈ
0 reply
0 recast
1 reaction