David Furlong
@df
LLM coding assistants will probably be browser-first like Replit rather than desktop-first like Cursor in the long term

links 🏴
@links
Really? Full-OS LLMs make more sense to me, for any text input

Shashank
@0xshash
and eventually mobile-first

BBB🎩
@0xbuilders
I gave Cursor to my gf (she's not technical) and she wrote a TG bot for her work: a simple leaderboard to grant and spend employee points. The only issue with Cursor is that it reads all the .envs, and I don't wanna share my deployment pk

MetaEnd🎩
@metaend.eth
Very much depends. Local-first is definitely moving with WASM and the ability to run small LLMs directly in the browser. Other than that, you can already run VS Code with AI extensions in the browser (or IDX by Google, but it still sucks)
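
(For illustration of the in-browser, WASM-backed approach mentioned above: a minimal sketch using the transformers.js library, which runs small models client-side via ONNX Runtime's WASM/WebGPU backends. The library choice and model name are assumptions for the example, not something anyone in the thread named.)

```ts
// Minimal sketch: running a small LLM entirely in the browser with
// transformers.js. The model is downloaded once, cached by the browser,
// and inference happens locally with no server round-trip.
import { pipeline } from '@xenova/transformers';

async function main() {
  // Example model; any small text-generation model published in ONNX
  // format would work here.
  const generator = await pipeline(
    'text-generation',
    'Xenova/TinyLlama-1.1B-Chat-v1.0'
  );

  const output = await generator(
    'Write a TypeScript function that reverses a string:',
    { max_new_tokens: 64 }
  );

  // Returns an array of { generated_text } results.
  console.log(output);
}

main();
```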

s5eeo
@s5eeo
Why do you think so?