Dan Romero pfp
Dan Romero
@dwr
To argue the other side: 1. It's fundamentally atoms-limited by GPU capacity? 2. Attention is still scarce / fickle? 3. Regulatory capture / interfacing with analog / bureaucratic systems still exist?
2 replies
0 recast
0 reaction

Royal pfp
Royal
@royalaid.eth
I would say a lot of people are gonna learn a lot about the Gartner Hype Cycle. It's 100% true LLMs will have massive impacts, but their impacts are being overstated right now. Current limits are context sizes, and those will be solved. Next steps will be wrangling LLMs through meaningful prompt crafting for targeted output.
0 reply
0 recast
0 reaction

🗿 pfp
🗿
@bias
Gooby pls
1 reply
0 recast
0 reaction

🗿 pfp
🗿
@bias
https://i.imgur.com/Z50Tk5y.jpg
1 reply
0 recast
0 reaction

Dan Kenney 💜 pfp
Dan Kenney 💜
@dankenney.eth
Sort of the meme version of the All-In pod. I enjoy the show, but it's lots of daffy
0 reply
0 recast
0 reaction