Dan Romero
@dwr.eth
To argue the other side: 1. It's fundamentally atoms-limited by GPU capacity? 2. Attention is still scarce / fickle? 3. Regulatory capture / interfacing with analog / bureaucratic systems still exist?
2 replies
0 recast
0 reaction
Royal
@royalaid.eth
I would say a lot of people are gonna learn a lot about the Gartner Hype Cycle. It's 100% true LLMs will have massive impacts, but their impacts are being overstated right now. Current limits are context sizes and those will be solved. Next steps will be wrangling LLMs through meaningful prompt crafting for targeted output.
1 reply
0 recast
0 reaction
timdaub
@timdaub.eth
Chamath said that AutoGPT is ushering in a new paradigm for startups where the rounds for a pre-MVP company are now smaller, because you can replace human cognition with machines. He's at "New Paradigm!!!" https://i.imgur.com/aWarged.jpg
0 reply
0 recast
0 reaction
🗿
@bias
Gooby pls
1 reply
0 recast
0 reaction
🗿
@bias
https://i.imgur.com/Z50Tk5y.jpg
1 reply
0 recast
0 reaction
Dan Kenney 💜
@dankenney.eth
Sort of the meme version of the All-In pod. I enjoy the show, but a lot of it is daffy.
0 reply
0 recast
0 reaction