Giuliano Giacaglia 🌲
@giu
If you had to bet on one "AI startup" for the next 10 years, which one would it be and why?
18 replies
0 recast
14 reactions

Joe Blau 🎩
@joeblau
https://www.neurophos.com - Either bet on low-level (teams building hardware that makes training/inference faster) or bet on high-level (teams providing the solved customer experience with private data sets and expert labelers). Wrappers are a path to acquihires, not 10x+ ROI. This is a low-level bet.
1 reply
0 recast
2 reactions

King
@king
Aren't "wrappers" selling an experience though? ChatGPT is a wrapper.
1 reply
0 recast
0 reaction

Joe Blau 🎩
@joeblau
Link Reader plugin removed Aug 2: https://www.reddit.com/r/ChatGPT/comments/15g7ifo/cant_find_link_reader_plug_in_in_gpt_4/ Link reading built in natively Sept 27: https://www.theverge.com/2023/9/27/23892781/openai-chatgpt-live-web-results-browse-with-bing Any valuable wrapper will become a feature.
1 reply
0 recast
0 reaction

π’‚­_π’‚­
@m-j-r.eth
ironically, there is a wrapper experience that chatgpt can't inherit. if everyone could publicly engage with convo trees as prompter, respondent, or critic, and all of these were embedded/indexed, then synthesizing & curating datasets would be just as big as the OG interface.
1 reply
0 recast
0 reaction
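
A minimal sketch of the convo-tree idea in the cast above, assuming illustrative names (Turn, embed, export_records are not any existing API): each public turn carries a role (prompter, respondent, or critic), gets embedded and indexed as it lands, and any chain can be flattened into records for dataset synthesis and curation. The embedding below is a toy hash-based stand-in, not a real encoder.

# every public turn is tagged with a role, embedded, and linked into a tree
# so prompt/response/critique chains can be exported as curated training data
import hashlib
import math
from dataclasses import dataclass, field

ROLES = {"prompter", "respondent", "critic"}

def embed(text: str, dim: int = 16) -> list[float]:
    """Toy deterministic embedding; swap in a real sentence encoder in practice."""
    digest = hashlib.sha256(text.encode()).digest()
    vec = [b / 255.0 for b in digest[:dim]]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

@dataclass
class Turn:
    author: str
    role: str                      # one of ROLES
    text: str
    parent: "Turn | None" = None
    children: list["Turn"] = field(default_factory=list)
    vector: list[float] = field(init=False)

    def __post_init__(self) -> None:
        assert self.role in ROLES, f"unknown role: {self.role}"
        self.vector = embed(self.text)      # index every turn on arrival
        if self.parent is not None:
            self.parent.children.append(self)

def chain(leaf: Turn) -> list[Turn]:
    """Walk from a leaf back to the root to recover one dialogue chain."""
    out, node = [], leaf
    while node is not None:
        out.append(node)
        node = node.parent
    return list(reversed(out))

def export_records(leaf: Turn) -> list[dict]:
    """Flatten one chain into records a curator could filter, rank, and train on."""
    return [{"role": t.role, "author": t.author, "text": t.text} for t in chain(leaf)]

# usage: a tiny public tree mirroring this thread, dumped as dataset records
root = Turn("giu", "prompter", "Which AI startup would you bet on for 10 years?")
reply = Turn("joeblau", "respondent", "Bet low-level: hardware that speeds up training/inference.", parent=root)
critique = Turn("m-j-r.eth", "critic", "Open convo trees could matter as much as the interface.", parent=reply)
print(export_records(critique))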

Joe Blau 🎩
@joeblau
You mean doing it on something like the Farcaster protocol?
1 reply
0 recast
0 reaction

π’‚­_π’‚­
@m-j-r.eth
would definitely need some longform media besides basic deltas, but yes, why not? the social networks that earnestly compel their userbase to fill in convo trees with genuinely high-quality dialogue will be most resilient to eternal chatember. chatgpt, otoh, just has a moat made of wrappers.
1 reply
0 recast
0 reaction

Joe Blau 🎩
@joeblau
Yeah, in this case, if the data is fully open, you're right. Then the problem becomes finding enough compute to actually train anything useful. Until we figure out how to get more out of smaller training sets, you're going to have to get your hands on more hardware. That's why I suggested "go low-level" 🫑
1 reply
0 recast
0 reaction