Dan Romero pfp
Dan Romero
@dwr.eth
Still haven't wrapped my head around this one.
9 replies
3 recasts
65 reactions

matthias pfp
matthias
@iammatthias
If I'm grokking this correctly, it's about scaling inference compute, not just training compute. The more you throw at both, the further the models push past the expected limits. Curious to see what this practically unlocks. Rumor is 4.5's dropping this fall, and 5 could be here as early as the new year.
0 reply
0 recast
0 reaction
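
To make the inference-scaling idea from matthias's reply concrete: one common way to spend more compute per query at test time is best-of-n sampling, where you draw several candidate answers and keep the highest-scoring one. Below is a minimal Python sketch; the generate_candidate and score_candidate functions are hypothetical stand-ins for a real LLM and verifier, nothing here is from the thread itself.

```python
import random

# Hypothetical stand-ins: a real system would call an LLM API here
# and score candidates with a verifier or reward model.
def generate_candidate(prompt: str, rng: random.Random) -> str:
    # Placeholder: a real model would sample with temperature > 0.
    return f"answer-{rng.randint(0, 9)} to: {prompt}"

def score_candidate(prompt: str, candidate: str, rng: random.Random) -> float:
    # Placeholder: a real verifier would rate answer quality.
    return rng.random()

def best_of_n(prompt: str, n: int, seed: int = 0) -> str:
    """Spend more inference compute (larger n) per query:
    sample n candidates and keep the highest-scoring one."""
    rng = random.Random(seed)
    candidates = [generate_candidate(prompt, rng) for _ in range(n)]
    return max(candidates, key=lambda c: score_candidate(prompt, c, rng))

if __name__ == "__main__":
    # Answer quality tends to rise with n, at the cost of n times the compute.
    print(best_of_n("What is 17 * 24?", n=8))
```

The point of the sketch is the knob: n trades extra inference compute for better expected output, independently of how much compute went into training.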