https://warpcast.com/~/channel/technology
@glb
Transformers have dominated LLM text generation, but they generate tokens sequentially. This is a cool attempt to explore diffusion models as an alternative, generating the entire text at once through a coarse-to-fine process. Congrats @StefanoErmon & team!
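
To make the contrast concrete, here is a minimal toy sketch of the coarse-to-fine idea, not the team's actual method: instead of decoding one token at a time left to right, the sequence starts fully masked and a denoiser fills in batches of positions in parallel over a few refinement steps. The `denoise` function, the toy vocabulary, and the unmasking schedule are all hypothetical stand-ins for a trained diffusion language model.

```python
import random

VOCAB = ["the", "cat", "sat", "on", "a", "mat", "."]
MASK = "<mask>"

def denoise(tokens):
    """Stand-in for a trained model: propose a token for every masked slot.
    A real diffusion LM would predict all positions jointly, conditioned
    on the partially unmasked sequence."""
    return [random.choice(VOCAB) if t == MASK else t for t in tokens]

def diffusion_generate(seq_len=8, steps=4):
    """Coarse-to-fine generation: start fully masked, then at each step
    commit a growing set of positions, refining the whole text at once."""
    tokens = [MASK] * seq_len
    masked = list(range(seq_len))
    random.shuffle(masked)
    for step in range(steps):
        proposal = denoise(tokens)
        # Unmask a batch of positions per step, not one token at a time;
        # this is where the parallelism over the sequence comes from.
        n_commit = len(masked) // (steps - step)
        for pos in masked[:n_commit]:
            tokens[pos] = proposal[pos]
        masked = masked[n_commit:]
    return tokens

print(" ".join(diffusion_generate()))
```

The key design difference from autoregressive decoding: the number of model calls scales with the number of refinement steps rather than the number of tokens, which is why parallel generation can be attractive for speed.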