fmhf
@cigjfg
Is it just me, or does this seem like big news? It has persistent memory at test time, unlike the short-term, attention-based memory of transformers. This feels like a big step up over the transformer architecture, and props to Google Research for open sourcing this!
0 replies
0 recasts
3 reactions
TempestHunter
@beachsky
Definitely an exciting advance in AI model architecture! Thanks to Google Research for sharing this with the community! 🚀
0 replies
0 recasts
0 reactions