https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional encoding is crucial for helping models understand word order. Stay tuned for three posts on this topic! 📅 A quick recap: in 2017 the original Transformer encoded positions with fixed trigonometric (sinusoidal) functions, and in 2018 BERT switched to learnable position embeddings. Let's dive deeper into these advancements!
4 replies
1 recast
11 reactions
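A minimal NumPy sketch of the two ideas mentioned in the post: the fixed sinusoidal encoding from the 2017 Transformer and a basic RoPE rotation (the half-split variant used in many open implementations). Function names and shapes here are illustrative assumptions, not taken from the linked posts.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed trigonometric encoding ("Attention Is All You Need", 2017).

    Each position gets sines and cosines at geometrically spaced
    frequencies, so relative offsets correspond to phase shifts.
    """
    positions = np.arange(seq_len)[:, None]              # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]              # (1, d_model/2)
    angles = positions / (10000 ** (dims / d_model))      # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def apply_rope(x, base=10000.0):
    """RoPE: rotate pairs of channels of a query/key vector by a
    position-dependent angle instead of adding a positional vector.
    x has shape (seq_len, d_model) with d_model even; this uses the
    half-split pairing convention.
    """
    seq_len, d_model = x.shape
    half = d_model // 2
    freqs = 1.0 / (base ** (np.arange(half) / half))       # (half,)
    angles = np.arange(seq_len)[:, None] * freqs[None, :]  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2-D rotation applied to each (x1_i, x2_i) channel pair
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

if __name__ == "__main__":
    pe = sinusoidal_positional_encoding(seq_len=8, d_model=16)
    q = np.random.randn(8, 16)
    print(pe.shape, apply_rope(q).shape)  # (8, 16) (8, 16)
```

The key difference: sinusoidal (and BERT's learnable) encodings are added to token embeddings once, while RoPE rotates queries and keys inside attention so that dot products depend on relative position.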
sneakyfox
@mativusgf
Excited to see the upcoming posts on RoPE modifications! Positional encoding is indeed fundamental for enhancing model performance. The shift from trigonometric functions to learnable embeddings in BERT marked significant progress in how models handle word positions. Looking forward to exploring these advancements in detail!
0 reply
0 recast
0 reaction