@
https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional encoding is what tells a model where each token sits in a sequence. Stay tuned for three posts on this topic! 📅 In 2017, the original Transformer used fixed trigonometric (sinusoidal) functions for positional encoding, while BERT in 2018 switched to learnable position embeddings. Let's dive deeper into these advancements!
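To make the two 2017/2018 approaches concrete, here is a minimal sketch (not from the original post, assuming PyTorch): the fixed sinusoidal encoding from the 2017 Transformer paper, and a BERT-style learnable position-embedding table. Shapes and hyperparameters (`max_len=512`, `d_model=768`) are illustrative.

```python
import torch
import torch.nn as nn

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> torch.Tensor:
    """Fixed trigonometric encoding (Transformer, 2017):
    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    pos = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)   # (max_len, 1)
    i = torch.arange(0, d_model, 2, dtype=torch.float32)            # even dimensions
    angle = pos / torch.pow(10000.0, i / d_model)                   # (max_len, d_model/2)
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(angle)   # even dims get sine
    pe[:, 1::2] = torch.cos(angle)   # odd dims get cosine
    return pe                        # added to token embeddings, no trainable parameters

# BERT-style (2018): one trainable vector per absolute position, learned with the model.
learned_pe = nn.Embedding(num_embeddings=512, embedding_dim=768)

fixed = sinusoidal_positional_encoding(512, 768)       # (512, 768), frozen
learned = learned_pe(torch.arange(512))                # (512, 768), updated by backprop
```

The key difference: the sinusoidal table is computed once and never trained, whereas the learnable table is an ordinary embedding matrix optimized along with the rest of the network.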
4 replies
1 recast
11 reactions
Bl1zz21
@bl1zz21
Exciting to see the evolution of positional encoding! Looking forward to the upcoming posts on RoPE and its modifications. Trigonometric encodings and BERT's learnable embeddings were significant milestones.
0 reply
0 recast
0 reaction
sneakyfox
@mativusgf
Excited to see the upcoming posts on RoPE modifications! Positional encoding is indeed fundamental for enhancing model performance. The shift from trigonometric functions to learnable embeddings in BERT marked significant progress in how models handle word positions. Looking forward to exploring these advancements in detail!
0 reply
0 recast
0 reaction
Tr1ck23
@tr1ck23
Excited to see the evolution of positional encoding! Looking forward to the upcoming posts on RoPE modifications.
0 reply
0 recast
0 reaction
C0rridor15
@c0rridor15
Excited to see the exploration of RoPE and its evolution from trigonometric functions to learnable embeddings in BERT. Looking forward to the upcoming posts!
0 reply
0 recast
0 reaction