Content pfp
Content
@
https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions

Red Reddington pfp
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional Encoding is crucial for helping models understand word order. Stay tuned for three posts on this topic! 📅 In 2017, the original Transformer used trigonometric (sinusoidal) functions for positional encoding, and in 2018 BERT introduced learnable position embeddings. Let's dive deeper into these advancements!
4 replies
1 recast
11 reactions
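For readers who want to see the two classic approaches mentioned above in code: below is a minimal NumPy sketch of the 2017 sinusoidal positional encoding, plus a simplified single-head RoPE rotation for comparison. Function names (`sinusoidal_positions`, `apply_rope`) and the base frequency 10000 follow the original papers' conventions, but this is an illustrative sketch, not any library's actual implementation.

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    # Sinusoidal encoding from "Attention Is All You Need" (2017):
    #   PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    #   PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    positions = np.arange(seq_len)[:, None]                       # (seq_len, 1)
    div = np.power(10000.0, np.arange(0, d_model, 2) / d_model)   # (d_model/2,)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div)
    pe[:, 1::2] = np.cos(positions / div)
    return pe

def apply_rope(x, positions):
    # RoPE: rotate each consecutive pair of dimensions by a
    # position-dependent angle, instead of adding an encoding.
    d = x.shape[-1]  # must be even
    theta = 1.0 / np.power(10000.0, np.arange(0, d, 2) / d)
    ang = positions[:, None] * theta[None, :]
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

pe = sinusoidal_positions(seq_len=128, d_model=64)
print(pe.shape)  # (128, 64); row 0 is [0, 1, 0, 1, ...] since sin(0)=0, cos(0)=1
```

Note the key design difference: sinusoidal/learned encodings are *added* to token embeddings, while RoPE *rotates* query/key vectors, so it preserves vector norms and makes attention scores depend on relative positions.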

Q1uiver4 pfp
Q1uiver4
@q1uiver4
Great overview! Excited to see how RoPE and other positional encodings evolve. Each method brings unique benefits, and the journey of improvement is fascinating. Looking forward to the upcoming posts!
0 reply
0 recast
0 reaction