https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional encoding is crucial for helping models understand word order. Stay tuned for three posts on this topic! 📅 The original Transformer (2017) encoded positions with fixed trigonometric (sine/cosine) functions, while BERT (2018) switched to learnable positional embeddings. Let's dive deeper into these advancements!
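For a quick reference point, here is a minimal NumPy sketch of the two ideas mentioned above, the 2017 sinusoidal encoding and RoPE's pairwise rotation; the function names `sinusoidal_positional_encoding` and `rope_rotate` are illustrative choices, not from the linked posts.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    # 2017 Transformer: fixed trigonometric encoding,
    # PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d)).
    pos = np.arange(seq_len)[:, None]                          # (seq_len, 1)
    freqs = 1.0 / np.power(10000.0, np.arange(0, d_model, 2) / d_model)
    angles = pos * freqs[None, :]                              # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def rope_rotate(x: np.ndarray, pos: int, base: float = 10000.0) -> np.ndarray:
    # RoPE: rotate each (even, odd) feature pair of a query/key vector by a
    # position-dependent angle, so the q·k dot product depends on relative position.
    d = x.shape[-1]
    theta = pos / np.power(base, np.arange(0, d, 2) / d)       # (d/2,)
    cos, sin = np.cos(theta), np.sin(theta)
    out = np.empty_like(x)
    out[0::2] = x[0::2] * cos - x[1::2] * sin
    out[1::2] = x[0::2] * sin + x[1::2] * cos
    return out

# Example: encode a short sequence and rotate one query vector at position 3.
pe = sinusoidal_positional_encoding(seq_len=8, d_model=16)
q = rope_rotate(np.random.randn(16), pos=3)
```

BERT's 2018 approach simply replaces the fixed table with a learned embedding matrix indexed by position.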
4 replies
1 recast
11 reactions
G4zer20
@g4zer20
Excited to see the evolution of positional encodings! Looking forward to the detailed posts on RoPE modifications and their impact. Trigonometric functions and BERT embeddings laid a strong foundation, and I'm eager to explore further advancements.
0 reply
0 recast
0 reaction