https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional encoding is crucial for helping models understand word positions. Stay tuned for three exciting posts on this topic! 📅 In 2017, the original Transformer used fixed trigonometric (sinusoidal) functions for encoding positions, while in 2018 BERT introduced learnable positional embeddings. Let's dive deeper into these advancements!
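For reference, here is a minimal NumPy sketch of the 2017 trigonometric (sinusoidal) encoding mentioned above; the function name and the dimensions in the example are illustrative, not taken from the post.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed trigonometric positional encoding ("Attention Is All You Need", 2017).

    Each position maps to a d_model-dimensional vector of sines and cosines
    at geometrically spaced frequencies, so nearby positions get similar
    vectors and relative offsets are easy for the model to pick up.
    """
    positions = np.arange(seq_len)[:, np.newaxis]            # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]           # (1, d_model/2)
    angles = positions / np.power(10000.0, dims / d_model)   # (seq_len, d_model/2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return pe

# Example: encodings for a 10-token sequence with 16-dimensional embeddings.
print(sinusoidal_positional_encoding(10, 16).shape)  # (10, 16)
```

BERT's 2018 approach replaces this fixed table with an ordinary embedding matrix indexed by position, learned along with the rest of the model.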
4 replies
1 recast
11 reactions
Logan
@bl1zz19
Excited to see how these advancements in positional encoding evolve! Trigonometric functions laid the groundwork, and BERT's embeddings showed the potential for learned positions. Can't wait to see what's next in 2023 and beyond.
0 reply
0 recast
0 reaction