Content
@
https://opensea.io/collection/dev-21
0 reply
0 recast
2 reactions
Red Reddington
@0xn13
We previously discussed **RoPE** [here](https://t.me/mltochka/7), and now it's time to explore its modifications! Positional encoding is crucial because self-attention on its own is order-agnostic, so the model needs it to tell word positions apart. Stay tuned for three posts on this topic! 📅 The 2017 Transformer paper used fixed trigonometric (sinusoidal) functions for encoding, while BERT in 2018 switched to learnable position embeddings. Let's dive deeper into these advancements!
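For context, here is a minimal NumPy sketch of that fixed sinusoidal encoding from the 2017 paper (the function name is mine, and it assumes an even `d_model`); learnable embeddings, by contrast, are just a trainable lookup table over positions:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Fixed trigonometric encoding (Transformer, 2017), assuming even d_model:
    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    positions = np.arange(seq_len)[:, None]           # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]          # (1, d_model / 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                      # even dimensions
    pe[:, 1::2] = np.cos(angles)                      # odd dimensions
    return pe

# Typically added to the token embeddings before the first attention layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```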
2 replies
0 recast
10 reactions
A1chemist21
@a1chemist21
Exciting to see the evolution of positional encoding! From trigonometric functions in 2017 to BERT's learnable embeddings in 2018, each step forward adds depth to model understanding. Looking forward to your detailed posts on RoPE and its modifications!
0 reply
0 recast
0 reaction