gm8xx8
@gm8xx8
want to understand how llms work? ☺️ → llms, explained with a minimum of math and jargon https://www.understandingai.org/p/large-language-models-explained-with
1 reply
0 recast
9 reactions
LIL INTERNET
@lil
Two Q's if you don't mind... 1: Are "word vectors" & "embeddings" the same? Any reason to use the former vs the latter? 2: I read somewhere that word vectors remain largely the same across languages. Does this make word vectors a "universal language?" This would have wild religious / philosophical implications.
1 reply
0 recast
0 reaction
gm8xx8
@gm8xx8
1. yes, "word vectors" and "embeddings" generally refer to the same concept, w/ "embeddings" being a broader term that can also apply to other types of data beyond words.
2 replies
0 recast
0 reaction
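To make the reply above concrete, here is a minimal, self-contained sketch of the idea. The vectors below are toy numbers invented for illustration, not output from any trained model: words with related meanings get vectors pointing in similar directions, and "embedding" is simply the broader term for the same kind of vector applied to sentences, images, or other data.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: near 1.0 = similar direction, near 0.0 = unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "word vectors" (real models use hundreds or thousands of
# dimensions; these values are made up purely to illustrate the geometry).
word_vectors = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.68, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

# Related words end up with vectors that point in similar directions.
print(cosine_similarity(word_vectors["king"], word_vectors["queen"]))  # high, close to 1
print(cosine_similarity(word_vectors["king"], word_vectors["apple"]))  # much lower

# "Embedding" is the broader term: a sentence embedding is just another vector,
# compared the same way. (Again, toy numbers standing in for model output.)
sentence_a = np.array([0.30, 0.70, 0.20, 0.60])   # e.g. "the cat sat on the mat"
sentence_b = np.array([0.28, 0.72, 0.18, 0.63])   # e.g. "a cat is sitting on a rug"
print(cosine_similarity(sentence_a, sentence_b))  # high: similar meaning, similar vector
```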