gm8xx8
@gm8xx8
want to understand how llms work? ☺️ → llms, explained with a minimum of math and jargon https://www.understandingai.org/p/large-language-models-explained-with
1 reply
0 recast
9 reactions
LIL INTERNET
@lil
Two Q's if you don't mind... 1: Are "word vectors" & "embeddings" the same? Any reason to use the former vs the latter? 2: I read somewhere that word vectors remain largely the same across languages. Does this make word vectors a "universal language"? This would have wild religious / philosophical implications.
1 reply
0 recast
0 reaction
gm8xx8
@gm8xx8
1. yes, "word vectors" and "embeddings" generally refer to the same concept, w/ "embeddings" being a broader term that can also apply to other types of data beyond words.
2 replies
0 recast
0 reaction
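To make the "same concept" answer concrete, here is a minimal sketch of what a word embedding is: a word mapped to a list of numbers, compared by cosine similarity. The words, dimensions, and values below are invented purely for illustration (real models learn hundreds of dimensions).

```python
import math

# Toy 3-dimensional word vectors. The values are made up for illustration;
# a trained model learns them so that related words end up nearby.
embeddings = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.8, 0.9, 0.2],
    "car": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: how aligned two vectors are, ignoring length."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Semantically related words get more similar vectors.
print(cosine(embeddings["cat"], embeddings["dog"]))  # high
print(cosine(embeddings["cat"], embeddings["car"]))  # lower
```

The same machinery works for sentences, images, or code, which is why "embedding" is the broader term.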
gm8xx8
@gm8xx8
2. cross-lingual embeddings link languages but can't fully bridge linguistic & cultural gaps since they represent concepts / relationships mathematically rather than fully capturing the depth & nuance of human languages.
1 reply
0 recast
0 reaction
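A toy sketch of how cross-lingual linking works: if one language's embedding space is (roughly) a rotated copy of another's, a single linear map aligns them, and nearest-neighbor lookup recovers translations. The 2-D vectors and the exact 45° rotation below are hypothetical; real alignment methods (e.g. Procrustes-style fitting) estimate the map from data in hundreds of dimensions.

```python
import math

# Hypothetical 2-D English word vectors.
en = {"dog": [1.0, 0.0], "cat": [0.8, 0.6]}

def rotate(v, theta):
    """2-D rotation -- our stand-in for the geometric offset between
    one language's embedding space and another's."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * v[0] - s * v[1], s * v[0] + c * v[1]]

# Pretend the French space is the English space rotated by 45 degrees.
fr = {"chien": rotate(en["dog"], math.pi / 4),
      "chat":  rotate(en["cat"], math.pi / 4)}

def nearest(v, vocab):
    """Return the word whose vector is closest (Euclidean) to v."""
    return min(vocab, key=lambda w: math.dist(v, vocab[w]))

# Mapping an English vector with the same rotation lands on its
# translation -- the relational "shape" carries over between languages.
print(nearest(rotate(en["dog"], math.pi / 4), fr))  # chien
```

In practice the two spaces are only approximately related by such a map, which is one way the "can't fully bridge the gaps" caveat shows up.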
LIL INTERNET
@lil
Thank you! Re: language, in my head it was something like the "shape" of the relations between all word vectors (imagined as some kind of multidimensional mesh) ends up being very similar. But maybe it's just because at that scale the "shape" is just a model of the human experience of reality.
1 reply
0 recast
0 reaction
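The "shape of the relations" intuition can be made concrete with vector arithmetic: relations between words behave like directions in the space. The 2-D vectors below are invented (dimension 0 loosely "gender", dimension 1 loosely "royalty"), but word2vec-style models learn spaces where this kind of analogy arithmetic approximately holds.

```python
# Hypothetical 2-D word vectors chosen so the analogy works exactly;
# trained models only satisfy it approximately.
vecs = {
    "man":   [1.0, 0.0],
    "woman": [-1.0, 0.0],
    "king":  [1.0, 1.0],
    "queen": [-1.0, 1.0],
}

def add(u, v): return [a + b for a, b in zip(u, v)]
def sub(u, v): return [a - b for a, b in zip(u, v)]

# king - man + woman: remove the "male" direction, add the "female" one.
result = add(sub(vecs["king"], vecs["man"]), vecs["woman"])
print(result)  # [-1.0, 1.0] -- the vector for "queen"
```

That shared relational geometry across languages is exactly the "similar mesh shape" being described above.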