borodutch pfp
borodutch
@warpcastadmin.eth
gm, so llms mimic human consciousness by acting almost like it (i'd say almost imperceptibly accurate) now, there is a research paper that shows that llms work better if we use emotional words in prompts. what if that human thing called "feelings" and "empathy" is just an artifact of an llm we use internally?
4 replies
0 recast
6 reactions

borodutch pfp
borodutch
@warpcastadmin.eth
so feelings are not a feature of consciousness that makes us human but a property of a tool that we use to organize knowledge, like llms. which opens up a whole range of questions if true: can unconscious beings feel? are feelings somehow inherent to how the universe is structured? what are feelings if they are "unalive"?
2 replies
0 recast
1 reaction

Thomas pfp
Thomas
@aviationdoctor.eth
Can you link to the research paper? Prima facie, it’s not surprising, considering that LLMs are trained on texts written by humans, and humans often express themselves using emotionally loaded words/tokens.
1 reply
0 recast
1 reaction

Luis pfp
Luis
@eon
That's exactly what it is
0 reply
0 recast
1 reaction

Ben pfp
Ben
@benersing
I think that's probably right. There's lots of literature about the evolution of emotions/feelings in humans and their benefit to social formation, trust building, and collaboration. Will see what I can dig up.
1 reply
0 recast
1 reaction