tani
@tanishq
Well, I don't want to keep jumping between different AI models as things progress and companies one-up each other. I wonder if most people would shift to aggregated wrappers that create a pipeline and allow easy switchability between models, like LangChain does, but targeted at consumers instead.
1 reply
0 recast
0 reaction
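
A rough Python sketch of the consumer-facing aggregator tani is describing: one pipeline with hot-swappable model backends, so the user stops jumping between providers. The backend classes and their complete() methods are hypothetical stand-ins, not real SDK calls.

# Hypothetical sketch of a consumer-facing aggregator: one interface,
# swappable model backends, so the user never has to "jump between" providers.
from typing import Protocol

class ModelBackend(Protocol):
    def complete(self, prompt: str) -> str: ...

class OpenAIBackend:            # stand-in; a real version would call the OpenAI SDK
    def complete(self, prompt: str) -> str:
        return f"[gpt answer to: {prompt}]"

class AnthropicBackend:         # stand-in; a real version would call the Anthropic SDK
    def complete(self, prompt: str) -> str:
        return f"[claude answer to: {prompt}]"

class Aggregator:
    def __init__(self, backends: dict[str, ModelBackend], default: str):
        self.backends = backends
        self.active = default

    def switch(self, name: str) -> None:   # one-line model switch for the user
        self.active = name

    def ask(self, prompt: str) -> str:
        return self.backends[self.active].complete(prompt)

app = Aggregator({"gpt": OpenAIBackend(), "claude": AnthropicBackend()}, default="gpt")
app.ask("summarize this article")
app.switch("claude")                        # hop providers without changing the app
app.ask("summarize this article")
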

m_j_r
@m-j-r.eth
100%, and I'd extend this even further. Perhaps the knowledge base is really static/global and outcomes are just recombinations of non-inferential elements, so wrappers could also bargain for lower cost by publicly aggregating as many instances of recombination as possible, rather than internalizing them into the models they chain together for their service.
1 reply
0 recast
0 reaction
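
One way to read that in code: the wrapper keeps a public, provider-agnostic cache of previously seen prompt-to-output "recombinations" and only falls back to a paid model call on a miss, so cost drops as aggregation grows. The shared_cache and canonicalize() below are purely illustrative assumptions, not anything LangChain or a provider actually ships.

# Illustrative only: a public, provider-agnostic cache of prior "recombinations",
# consulted before any paid model call, so aggregated usage drives cost down.
import hashlib

shared_cache: dict[str, str] = {}   # imagine this published/shared across wrapper users

def canonicalize(prompt: str) -> str:
    # collapse superficial differences so equivalent requests hit the same entry
    return hashlib.sha256(" ".join(prompt.lower().split()).encode()).hexdigest()

def answer(prompt: str, model_call) -> str:
    key = canonicalize(prompt)
    if key in shared_cache:
        return shared_cache[key]    # free: reuse a previously aggregated outcome
    result = model_call(prompt)     # paid: only on a cache miss
    shared_cache[key] = result      # publish the new recombination for everyone
    return result

# e.g. answer("Summarize LangChain in one line", app.ask)  # reusing the aggregator above
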

tani
@tanishq
That makes even more sense! It won't work with closed models, though. But at the same time, aggregation across multiple open models might just work?
1 reply
0 recast
0 reaction