
Thomas
@aviationdoctor.eth
1060 Following
79738 Followers
@vgr makes an astute observation (as usual).
AI is already commoditized because all models are trained on roughly the same dataset of human knowledge; and, therefore, they perform similarly within a segment (broadly: light/free, medium/paid, and advanced/expensive models). It then follows that users will gravitate toward the cheapest or most convenient model within that segment.
Now, if AI is fungible, and AI is set to replace human intelligence at a variety of tasks, does that mean that human intelligence is also fungible?
I’d respond in the affirmative if and only if humans were also trained on the entirety of human knowledge, but that’s obviously not the case. Our lifespans and brain capacity are too limited for that to happen (yet).
So, for now, what differentiates the intelligence of two humans of equal IQ is whatever dataset they trained on — which roughly maps to upbringing, culture, education, and formative experience. 1/2
https://warpcast.com/vgr/0x3053ee4a