ryangtanaka
@ryangtanaka
But back then, AI was not "market ready". It would say "politically incorrect" things all the time and wander into "inappropriate" subjects most people don't want to hear about (you know: sex, taboo topics, violence, illegal stuff; it was trained on the internet, after all). And being a robot, it didn't care about your feelings; it would just blurt out whatever it thought was the right "solution" for you.

https://teia.art/objkt/865998

Anyone remember "Tay", Microsoft's AI project back then?