gakonst pfp
gakonst
@gakonst
apropos of nothing, I don't think I've ever liked any person involved in AI alignment who talks doom / cites linear projections on log charts, "scaling" etc. instead of math about interpretability (e.g. the recent Anthropic paper was excellent)
1 reply
29 recasts
236 reactions

🪱 pfp
🪱
@worm.eth
Is this a moral shortcoming on your part? Should you get over it? Personally, I have no idea
0 reply
0 recast
3 reactions