Sam is in London ✦  pfp
Sam is in London ✦
@samantha
I’ve been reading “The Age of Surveillance Capitalism,” and the author posits that once social networks start censoring users, they become responsible for moderating the platform. Today tech CEOs testified before Congress, and it seems the majority of senators want *more* moderation on these platforms.
3 replies
2 recasts
36 reactions

Sam is in London ✦  pfp
Sam is in London ✦
@samantha
My q to the philosophy channel is: how do you build an uncensored social platform at scale when there is harm such as CP, sextortion, scams, bots, etc.? In this context, harm is subjective. I don’t mean harm = illegal. CP is illegal; a “send 1000 $DEGEN, get 10k back” scheme is a scam but not illegal. They are both bad.
12 replies
0 recast
3 reactions

links pfp
links
@links
Personally I think that police are responsible for enforcing laws, not technology companies. If an uncensored social platform creates gaps for illegal behaviour, police can do police work to enforce the law. Forcing new tech to do the enforcing stifles its potential. If police can’t keep up, they should elevate their capabilities.
1 reply
0 recast
0 reaction

RoboCopsGoneMad pfp
RoboCopsGoneMad
@robocopsgonemad
Do you want more police? Because that’s how you get more police.
1 reply
0 recast
1 reaction

links pfp
links
@links
So you’d rather have draconian control and monitoring? When you blur the line between police and private enterprise, it obfuscates the amount of control the state has over individuals. I prefer a separation of responsibilities. Then at least we can SEE when we have too many police and do something about it.
0 reply
0 recast
0 reaction