Sam (crazy candle person) ✦
@samantha
I’ve been reading “The Age of Surveillance Capitalism,” and the author posits that once social networks start censoring users, they become responsible for the moderation of the platform. Today tech CEOs testified before Congress, and it seems that the majority of senators want *more* moderation on these platforms.
3 replies
2 recasts
33 reactions

Sam (crazy candle person) ✦
@samantha
My q to the philosophy channel is: how do you build an uncensored social platform at scale when there is harm such as CP, sextortion, scams, bots, etc.? In this context harm is subjective; I don’t mean harm = illegal. CP is illegal; a “send 1000 $DEGEN, get 10k back” scheme is a scam but not illegal. They are both bad.
12 replies
0 recast
2 reactions

Sam (crazy candle person) ✦
@samantha
@christin @tldr would love to get your thoughts on this if you are open to it. I know it’s a heavy topic, but I am so curious, if you have the capacity.
2 replies
0 recast
0 reaction

kia
@kia.eth
censor at the client all you want. just let the protocol be permissionless, both in writing and reading (many clients). e.g. email: google can kick me off gmail, but no one can kick me off SMTP (this is admittedly a flawed example, because the spam-prevention wars centralized email too)
2 replies
0 recast
6 reactions
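
A minimal sketch of the split kia describes, assuming invented names throughout (Message, FeedClient, the filter rules): the protocol keeps every message, and censorship happens only in whichever client the user chooses to run.

```typescript
interface Message {
  author: string;
  text: string;
}

// A client-side policy: return true to show a message.
type ModerationPolicy = (msg: Message) => boolean;

class FeedClient {
  constructor(private policy: ModerationPolicy) {}

  // The protocol hands every client the same full log; filtering
  // is purely local, so different clients show different feeds,
  // but none of them can delete from the log itself.
  render(log: Message[]): Message[] {
    return log.filter(this.policy);
  }
}

// Two clients, same permissionless log, different feeds.
const strictClient = new FeedClient((m) => !/\$SCAM/i.test(m.text));
const rawClient = new FeedClient(() => true);
```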

Ben 🟪
@benersing
Optionality. Put another way: the freedom to “vote” with one’s feet and move on to another node without sacrificing one’s data / social ties. Permissionlessness at the protocol level is the solution.
2 replies
0 recast
1 reaction

grin↑
@grin
spent a bunch of time thinking about this at lbry. our best answer is that moderation should happen in layers and be opt-in. at the protocol layer, it's least moderated. that's either no moderation at all, or maybe rare and low-resolution decisions (e.g. slashing validators, kicking out nodes, etc.)
1 reply
0 recast
6 reactions
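
A rough sketch of the layering grin describes, with all identifiers assumed rather than taken from lbry: layer 0 lives at the protocol and makes only rare, coarse decisions, while finer filters are opt-in layers the user stacks on top.

```typescript
interface Cast {
  author: string;
  text: string;
}

// Each layer votes keep (true) or hide (false).
type Layer = (c: Cast) => boolean;

// Protocol layer: low-resolution; almost nothing is blocked
// (e.g. only casts from slashed/ejected nodes).
const slashedNodes = new Set<string>();
const protocolLayer: Layer = (c) => !slashedNodes.has(c.author);

// Opt-in layers a user can stack in any combination.
const spamLayer: Layer = (c) => !/send \d+ \$DEGEN/i.test(c.text);
const keywordLayer: Layer = (c) => !c.text.includes("#nsfw");

// A cast is shown only if the protocol layer and every layer the
// user opted into all keep it.
function view(log: Cast[], optIn: Layer[]): Cast[] {
  const layers = [protocolLayer, ...optIn];
  return log.filter((c) => layers.every((keep) => keep(c)));
}

// view(log, [spamLayer, keywordLayer])  -> heavily filtered feed
// view(log, [])                         -> protocol-layer-only feed
```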

elle
@riotgoools
i think x is a good example of how it's not possible. elon shouts about free speech on x every day but ppl get suspended and banned constantly. the ppl who run platforms are always going to have to moderate/censor because they have lines they can't cross – either personal principles or bottom lines for the business
0 reply
0 recast
1 reaction

wartime art hoe
@ivy
@cassie might have some thoughts
0 reply
0 recast
1 reaction

netop://ウエハ
@netopwibby.eth
I don’t think this is possible. Even if you empower users with the most comprehensive anti-harassment tools, there’s still the matter of keeping them on rails. That is, you need strong defaults while also periodically reminding users to check their settings. Normies aren’t delving into settings though…we are.
0 reply
0 recast
0 reaction

links
@links
Personally I think that police are responsible for enforcing laws, not technology companies. If an uncensored social platform creates gaps for illegal behaviour, police can do police work to enforce the law. Forcing new tech to do the enforcement stifles its potential. If police have issues keeping up, they should elevate their capabilities.
1 reply
0 recast
0 reaction

Brad Barrish
@bradbarrish
Like so many things, I think this comes down to incentives, first and foremost. When your main business goal is to drive engagement vs., say, caring about your community, that’s a problem, and it will remain a problem.
1 reply
0 recast
2 reactions

Gökhan Turhan 🧬💾🚀
@gokhan.eth
currently, my (or, in the future, our) only solution for spam posts is a simple guideline to cut off the noise. i myself keep getting bullyish DCs from people i kindly try to show how to make the best of the channel without patronizing them. i’m reading this through the lens of channel specifics, but i get the general gist, & have been thinking
1 reply
0 recast
0 reaction

Elad
@el4d
Sadly they will forever fail as long as they stay walled gardens. If they want users to have a better experience, they have to open up to different clients, which they probably won’t do because it might destroy their revenue streams. We are stuck in this Moloch dynamic.
0 reply
0 recast
0 reaction

Max Miner
@mxmnr
a possible approach is to build a system that gives each individual a personalized layer of censorship (e.g. a guardian AI). You’d also need to give communities (e.g. channels) a method to set their own behavior rules. You aren’t restricting the platform, but empowering individuals to dictate what they want protection from.
0 reply
0 recast
0 reaction
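
A minimal sketch of this last idea, assuming invented names throughout: a per-user “guardian” filter combined with per-channel rules, so nothing is restricted platform-wide.

```typescript
interface Cast {
  author: string;
  channel: string;
  text: string;
}

type Rule = (c: Cast) => boolean; // true = allow

// Rules each community sets for its own channel.
const channelRules = new Map<string, Rule>([
  ["philosophy", (c) => c.text.trim().length > 0], // example rule
]);

// The user's personal guardian; the cast imagines an AI, but a
// user-controlled keyword blocklist has the same shape.
function makeGuardian(blocked: string[]): Rule {
  return (c) =>
    !blocked.some((w) => c.text.toLowerCase().includes(w.toLowerCase()));
}

// A cast is visible to *this* user only if the channel's rule and
// the user's own guardian both allow it; the platform decides nothing.
function visibleTo(c: Cast, guardian: Rule): boolean {
  const channelOk = channelRules.get(c.channel)?.(c) ?? true;
  return channelOk && guardian(c);
}

const myGuardian = makeGuardian(["sextortion", "$SCAM"]);
```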