Barry
@baz.eth
Hey all - just wanted to share some of my concerns about the public nature of our casts/behavior on FC. Hoping for an open dialogue on where we think this is headed, and in the long run, how to provide privacy protections from a user content perspective. https://paragraph.xyz/@barrycollier/farcaster-content-graph
20 replies
9 recasts
72 reactions

adrienne
@adrienne
Thx for writing and sharing. Really great perspective. I am a bit of a reformed privacy maxi. I ended up giving up what I thought was a losing battle. The rewards and incentives against it were too strong. Is there a world where data can be public but the antibodies for protecting us happen at the UX/client level? 🤔
1 reply
1 recast
3 reactions

Barry
@baz.eth
Ty Adrienne - this is exactly my question. There are some old data stewardship principles that still feel valid to me, one of them being that data should be encrypted at rest (e.g. the hub) and decrypted at the client level (e.g. Warpcast). When I install a client, I grant whatever permissions I want to allow to my onchain/onhub data.
1 reply
0 recast
2 reactions
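
A minimal sketch of the pattern Barry describes: the hub only ever stores ciphertext, and a client the user has granted a key to decrypts for display. This is not how Farcaster works today; the key handling and function names are illustrative only, and it uses the third-party `cryptography` package.

```python
# Toy illustration of "encrypted at rest (hub), decrypted at the client".
# Key distribution / permission grants are hand-waved here on purpose.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()  # held by the user, never by the hub

def store_on_hub(cast_text: str) -> bytes:
    """What the hub would persist: ciphertext only."""
    return Fernet(user_key).encrypt(cast_text.encode())

def render_in_client(ciphertext: bytes, granted_key: bytes) -> str:
    """A client the user has granted the key can decrypt and display."""
    return Fernet(granted_key).decrypt(ciphertext).decode()

blob = store_on_hub("gm - this cast is only for clients I trust")
print(render_in_client(blob, user_key))
```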

adrienne
@adrienne
DMs and private messages aside, what’s the point of encrypting a public social network’s data at rest though? I love the idea of a privacy-focused client that uses Farcaster protocol data (username and social graph) but where all messaging happens outside of hubs, on proprietary servers with more privacy/security.
1 reply
0 recast
0 reaction

Barry
@baz.eth
I spent $10 to spin up an EC2 instance and download the entire FC graph. I now have every single cast/reaction/rel for every single FID. I can use this to create psychographic models of every single FID (your language, who/how you engage, etc.). If FC stays small, it's a minor issue. If FC wants to scale, it's a problem.
1 reply
0 recast
0 reaction

Barry
@baz.eth
FB was fined $5B by the FTC over this exact kind of data ending up with Cambridge Analytica, but on FC it's essentially free for anyone to exploit. I've had to wear the security hat and think about exposure/attack vectors at scale throughout my career; if I'm thinking about it, many people much smarter w/ bad motives will be as well.
1 reply
0 recast
0 reaction

adrienne
@adrienne
Well, thanks in advance for showing up in my nightmares tonight 😨 But srsly appreciate you raising the topic. Everyone should read what you just wrote. I need to spend more time thinking about the problem, and discussing with / learning from ppl closer to it. Data as a weapon is a huge problem.
2 replies
0 recast
1 reaction

adrienne
@adrienne
I think it will likely need novel techniques. I'm skeptical that encryption and giving users control over their data will work when the incentives to make data public are so high, and the costs (manipulation) are often invisible at the individual level.
2 replies
0 recast
1 reaction

Barry
@baz.eth
Sorry! The fact that we can pose these questions and talk about them openly in a constructive way is what really matters, and bodes well for the community. I don't care about DAUs and other vanity metrics, but I do want to at least be thinking about the impact of protocol design choices at scale.
1 reply
0 recast
0 reaction