https://warpcast.com/~/channel/rish

rish pfp
rish
@rish
1/2 Some thoughts on tokenization: It's not that hard to imagine that *all* information has _some_ value. 99.9% of it is negligible and no one cares; the remaining 0.1% captures a full spectrum of value. We know this already because this is what IP laws are based on. We already think some information is more "valuable" than other information, and we have established laws to credibly argue about it.
2 replies
1 recast
17 reactions

rish pfp
rish
@rish
2/2 Now what this value is, is determined in different ways today - all the way from open markets to individuals setting prices arbitrarily. Seems like crypto points this price discovery strongly in the direction of open markets. The risk here obv is that novices assume most information has a higher value than it actually does, trade in, and find out the hard way that it was a snipe. Presumably everyone will learn that this is the case, similar to how everyone learned about internet scams? We know now that not all bits of information on the internet can be trusted, and similarly, not all markets that show a rising chart will actually result in long-term value creation. If all of this is true, then we should tokenize _everything_ just so we know what the value of that bit of information is. We will obviously find that most of it has no value, coupled with really illiquid markets, but that seems fine? That's the status quo anyway. Without open markets, most information doesn't have any value. What am I missing?
4 replies
0 recast
6 reactions

rish pfp
rish
@rish
3/2 In the above proposition, there's no real separation between NFTs and ERC20s ("contentcoins"); those are just file formats. The thought is about _tokenizing_ at a higher level, the same as digitizing previously non-digital items.
2 replies
0 recast
2 reactions

jamesyoung.eth pfp
jamesyoung.eth
@jamesyoung
Tokens aren't just about tokenizing content; they're for aligning people/agents around shared goals. Content is the initial hook; lasting value emerges when tokens reduce coordination friction. The real innovation is ad-hoc cooperation.
1 reply
0 recast
0 reaction

Sher Chaudhary pfp
Sher Chaudhary
@sher
yeah, ++ all the useless tokens (at least in the aggregate) form their own kind of uncertainty reduction about which tokens are useful or not, which is itself valuable information to everyone in the space. we learn a lot from failed experiments, too
0 reply
0 recast
1 reaction