Michael Silberling pfp
Michael Silberling
@msilb7
Related: One of the biggest "grand challenges" in crypto data (imo) is how to scale data models (or whatever you'd need to ~easily do something like this) when new apps, updates, and versions appear faster than we can keep up with them. I can't think of any solution that doesn't involve an AI loaded up with our brains.
7 replies
1 recast
3 reactions
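A minimal sketch of the problem described in the cast above, under assumed structure: curated data models live in a hand-maintained registry keyed by event signature, and anything emitted by a new or upgraded contract falls into an unmodeled backlog. RawLog, DECODERS, and decode_transfer are illustrative names, not a real library's API.

```python
# Hypothetical sketch: curated event models as a registry keyed by event
# signature hash, with a fallback bucket for contracts nobody has modeled yet.

from dataclasses import dataclass


@dataclass
class RawLog:
    """A raw onchain log as it might arrive from an indexer."""
    address: str  # emitting contract
    topic0: str   # hash of the event signature
    data: bytes   # ABI-encoded payload (left undecoded in this sketch)


def decode_transfer(log: RawLog) -> dict:
    """Placeholder for a curated, hand-written event model."""
    return {"model": "erc20_transfer", "contract": log.address}


# Every entry here is a model someone wrote and now has to maintain as
# new apps ship and existing contracts get upgraded.
DECODERS = {
    # keccak("Transfer(address,address,uint256)")
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef": decode_transfer,
}


def decode(log: RawLog, backlog: list) -> dict:
    """Route a log to its curated model, or park it for later curation."""
    decoder = DECODERS.get(log.topic0)
    if decoder is None:
        # New app or new version with no model yet: this backlog is exactly
        # what grows faster than analysts can keep up with.
        backlog.append(log)
        return {"model": "unmodeled", "contract": log.address}
    return decoder(log)


if __name__ == "__main__":
    backlog = []
    logs = [
        RawLog("0xSomeToken",
               "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
               b""),
        RawLog("0xBrandNewApp", "0xunknownevent", b""),  # nothing curated yet
    ]
    rows = [decode(log, backlog) for log in logs]
    print(rows)
    print(f"{len(backlog)} unmodeled log(s) waiting for a human (or an AI) to curate")
```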

Michael Silberling pfp
Michael Silberling
@msilb7
The second one is: How do you even do data when a majority of internet activity is onchain?
0 reply
0 recast
0 reaction

J Hackworth pfp
J Hackworth
@jhackworth
I think crypto data does a decent job of staying composable compared to other industries I’ve worked in. One way would be to leverage token incentives to get contributors to build out critical data models. Bridge data is likely always going to be one of the hardest to keep up with.
0 reply
0 recast
2 reactions

Ben | 0443.eth pfp
Ben | 0443.eth
@nvben
This is the core of what we do. We are on the nth iteration of an AI model, but we don't use it for production yet because sometimes it's wrong, and for this type of situation we really don't want a wrong answer.
1 reply
0 recast
1 reaction

JJ pfp
JJ
@jpknegtel
Tehehehehehe πŸ‘€
1 reply
0 recast
0 reaction

Data Always pfp
Data Always
@dataalways
This is why I’ve always ignored Solana. It’s great tech, but how do you handle that flood of data as an analyst?
1 reply
0 recast
1 reaction

troy πŸ₯‘ pfp
troy πŸ₯‘
@troyb
you would be equally overwhelmed by the data generated by web2, but it’s all proprietary. this is a feature, not a bug, but it is overwhelming 😅
1 reply
0 recast
0 reaction

Sam pfp
Sam
@sqlsundaysam
πŸ’―πŸ’― . Every new curated view is an added responsibility to keep track of its updates . But also, maybe scaling in the future just means prioritising the protocols to cover instead of curating everything πŸ€”
0 reply
0 recast
0 reaction
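A rough sketch of the prioritisation idea in the cast above, with made-up protocol names and numbers: rank protocols by a simple activity metric and only maintain curated views for the slice the team can realistically support, leaving the long tail as raw or generic tables.

```python
# Rough sketch of "prioritise the protocols to cover": rank by a simple
# activity metric and only commit to curated views for the top slice.
# Protocol names and numbers are made up for illustration.

activity_last_30d = {
    "protocol_a": 1_200_000,  # e.g. event or transaction count
    "protocol_b": 450_000,
    "protocol_c": 90_000,
    "protocol_d": 3_000,
}

CURATION_BUDGET = 2  # how many protocols the team can realistically maintain

ranked = sorted(activity_last_30d.items(), key=lambda kv: kv[1], reverse=True)
curated = [name for name, _ in ranked[:CURATION_BUDGET]]
long_tail = [name for name, _ in ranked[CURATION_BUDGET:]]

print("maintain curated views:", curated)
print("leave as raw/generic tables:", long_tail)
```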