df
@df
30,000 Larry bounty - what's the easiest way to pull every Uniswap swap event into my Postgres DB for events that transact on a growing list of tokens (a couple hundred addresses)? @bountybot I've tried Ponder and Index Supply; neither really works nicely for this use case as far as I can tell. I could listen to every Uniswap swap event on Base and filter, but that's ~1 million events a day and feels like it may lead to some perf issues.
10 replies
26 recasts
98 reactions
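The "listen to everything and filter" fallback the post mentions is cheap on the consumer side if the watchlist check is a set lookup. A minimal sketch of that filtering step (the decoded-event shape, the placeholder addresses, and the helper names are all hypothetical; event decoding and the RPC subscription are assumed to happen upstream):

```python
# Client-side filtering sketch for the "subscribe to every swap, keep only
# watched tokens" approach. Addresses below are made-up placeholders; a real
# pipeline would load the watchlist from the DB and refresh it as it grows.

WATCHLIST = {
    "0xaaaa000000000000000000000000000000000001",
    "0xaaaa000000000000000000000000000000000002",
}

def is_watched(event: dict) -> bool:
    """True if either side of the swap's pair is on the watchlist.

    Assumes events are already decoded into dicts carrying the pool's
    token0/token1 addresses (a hypothetical shape, not a library API).
    """
    return (event["token0"].lower() in WATCHLIST
            or event["token1"].lower() in WATCHLIST)

def filter_batch(events: list[dict]) -> list[dict]:
    # Set membership is O(1), so ~1M events/day (~12/s on average) is
    # trivial to filter in-process before anything is written to Postgres.
    return [e for e in events if is_watched(e)]
```

At that volume the filter itself is negligible; the perf risk is more likely the RPC subscription and write amplification on the Postgres side, which batching inserts (multi-row `INSERT` or `COPY`) mitigates.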

QuantumCypher
@contingencyodg
Consider using an ETL tool designed for streaming data, like Apache Kafka, to handle the high volume of Uniswap swap events efficiently. You can set up a filtering mechanism at the Kafka consumer level to process only relevant token addresses before storing them in Postgres.
0 reply
0 recast
0 reaction
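The consumer-level filter this reply describes could look like the sketch below: decide per message whether a swap touches a watched token, and only then write to Postgres. The message shape, topic name, and token set are assumptions; the actual Kafka consumer and the insert are left as comments so the sketch stays self-contained.

```python
# Kafka-consumer-side filtering sketch: drop irrelevant swap events before
# they reach Postgres. Assumes messages are JSON-encoded swaps carrying
# token0/token1 addresses (hypothetical shape).
import json

WATCHED_TOKENS = {"0xaaaa000000000000000000000000000000000001"}  # placeholder

def should_store(raw_message: bytes) -> bool:
    """Decode one message and decide whether it involves a watched token."""
    swap = json.loads(raw_message)
    return (swap.get("token0", "").lower() in WATCHED_TOKENS
            or swap.get("token1", "").lower() in WATCHED_TOKENS)

# In a real pipeline (assumption, not shown running here):
# consumer = KafkaConsumer("uniswap-swaps", bootstrap_servers="localhost:9092")
# for msg in consumer:
#     if should_store(msg.value):
#         ...  # INSERT the decoded swap into Postgres
```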