df
@df
best practices for caching an external API's list of results into a Postgres DB with a results table? Doing updates + inserts: upsert, COPY, or something else? Is there a no-code tool or service that does this, perhaps?
9 replies
1 recast
17 reactions
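On the "upsert or COPY" part of the question: for a cache that gets refreshed, `INSERT ... ON CONFLICT DO UPDATE` is the usual answer (COPY is bulk-load only and won't update existing rows). A minimal sketch, using Python's built-in sqlite3 as a stand-in since it accepts the same `ON CONFLICT` syntax as Postgres; with Postgres you'd run the same SQL through psycopg, and the `results` table and its columns here are made-up examples:

```python
import sqlite3

# Hypothetical "results" table -- column names are assumptions.
# sqlite3 stands in for Postgres; both accept this upsert syntax.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE results (
        id INTEGER PRIMARY KEY,
        name TEXT,
        score REAL,
        updated_at TEXT
    )
""")

UPSERT = """
    INSERT INTO results (id, name, score, updated_at)
    VALUES (?, ?, ?, ?)
    ON CONFLICT (id) DO UPDATE SET
        name = excluded.name,
        score = excluded.score,
        updated_at = excluded.updated_at
"""

conn.execute(UPSERT, (1, "alpha", 1.0, "2024-01-01"))
conn.execute(UPSERT, (1, "alpha", 2.5, "2024-01-02"))  # same id: updates in place

rows = conn.execute("SELECT id, score FROM results").fetchall()
print(rows)  # one row, carrying the latest score
```

The `excluded` pseudo-table refers to the row that failed to insert, so re-running the sync never creates duplicates.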
df
@df
want to take paginated api and cache it all in a queryable db
1 reply
0 recast
0 reaction
Greg
@greg
so an indexer? Run a cron job that loops through all the pages and upserts. It can be fairly lightweight if it's just one or two endpoints and a reasonable number of pages. I don't think a no-code tool would work because it'd have to be super customizable
1 reply
0 recast
1 reaction
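Greg's loop-and-upsert cron job could be sketched roughly like this. Everything here is illustrative: `fetch_page` stands in for the real API call (e.g. a `requests.get` with a page parameter), the `items` table is hypothetical, and sqlite3 stands in for Postgres:

```python
import sqlite3

# Fake paginated API -- a stand-in for the real endpoint.
PAGES = [
    [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}],
    [{"id": 3, "name": "c"}],
]

def fetch_page(page):
    # real version: requests.get(f"{API_URL}?page={page}").json()
    return PAGES[page] if page < len(PAGES) else []

def sync(conn):
    """Walk every page and upsert each row; safe to re-run on a schedule."""
    page = 0
    while True:
        rows = fetch_page(page)
        if not rows:  # empty page -> reached the end
            break
        conn.executemany(
            "INSERT INTO items (id, name) VALUES (:id, :name) "
            "ON CONFLICT (id) DO UPDATE SET name = excluded.name",
            rows,
        )
        page += 1
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
sync(conn)  # first run inserts everything
sync(conn)  # second run is a no-op thanks to the upsert
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 3
```

Because the upsert is idempotent, the cron schedule can be aggressive without risking duplicate rows.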
Brock
@runninyeti.eth
lol this is also where my head went ^

tl;dr
- Create a db with a primary key and an index on a time / order based field
- Have a scheduled function poll the API back to the latest db record (e.g. "the last day of data")
- Upsert new data into the db, relying on the primary key to de-dup

Can start as a 1-file script
0 reply
0 recast
0 reaction
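The incremental variant above (poll back only to the newest record already stored, let the primary key de-dup the overlap) could start as a 1-file script along these lines. `ROWS`, `fetch_since`, and the `results` schema are all stand-ins for the real API and table:

```python
import sqlite3

# Fake API data, newest entries appended over time -- a stand-in.
ROWS = [
    {"id": 1, "ts": "2024-01-01"},
    {"id": 2, "ts": "2024-01-02"},
    {"id": 3, "ts": "2024-01-03"},
]

def fetch_since(ts):
    # real version: page through the API newest-first, stopping at ts
    return [r for r in ROWS if ts is None or r["ts"] >= ts]

def poll(conn):
    """Fetch back to the latest stored record; the pk de-dups the overlap."""
    latest = conn.execute("SELECT MAX(ts) FROM results").fetchone()[0]
    for r in fetch_since(latest):
        conn.execute(
            "INSERT INTO results (id, ts) VALUES (:id, :ts) "
            "ON CONFLICT (id) DO UPDATE SET ts = excluded.ts",
            r,
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, ts TEXT)")
conn.execute("CREATE INDEX idx_results_ts ON results (ts)")  # time-based index
poll(conn)  # initial backfill
ROWS.append({"id": 4, "ts": "2024-01-04"})
poll(conn)  # refetches only the overlap; upsert absorbs the duplicate
print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0])  # 4
```

Fetching a small overlap (e.g. "the last day of data") instead of exactly from the high-water mark guards against late-arriving or updated records, and the upsert makes the overlap harmless.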