df pfp
df
@df
best practices for caching an external API's list of results to a postgres DB with a table of results? Doing updates + inserts: upsert, COPY, or something else? Is there a no-code tool or service that does this perhaps?
9 replies
1 recast
17 reactions
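
For the upsert route: COPY is built for bulk-loading new rows, while INSERT ... ON CONFLICT handles updates + inserts in one statement. A minimal sketch of the latter with the `pg` client, assuming a hypothetical `results` table keyed by `id` (all names here are made up):

```ts
import { Pool } from "pg";

// Assumed schema: results(id text primary key, payload jsonb, updated_at timestamptz)
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Upsert each API result: insert new rows, overwrite existing ones by primary key.
async function upsertResults(rows: { id: string; payload: unknown }[]) {
  for (const row of rows) {
    await pool.query(
      `INSERT INTO results (id, payload, updated_at)
       VALUES ($1, $2, now())
       ON CONFLICT (id) DO UPDATE
         SET payload = EXCLUDED.payload,
             updated_at = now()`,
      [row.id, JSON.stringify(row.payload)]
    );
  }
}
```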

df pfp
df
@df
want to take a paginated API and cache it all in a queryable db
1 reply
0 recast
0 reaction
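
One way that loop could look, as a sketch: walk the pages until the cursor runs out, upserting each batch with the `upsertResults` sketch above. The `nextPage` cursor shape is an assumption, not any particular API's format.

```ts
// Hypothetical API: GET /items?page=N returns { items: [...], nextPage: number | null }
async function syncAllPages(baseUrl: string) {
  let page: number | null = 1;
  while (page !== null) {
    const res = await fetch(`${baseUrl}/items?page=${page}`);
    if (!res.ok) throw new Error(`API returned ${res.status}`);
    const body = await res.json();
    await upsertResults(body.items); // upsert sketch from above
    page = body.nextPage; // null signals the last page
  }
}
```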

Joe Petrich 🟪 pfp
Joe Petrich 🟪
@jpetrich
It matters how much data this is, how often you'll be updating it, and how much you expect the data to change. I think in non-extreme cases you can write a single statement to upsert the new data every time, with an updated_at column, and when you query, ignore stale data by filtering on that update time.
1 reply
0 recast
1 reaction
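
The read side of this suggestion, sketched against the same hypothetical `results` table: treat anything older than a freshness window as stale and skip it.

```ts
// Return only rows touched within the last maxAgeMinutes; older rows are stale.
async function getFreshResults(maxAgeMinutes: number) {
  const { rows } = await pool.query(
    `SELECT id, payload
       FROM results
      WHERE updated_at > now() - make_interval(mins => $1)`,
    [maxAgeMinutes]
  );
  return rows;
}
```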

Marcus Lee pfp
Marcus Lee
@marcustut
The simplest solution is just to use an in-memory cache in your backend, and only once that isn't feasible look into something like redis, etc.
0 reply
0 recast
1 reaction
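
A dependency-free version of that, as a sketch: a plain `Map` with lazy TTL expiry wrapping any async fetcher (names and TTLs here are illustrative).

```ts
// Entries expire lazily: a stale hit is ignored and refetched on the next read.
const cache = new Map<string, { value: unknown; expiresAt: number }>();

async function cached<T>(key: string, ttlMs: number, fetcher: () => Promise<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value as T;
  const value = await fetcher();
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Usage: const items = await cached("items", 60_000, () => fetch(url).then(r => r.json()));
```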

jtgi pfp
jtgi
@jtgi
I store in memory with node-cache until I can't. It gets flushed, isn't distributed, etc, but for most use cases it's fine, free and fast. If you start getting thundering herds, need locks/scale etc, then redis/elasticache is ez.
0 reply
0 recast
1 reaction
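
A sketch of that pattern with the node-cache package jtgi names; using the URL as the cache key is an assumption.

```ts
import NodeCache from "node-cache";

// stdTTL is the default time-to-live, in seconds, for every entry.
const cache = new NodeCache({ stdTTL: 60 });

async function getItems(url: string) {
  const hit = cache.get<unknown>(url);
  if (hit !== undefined) return hit; // cache hit, skip the API call
  const data = await fetch(url).then((r) => r.json());
  cache.set(url, data);
  return data;
}
```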

Manan pfp
Manan
@manan
Ask chatgpt to write you a function that wraps the API call with a redis cache
1 reply
0 recast
3 reactions
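
Roughly what that wrapper comes out to with node-redis v4, as a sketch; the key scheme and TTL are assumptions.

```ts
import { createClient } from "redis";

const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

// Serve from Redis when the key exists; otherwise fetch, cache with a TTL, return.
async function cachedFetch(url: string, ttlSeconds = 60) {
  const hit = await redis.get(url);
  if (hit !== null) return JSON.parse(hit);
  const data = await fetch(url).then((r) => r.json());
  await redis.set(url, JSON.stringify(data), { EX: ttlSeconds });
  return data;
}
```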

CV pfp
CV
@c-v
What’s your infra like?
0 reply
0 recast
0 reaction

CV pfp
CV
@c-v
Could be a nice case for Redis or SQLite
0 reply
0 recast
0 reaction
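
For the SQLite option: the same upsert pattern works in a single file with no server to run, e.g. via better-sqlite3. A sketch with made-up table and column names:

```ts
import Database from "better-sqlite3";

// Single-file cache; SQLite supports the same ON CONFLICT upsert as Postgres.
const db = new Database("cache.db");
db.exec(`CREATE TABLE IF NOT EXISTS results (
  id TEXT PRIMARY KEY,
  payload TEXT,
  updated_at INTEGER
)`);

const upsert = db.prepare(`
  INSERT INTO results (id, payload, updated_at)
  VALUES (?, ?, strftime('%s','now'))
  ON CONFLICT(id) DO UPDATE
    SET payload = excluded.payload, updated_at = strftime('%s','now')
`);

function cacheRows(rows: { id: string; payload: unknown }[]) {
  for (const row of rows) upsert.run(row.id, JSON.stringify(row.payload));
}
```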

agusti pfp
agusti
@bleu.eth
clickhouse? metabase? are you trying to get analytics or just the db/api
0 reply
0 recast
0 reaction

YuriNondual(Mental Health Break) pfp
YuriNondual(Mental Health Break)
@yurinondual.eth
Do you need to invalidate it when data in the db is updated? The easy solution IMO is to add @cloudflare CDN on top of the API URL. It's amazing for a lot of things really
1 reply
0 recast
0 reaction
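
On the origin side this mostly comes down to sending cache headers the CDN respects; note Cloudflare won't cache JSON API responses by default, you generally need a Cache Rule for the path. A hedged Express-style sketch (route, upstream URL, and TTLs are made up); invalidation then means purging the URL in Cloudflare when the db changes.

```ts
import express from "express";

const app = express();

// s-maxage tells the CDN to cache for 5 minutes; stale-while-revalidate lets it
// serve the old copy briefly while refetching in the background.
app.get("/api/items", async (_req, res) => {
  const items = await fetch("https://upstream.example.com/items").then((r) => r.json());
  res.set("Cache-Control", "public, s-maxage=300, stale-while-revalidate=60");
  res.json(items);
});

app.listen(3000);
```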