I have a site with around 150,000 daily page views, backed by a PostgreSQL database of about 180 MB, running on an 800 MB RAM database plan on Heroku.
The site does not do many updates; queries are mostly SELECTs whose results are cached at the page level. SELECT count(*) FROM pg_stat_activity;
always shows 30 connections to the database (which should correspond to at least one connection per worker).
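For reference, breaking those connections down by state (a quick sketch, assuming direct SQL access to the database) looks like this:

    -- how many of the 30 connections are actually doing work
    SELECT state, count(*)   -- 'active', 'idle', 'idle in transaction', ...
    FROM pg_stat_activity
    GROUP BY state;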
Looking at New Relic, queries to the database seem to take around 10 ms, but I have the impression that, even if the whole database fits in the 800 MB of RAM, those queries could run noticeably faster.
In this scenario, would giving the database more RAM make queries run faster, or does it not matter since the database size is below 200 MB?
1 Answer
So long as the DB fits in the RAM used for the OS disk cache and/or shared_buffers, there is very little benefit to adding more RAM.
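As a rough sanity check (assuming you can run ad-hoc SQL against the database), you can compare the database's on-disk size with the configured buffer cache:

    -- total on-disk size of the current database
    SELECT pg_size_pretty(pg_database_size(current_database()));
    -- size of PostgreSQL's own buffer cache
    SHOW shared_buffers;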
Focus your performance efforts elsewhere - eliminate unnecessary queries, cache frequently read results that don't need to be perfectly fresh, replace repeated tiny queries with more efficient joins over sets, etc.
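For the last point, a minimal sketch of turning repeated tiny queries into one set-based query (the posts/comments schema here is made up purely for illustration):

    -- instead of one lookup per post:
    --   SELECT body FROM comments WHERE post_id = 1;
    --   SELECT body FROM comments WHERE post_id = 2;
    --   ...
    -- fetch the whole set in a single join:
    SELECT p.id, p.title, c.body
    FROM posts p
    JOIN comments c ON c.post_id = p.id
    WHERE p.id IN (1, 2, 3);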
What's the best statistic to see if there are cache misses? – eckes, Jun 16, 2019 at 18:44
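One commonly used statistic is the buffer cache hit ratio from pg_stat_database (a rough proxy only, since blks_read also counts blocks served from the OS page cache):

    -- fraction of block reads satisfied from shared_buffers
    SELECT blks_hit::float / nullif(blks_hit + blks_read, 0) AS cache_hit_ratio
    FROM pg_stat_database
    WHERE datname = current_database();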