
I have a site with around 150,000 daily page views, backed by a roughly 180 MB Postgres database running on an 800 MB RAM database plan on Heroku.

The site does not do many updates; queries are mostly SELECTs whose results are cached at the page level. SELECT count(*) FROM pg_stat_activity; always shows 30 connections to the database (which should correspond to at least one connection per worker).
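For what it's worth, a grouped version of that query (a sketch, not from the original post, assuming PostgreSQL 9.2 or later where pg_stat_activity exposes a state column) shows whether those 30 connections are mostly idle pool slots or actively running queries:

```sql
-- Breakdown of what the current connections are doing.
-- Assumes PostgreSQL 9.2+, where pg_stat_activity has a "state" column.
SELECT state, count(*)
FROM pg_stat_activity
GROUP BY state
ORDER BY count(*) DESC;
```

If nearly all of them report idle, they are simply being held open by the connection pool and are not themselves a sign of load.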

Looking at New Relic, database queries seem to take around 10 ms, but I have the impression that, even if the whole database fits in the 800 MB of RAM, those queries could run noticeably faster.

In this scenario, would giving the database more RAM make queries run faster, or does it not matter since the DB size is below 200 MB?

asked Jul 25, 2013 at 4:18

1 Answer


So long as the DB fits in the RAM used for the OS disk cache and/or shared_buffers, there is very little benefit to adding more RAM.

Focus your performance efforts elsewhere: eliminate unnecessary queries, cache frequently read results that don't need to be perfectly fresh, replace repeated tiny queries with more efficient joins over sets, etc.
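As a rough illustration (not part of the original answer), you can compare the on-disk size of the database to the memory Postgres has available for caching:

```sql
-- Minimal sketch for checking whether the database fits in memory.
SELECT pg_size_pretty(pg_database_size(current_database()));  -- total on-disk size
SHOW shared_buffers;  -- size of Postgres's own buffer cache
```

If the reported size (about 180 MB here) is well below shared_buffers plus whatever the OS can spare for its page cache, the data will effectively be served from memory once it has warmed up.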

answered Jul 25, 2013 at 6:20
  • What’s the best statistic to see if there are cache misses? Commented Jun 16, 2019 at 18:44
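Regarding the comment about cache misses: one sketch (not something the answerer posted) uses the standard pg_stat_database counters to compute a buffer cache hit ratio:

```sql
-- Fraction of block reads served from Postgres's buffer cache for this database.
-- blks_read counts blocks not found in shared_buffers (they may still have come
-- from the OS page cache); values very close to 1.0 mean almost no misses.
SELECT blks_hit::float / nullif(blks_hit + blks_read, 0) AS cache_hit_ratio
FROM pg_stat_database
WHERE datname = current_database();
```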
