I have read the following question:
Limit Postgres results for a user
and noted the recommendation of limiting a user's statement_timeout, but I would like something a little more consistent.
One poster on that question mentions that the row-level security feature would allow someone to enforce this.
I have read some tutorials that use row-level security to limit which rows a user can see by enforcing a predicate on column values, but I don't know how to extend this to limiting the *number* of rows returned, since that goal does not impose any column-value constraint.
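For context, this is roughly what those tutorials show: an RLS policy admits or rejects each row by a boolean predicate on its columns, so it has no notion of a running count. The table, column, and policy names below are made up for illustration:

```sql
-- Hypothetical schema: each row is tagged with the role that owns it.
CREATE TABLE documents (
    id    serial PRIMARY KEY,
    owner text NOT NULL,
    body  text
);

ALTER TABLE documents ENABLE ROW LEVEL SECURITY;

-- The policy admits rows whose owner column matches the current user;
-- it is evaluated per row and says nothing about *how many* rows
-- a query may return in total.
CREATE POLICY owner_only ON documents
    USING (owner = current_user);
```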
I also saw this about using FETCH_COUNT:
https://stackoverflow.com/questions/30369100/how-to-handle-large-result-sets-with-psql
As an alternative (key word here): if row-level security does not allow us to limit the rows returned per query for a user, it seems a combination of FETCH_COUNT and statement_timeout for a specific user might achieve my goal.
Is there a way to set FETCH_COUNT for a user account without requiring them to specify it in their client/session?
Thanks in advance.
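For what it's worth, the two settings live on opposite sides of the connection: statement_timeout can be attached to a role on the server, while FETCH_COUNT is a psql client variable, so the server cannot impose it. A sketch, with an example role name and values:

```sql
-- Server side: applies automatically to every new session of this
-- (hypothetical) role, regardless of which client they connect with.
ALTER ROLE reporting_user SET statement_timeout = '5s';

-- Client side: FETCH_COUNT is a psql variable, not a server setting,
-- so it can only be pre-set where the client runs, e.g. by placing
-- the following psql meta-command in each user's ~/.psqlrc:
-- \set FETCH_COUNT 100
```

Note that a .psqlrc default only helps users who connect with psql, and nothing stops a user from overriding it in their own session.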
-
Hi, and welcome to dba.se! You do realise that a query with a single response record could cripple your entire server if it was some sort of analytic one? See this - also, see this - you can limit containers' resources - maybe a better approach? — Vérace, Jan 17, 2022 at 7:05
-
My solution was essentially to learn ASP.NET Core so I can protect my database from being easily copied, while still allowing the masses to interact with the content much more freely. Not really what I wanted to do, but giving direct DB access is just too much of a risk. It seems much more secure to have an ASP.NET app that accesses the read-only account locally. On that note, on to the next question! — gagan, Jan 25, 2022 at 3:06