We have a query that we want to optimize for high performance. The message column in the table is of type JSONB and holds a large JSON object. Is there a way to optimize this? We already have an index on batch_num:
SELECT jsonb_array_elements_text(message #> '{results}') AS res FROM TABLE WHERE batch_num = '11';
How large is the JSONB, how fast is it now, and how fast does it need to be? – jjanes, Dec 11, 2020 at 19:16
1 Answer
You painted yourself into a corner by modeling this with jsonb. The whole large object has to be read just to extract one attribute.
The best I can think of is to break the desired value out of the large object into its own column:
ALTER TABLE mytab
ADD results jsonb GENERATED ALWAYS AS (message -> 'results') STORED;
This works on PostgreSQL v12 or later; on older versions, you can use a trigger that does the same thing.
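For older versions, such a trigger could look roughly like this (a sketch; the function and trigger names set_results are my own choice, and the table/column names follow the example above):

```sql
-- Plain column instead of a generated one (pre-v12)
ALTER TABLE mytab ADD results jsonb;

-- Trigger function that keeps "results" in sync with "message"
CREATE FUNCTION set_results() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   NEW.results := NEW.message -> 'results';
   RETURN NEW;
END;$$;

CREATE TRIGGER set_results
   BEFORE INSERT OR UPDATE ON mytab
   FOR EACH ROW EXECUTE PROCEDURE set_results();
```

For existing rows you would additionally run a one-time `UPDATE mytab SET results = message -> 'results';`.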
Then you only have to read the results column instead of the whole message.
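Your query would then become something like this (assuming the table is named mytab as above; `message #> '{results}'` and `message -> 'results'` extract the same value):

```sql
SELECT jsonb_array_elements_text(results) AS res
FROM mytab
WHERE batch_num = '11';
```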