
We have a query which we want to optimize for high performance. The message column in the table is of type JSONB and holds a large JSON object. Is there a way to optimize this? We already have an index on batch_num.

SELECT jsonb_array_elements_text(message #> '{results}') AS res FROM TABLE WHERE batch_num = '11';
nbk
asked Dec 11, 2020 at 14:48
  • How large is the JSONB, how fast is it now, and how fast does it need to be? Commented Dec 11, 2020 at 19:16

1 Answer


You painted yourself into a corner by modeling this with jsonb: the whole large object has to be read and parsed just to extract that one attribute.

The best I can think of is to break the desired value out of the large object into its own column:

ALTER TABLE mytab
 ADD results jsonb GENERATED ALWAYS AS (message -> 'results') STORED;

This will work on PostgreSQL v12 or later, where generated columns were introduced; on older versions you can use a trigger that does the same thing.
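For the pre-v12 case, the trigger could look roughly like this - a sketch only, assuming your table is called mytab as above (the function and trigger names are made up):

```sql
-- Plain column instead of a generated one:
ALTER TABLE mytab ADD results jsonb;

-- Trigger function that keeps "results" in sync with "message":
CREATE FUNCTION mytab_sync_results() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   NEW.results := NEW.message -> 'results';
   RETURN NEW;
END;$$;

-- Fire it whenever "message" is inserted or changed:
CREATE TRIGGER sync_results
   BEFORE INSERT OR UPDATE OF message ON mytab
   FOR EACH ROW EXECUTE PROCEDURE mytab_sync_results();
```

You would also have to backfill existing rows once, e.g. with `UPDATE mytab SET results = message -> 'results';`.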

Then you only have to read the results column instead of the whole message.
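For illustration, the query from the question would then become (again assuming the table name mytab), so the large message value is never touched:

```sql
SELECT jsonb_array_elements_text(results) AS res
FROM mytab
WHERE batch_num = '11';
```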

answered Dec 11, 2020 at 18:42

