
Consider the following JSON:

{
  "floor": [
    {
      "level": 1,
      "apt": [
        {
          "no": 1,
          "area": 40,
          "rooms": 1
        },
        {
          "no": 2,
          "area": 80,
          "rooms": 3
        },
        {
          "no": 3,
          "area": null,
          "rooms": 2
        }
      ]
    },
    {
      "level": 2,
      "apt": [
        {
          "no": 4,
          "area": 100,
          "rooms": 3
        },
        {
          "no": 5,
          "area": 60,
          "rooms": 2
        }
      ]
    }
  ]
}

Using Postgres jsonpath, if I wanted to get the apartments with 3 rooms, the following would suffice:

'$.floor[*].apt[*] ? (@.rooms == 3)'
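
For reference, a filter like that can be executed with jsonb_path_query; a minimal sketch, assuming the document is stored in a jsonb column called data in a table called json_test (both names are placeholders):

-- returns the apartment objects with rooms = 3 (no 2 and no 4 in the sample data)
SELECT jsonb_path_query(data, '$.floor[*].apt[*] ? (@.rooms == 3)') AS apt
FROM json_test;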

My question is: how would I do the same if I wanted to compare against an array, so that I could provide the allowed numbers of rooms as a list, similar to ANY in SQL?

'$.floor[*].apt[*] ? (@.rooms = ANY{3,5,6,10,11} )'

Using Postgres 12.2 on Ubuntu Linux 18.04, 64-bit.

asked Apr 23, 2020 at 9:46
  • I assume using an or is not an option? e.g. (@.rooms == 3 || @.rooms == 5 || ...) Commented Apr 23, 2020 at 11:35
  • Not an option :-) Creating that on the fly is not trivial, and there could be many values (> 10). Thanks Commented Apr 23, 2020 at 12:26

2 Answers


I don't know of any way to do that with jsonpath.

I'd suggest pulling apartments with any number of rooms and filtering that list, but I suspect that would get quite large as the dataset grows ...
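
A rough sketch of that idea, assuming the document sits in a jsonb column data of a table json_test (both names are guesses):

SELECT apt
FROM json_test,
     LATERAL jsonb_path_query(data, '$.floor[*].apt[*]') AS apt  -- pull every apartment
WHERE (apt ->> 'rooms')::int IN (3, 5, 6, 10, 11);               -- filter in plain SQL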

... but then, if you're talking about large data volumes here, I would have to ask why you're not doing this in a proper, relational way? Yes, you can get Postgres to do this with JSON data, but should you? This is a very simple data structure that you could represent in a single table which, properly indexed, will cope with huge data volumes.

select level, no, size, rooms
from apts
order by 1, 2;

+-------+----+------+-------+
| level | no | size | rooms |
+-------+----+------+-------+
|     1 |  1 |   40 |     1 |
|     1 |  2 |   80 |     3 |
|     1 |  3 | null |     2 |
|     2 |  4 |  100 |     3 |
|     2 |  5 |   60 |     2 |
+-------+----+------+-------+

select *
from apts
where rooms in ( 3, 5, 6, 10, 11 );

+-------+----+------+-------+
| level | no | size | rooms |
+-------+----+------+-------+
|     1 |  2 |   80 |     3 |
|     2 |  4 |  100 |     3 |
+-------+----+------+-------+

Relational databases are really good at finding small bits of [related] data and bolting them back together again. They're generally pretty rubbish at taking large blocks of data, like chunks of JSON, and pulling them apart again. Yes, RDBMS providers are trying to improve this, but I would suggest it'll still be way slower than doing it the "old-fashioned" way.

answered Apr 23, 2020 at 11:00
  • We just prefer to use jsonb a lot. It simplifies things for us by a huge margin, since it has indexes just as normal tables do in Postgres, while mapping very easily to our application layer. There is already a solution using a lateral join that works, but jsonpath is neater... Hope to get some pointers... Thanks... Commented Apr 23, 2020 at 11:20

You can try something like the following:

SELECT apt
FROM (
    SELECT jsonb_path_query(data, '$.floor[*].apt[*]') AS apt
    FROM json_test
) foo
-- ->> returns text, so cast it before comparing against the integer array
WHERE (apt ->> 'rooms')::int = ANY (ARRAY[3, 4, 5]);
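
For reference, a minimal setup this query assumes (the table and column names are taken from the query itself; the data is the sample from the question):

CREATE TABLE json_test (data jsonb);

INSERT INTO json_test (data) VALUES (
  '{"floor": [
     {"level": 1, "apt": [{"no": 1, "area": 40,   "rooms": 1},
                          {"no": 2, "area": 80,   "rooms": 3},
                          {"no": 3, "area": null, "rooms": 2}]},
     {"level": 2, "apt": [{"no": 4, "area": 100, "rooms": 3},
                          {"no": 5, "area": 60,  "rooms": 2}]}]}'
);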
answered Apr 23, 2020 at 12:03
  • This should work, but 1. it has a subquery, so I assume it would take a performance hit... I will have to test. 2. It uses the normal jsonb operators (->>), so we are back to something similar to lateral joins. Any ideas on a pure jsonpath query? Commented Apr 23, 2020 at 12:24
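
One idea worth testing for a pure jsonpath filter is to pass the allowed room counts in as a path variable and unwrap it with [*]; comparisons inside a jsonpath filter are existential over sequences, so this may behave like ANY. An untested sketch, using the same placeholder json_test / data names as above:

SELECT jsonb_path_query(
         data,
         '$.floor[*].apt[*] ? (@.rooms == $rooms[*])',  -- compare against each element of $rooms
         '{"rooms": [3, 5, 6, 10, 11]}'                 -- vars argument of jsonb_path_query
       ) AS apt
FROM json_test;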
