I’m building an app that lets users manage data across multiple tables. I also expose an API so they can fetch their data and process it in external services. I’d like to enhance the API to support filtered queries. Inside the app, I already have a fairly sophisticated system that converts logical condition objects (a tree with many operators) into SQL queries. I’m now looking for a clean way to:
- express filters in the API request, and
- parse those filters into my internal condition object, which I then turn into SQL.
The challenge: complex queries are hard to pass via GET if I want to remain RESTful and I want to use GET in the end to cache my results. I’m considering two approaches:
- POST /tables/{table}/records/search with a complex body that defines the filter. This would create a "saved search" in my DB. Clients could then GET /tables/{table}/records/search/{id} to run it, which would first resolve the parameters and then execute the query.
- Use RSQL for a readable query syntax in the URL. The drawback is that, as far as I understand, RSQL is usually evaluated directly against the DB. In my case I'd still need to parse it and map fields/operators to my internal model, so I'm not sure this is a proper solution to my problem.
I’m undecided between these two and not fully convinced by either. Do you have advice or alternative designs?
I'd also like to be able to reverse the process later (serialize my condition object back into query params).
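To give an idea of the shapes involved (simplified for illustration, not my real model), here is roughly what the condition tree looks like and the round trip I would need:

// Simplified TypeScript illustration, not my actual internal model.
type Condition =
  | { op: "and" | "or"; children: Condition[] }
  | { op: "eq" | "gt" | "lt" | "like"; field: string; value: string | number };

// Example filter: status == "active" AND age > 30
const filter: Condition = {
  op: "and",
  children: [
    { op: "eq", field: "status", value: "active" },
    { op: "gt", field: "age", value: 30 },
  ],
};

// For the URL option I would need both directions:
//   serialize: condition tree -> something like ?filter=status==active;age=gt=30 (RSQL-ish)
//   parse:     query string   -> condition tree -> existing condition-to-SQL layer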
3 Answers
The challenge: complex queries are hard to pass via GET if I want to remain RESTful and I want to use GET in the end to cache my results
- complex queries are hard to pass via GET
GET supports a body. It's just unusual to use it.
- I want to use GET in the end to cache my results
POSTs are also cacheable with the right headers and browser support.
The problem you are having is that REST ideas don't always work in practice. For example: are you setting cache headers on all your GETs? Are you using HATEOAS?
I could suggest:
https://www.rfc-editor.org/rfc/rfc7231
For cases where an origin server wishes the client to be able to cache the result of a POST in a way that can be reused by a later GET, the origin server MAY send a 200 (OK) response containing the result and a Content-Location header field that has the same value as the POST's effective request URI
This would be "RESTful", but you could just as (probably more) easily add a caching layer in your client code rather than relying on the browser.
- "GET supports a body" - while factually true, the real-world support is hit-and-miss. I'd think twice before going this way. POST requests also have a disadvantage of not being copy-pastable and shareable, in contrast with GETs. – Anton Pastukhov, Sep 5, 2025
POST /tables/{table}/records/search with a complex body that defines the filter. This would create a "saved search" in my DB. Clients could then GET /tables/{table}/records/search/{id} to run it, which would first resolve the parameters and then execute the query.
Assuming that you can't "just" encode the "complex body" into an identifier for a request target, then POST and make the resulting document available via GET is the way to go.
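(For completeness: "encoding the complex body into an identifier" can be as simple as the sketch below, i.e. base64url of the JSON filter in a query parameter. It works until the filter no longer fits comfortably in a URL. Names are illustrative, and this assumes Node's Buffer.)

// Encode the filter into a GET request target; purely illustrative.
const filter = { op: "eq", field: "status", value: "active" };
const token = Buffer.from(JSON.stringify(filter)).toString("base64url");
const url = `/tables/example/records?filter=${token}`;

// Server side: decode and hand the object to the existing condition-to-SQL layer.
const decoded = JSON.parse(Buffer.from(token, "base64url").toString("utf8"));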
There's no rule that says that the GET request has to be handled completely separately. You could instead choose to do something like
201 Created
Location: /tables/example/records/search/1
Content-Location: /tables/example/records/search/1
<results of the query>
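A sketch of what that handler could look like (Express-style TypeScript; saveSearch and runSearch are hypothetical stand-ins for your own persistence and condition-to-SQL code):

import express from "express";

const app = express();
app.use(express.json());

// Hypothetical helpers standing in for your own code:
// saveSearch persists the filter and returns its id;
// runSearch loads the filter, builds your condition object, and executes the query.
declare function saveSearch(table: string, filter: object): Promise<string>;
declare function runSearch(table: string, id: string): Promise<unknown[]>;

app.post("/tables/:table/records/search", async (req, res) => {
  const id = await saveSearch(req.params.table, req.body);
  const results = await runSearch(req.params.table, id);
  const url = `/tables/${req.params.table}/records/search/${id}`;
  // 201 with Location + Content-Location: the saved search is created
  // and its current results are returned in the same response.
  res.status(201).location(url).set("Content-Location", url).json(results);
});

app.get("/tables/:table/records/search/:id", async (req, res) => {
  res.json(await runSearch(req.params.table, req.params.id));
});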
In the longer term, you should be paying attention to the standardization of the QUERY method, which is perhaps closer to what you really want (ie: a way to pass a body in a request that general purpose components will recognize as being safe).
This specification defines the HTTP QUERY request method as a means of making a safe, idempotent request (Section 9.2 of [HTTP]) that contains content.
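For illustration, a QUERY request would carry the filter in the body while still being declared safe and idempotent (the body format below is made up; the draft leaves that to the media type):

QUERY /tables/example/records HTTP/1.1
Content-Type: application/json

{ "op": "eq", "field": "status", "value": "active" }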
Just how complex are your filters? You didn't mention your tech stack, but passing arrays or nested objects in a query string is generally a solved problem in many languages, qs being one example in the JS ecosystem.
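For example, with qs (the filter shape here is only an illustration):

import qs from "qs";

// A nested filter object; shape is illustrative only.
const filter = { status: "active", age: { gt: 30 } };

// Client side: encode it into the query string.
const query = qs.stringify({ filter }, { encode: false });
// -> "filter[status]=active&filter[age][gt]=30"

// Server side: parse it back into a nested object (values come back as strings),
// then map it onto the internal condition model.
const parsed = qs.parse(query);
// -> { filter: { status: "active", age: { gt: "30" } } }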
POST requests have the disadvantage of not being easily bookmarkable or shareable, and of being more difficult to cache.
The QUERY method may be a perfect fit for GET-like requests that need a body. Until then, everyone just uses POST. Since HTTP defines few semantics for POST, it is perfectly suitable for general-purpose needs. Such requests are just not idempotent or cacheable by default. Do not implement a "saved searches" feature (and double latency) just to satisfy your opinions about RESTfulness – there's going to be a POST request involved either way.