I am using sqlite and have its json1 extension available. I have the following tables (reduced to a minimal example):
-- table filled from an external source,
-- over whose data I have only limited influence
create table "source" (
    "id" Integer primary key,
    "json_data" Text not NULL
);
-- table to be filled using a trigger on "source"
create table "target" (
    "node" Text primary key,
    "parent" Text,  -- NULL when root of a tree
    "kind" Text     -- NULL when "parent" is NULL
        check ("kind" in ('clone', 'child', 'offspring'))
) without rowid;
The "json_data"
is of the following form:
{
    "id": ["n0"],               // single-component list
    "clone_ids": ["c2", "c1"],  // multi-component list
    "parent_id": ["p0"],        // single-component list
    "ancestor_ids": ["a1, a2, p0"]
                                // single-component list, but multiple semantic components
}
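For concreteness, a row of "source" carrying this example could look as follows (the "id" value 1 is just an illustrative placeholder):
INSERT INTO "source" ("id", "json_data")
VALUES (1,  -- illustrative key; any unique Integer would do
        '{"id": ["n0"], "clone_ids": ["c2", "c1"], "parent_id": ["p0"], "ancestor_ids": ["a1, a2, p0"]}');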
This instance of "json_data" should give rise to the following tuples being inserted into "target" ("node", "parent", "kind"):
'c2', 'c1', 'clone'
'c1', 'n0', 'clone'
'n0', 'p0', 'child'
'p0', 'a2', 'offspring'
'a2', 'a1', 'offspring'
'a1', NULL, NULL
Usually, 'clone_ids' will be absent; often 'parent_id' and 'ancestor_ids' will be absent; even 'id' could be absent. So the approach should be robust against any of these keys being missing.
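As far as I understand, reading individual keys is not the problem, since json_extract() simply returns NULL for absent keys; for example, against the hypothetical "source" row above:
SELECT json_extract("json_data", '$.id[0]')           AS id,            -- NULL if 'id' is absent
       json_extract("json_data", '$.clone_ids')       AS clone_ids,     -- JSON array, returned as text
       json_extract("json_data", '$.parent_id[0]')    AS parent_id,
       json_extract("json_data", '$.ancestor_ids[0]') AS ancestor_ids   -- comma-separated string
FROM "source";
What I am stuck on is turning the variable-length 'clone_ids' list and the comma-separated 'ancestor_ids' string into a variable number of rows.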
The 'child' kind is an 'upgrade' from the 'offspring' one; my feeling was that this upgrade could be done as an update at the end.
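A sketch of what I have in mind for that final update, assuming the row for the node's own id was first inserted with kind 'offspring' and that this runs inside a trigger on "source" (so that NEW.json_data is available):
UPDATE "target"
SET "kind" = 'child'
WHERE "node" = json_extract(NEW.json_data, '$.id[0]')  -- the node described by this json_data
  AND "kind" = 'offspring';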
So my idea was that I should first try and get a list (as a JSON string or perhaps a temporary table) of the form
[
["c2", "clone"],
["c1", "clone"],
["n0", "offspring"],
["p0", "offspring"],
["a2", "offspring"],
["a1", null]
]
and then create the variable number of inserts from this list.
However, I'm quite inexperienced with SQL and sqlite's json1 extension, so I do not know (i) whether this is possible and, if so, (ii) how. So I would like to get responses to both questions. In case getting the latter list of pairs needs to be done using an external function (which I would write in Python), I would still appreciate guidance on how to go from the list to the inserts.
EDIT:
In a comment it was pointed out that what I asked is not possible. So I would like to amend my question in the following way: assume I have a user-defined function that produces the following JSON:
[
["c2", "c1", "clone"],
["c1", "n0", "clone"],
["n0", "p0", "offspring"],
["p0", "a2", "offspring"],
["a2", "a1", "offspring"],
["a1", null, null]
]
So these are, essentially, the tuples I want to insert. How would I go about inserting them?
- As far as I can see, doing this in SQL would require a recursive CTE, which is not allowed in a trigger. – CL., Mar 29, 2016
- @CL.: Thanks for the hint. I have amended my question accordingly (I think). – equaeghe, Mar 31, 2016
- I'm not really sure whether the original object could be transformed in a similar way. – CL., Apr 1, 2016
1 Answer
The JSON in an earlier revision of the question was not valid; strings require double quotes.
Anyway, the table-valued function json_each() makes it possible to enumerate an array of dynamic size:
SELECT * FROM json_each('[["c2","c1","clone"],["c1","n0","clone"],["n0","p0","offspring"],["p0","a2","offspring"],["a2","a1","offspring"],["a1",null,null]]');
key  value                    type   atom  id  parent  fullkey  path
---  -----------------------  -----  ----  --  ------  -------  ----
0    ["c2","c1","clone"]      array        1           $[0]     $
1    ["c1","n0","clone"]      array        5           $[1]     $
2    ["n0","p0","offspring"]  array        9           $[2]     $
3    ["p0","a2","offspring"]  array        13          $[3]     $
4    ["a2","a1","offspring"]  array        17          $[4]     $
5    ["a1",null,null]         array        21          $[5]     $
To extract the values from an array of known size, we can use json_extract():
SELECT json_extract(value, '$[0]'), json_extract(value, '$[1]'), json_extract(value, '$[2]')
FROM json_each('[["c2","c1","clone"],["c1","n0","clone"],["n0","p0","offspring"],["p0","a2","offspring"],["a2","a1","offspring"],["a1",null,null]]');
c2 c1 clone
c1 n0 clone
n0 p0 offspring
p0 a2 offspring
a2 a1 offspring
a1
This can then simply be plugged into an INSERT ... SELECT ... statement.
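For example, a minimal sketch of such a statement, with the triple list given as a literal (inside a trigger, the literal would be replaced by the user-defined function applied to NEW.json_data; INSERT OR IGNORE is an assumption here, since "node" is the primary key and some nodes may already be present):
INSERT OR IGNORE INTO "target" ("node", "parent", "kind")
SELECT json_extract(value, '$[0]'),  -- node
       json_extract(value, '$[1]'),  -- parent
       json_extract(value, '$[2]')   -- kind
FROM json_each('[["c2","c1","clone"],["c1","n0","clone"],["n0","p0","offspring"],["p0","a2","offspring"],["a2","a1","offspring"],["a1",null,null]]');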
- I fixed the invalid JSON you pointed out. – equaeghe, Apr 1, 2016