
Consider the following:

create table query(id integer, query_definition jsonb);
create table query_item(path text[], id integer);
insert into query (id, query_definition)
values 
    (100, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"},{"type":"str","field":"lastname"}]}'::jsonb),
    (101, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"}]}'::jsonb);
insert into query_item(path, id) values
    ('{columns,0,type}'::text[], 100),
    ('{columns,1,type}'::text[], 100),
    ('{columns,2,type}'::text[], 100),
    ('{columns,0,type}'::text[], 101),
    ('{columns,1,type}'::text[], 101);

I have a query table which has a jsonb column named query_definition.

The jsonb value looks like the following:

{
  "columns": [
    {
      "type": "integer",
      "field": "id"
    },
    {
      "type": "str",
      "field": "firstname"
    },
    {
      "type": "str",
      "field": "lastname"
    }
  ]
}

In order to replace all "type": "..." with "type": "string", I've built the query_item table which contains the following data:

path            |id |
----------------+---+
{columns,0,type}|100|
{columns,1,type}|100|
{columns,2,type}|100|
{columns,0,type}|101|
{columns,1,type}|101|

path holds the path from the JSON root to each "type" entry, and id is the corresponding query's id.
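
For reference, the paths can be generated along these lines (just a sketch, assuming every "type" key sits directly under an element of the "columns" array):

-- sketch: derive one path per element of the "columns" array
insert into query_item(path, id)
select array['columns', (c.idx - 1)::text, 'type'], q.id
from query q
cross join lateral jsonb_array_elements(q.query_definition -> 'columns')
     with ordinality as c(col, idx);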

I made up the following sql statement to do what I want:

update query q
set query_definition = jsonb_set(q.query_definition, query_item.path, ('"string"')::jsonb, false)
from query_item
where q.id = query_item.id

But it only partially works: it applies just the first matching query_item row for each id (the 1st and 4th rows of the query_item table) and skips the others.
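
As far as I understand, the join itself does produce one row per path, but UPDATE ... FROM only uses one of the joined rows per target row, which would explain the behaviour:

select q.id, qi.path
from query q
join query_item qi on qi.id = q.id
order by q.id, qi.path;
-- 5 rows: three paths for id 100 and two for id 101,
-- yet each query row only gets one jsonb_set applied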

I know I could write a FOR loop, but that requires a PL/pgSQL context, which I'd rather avoid.

Is there a way to do it with a single update statement?

I've read in this topic that it's possible to do this with strings, but I couldn't work out how to adapt that mechanism to jsonb.
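
For illustration, this is roughly the shape of statement I'm hoping exists: an untested recursive-CTE sketch that chains jsonb_set once per path (the CTE names numbered and applied are made up, and I don't know whether this is the idiomatic way):

with recursive numbered as (
    -- give each path a sequence number per query id
    select id,
           path,
           row_number() over (partition by id order by path) as rn
    from query_item
),
applied as (
    -- start from the original document
    select q.id, q.query_definition, 0::bigint as rn
    from query q
    where exists (select 1 from numbered n where n.id = q.id)
    union all
    -- apply the next path to the result of the previous step
    select a.id,
           jsonb_set(a.query_definition, n.path, '"string"'::jsonb, false),
           n.rn
    from applied a
    join numbered n on n.id = a.id and n.rn = a.rn + 1
)
update query q
set query_definition = a.query_definition
from applied a
where q.id = a.id
  and a.rn = (select max(rn) from numbered n where n.id = a.id);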

