JSON Processing in SQL
For many reasons it is useful to store arbitrary nested key-value and array data in JSON. Among JSON's deficiencies is its limited set of data types, but its great benefit is that it is universal and can be queried.
Import/Navigation
psql -c "CREATE TABLE pgbackrest (info jsonb);"
pgbackrest info --output=json --stanza=main | psql -c "COPY pgbackrest (info) FROM STDIN;"
This JSON structure may be printed in a line-oriented format using flattenjs. Since a single backup stanza was selected, the data structure may be simplified by removing the top-level array:
psql -c "UPDATE pgbackrest SET info=info->0;"
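The effect of the `info->0` operator can be sketched in plain Python; the stanza fields below are hypothetical placeholders, not actual pgbackrest output:

```python
import json

# pgbackrest info --output=json returns a top-level array of stanzas.
# With a single stanza selected, only the first element is needed.
doc = json.loads('[{"name": "main", "status": {"code": 0}}]')

# Equivalent of the SQL expression info->0: take element 0 of the array.
stanza = doc[0]
print(stanza["name"])
```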
Array Processing
The trick to processing JSON arrays is using a lateral join to expand the array into one row per element:
SELECT backup->>'type' AS backup_type,
       pg_size_pretty((backup#>'{info,delta}')::bigint) AS size,
       ((backup#>'{timestamp,stop}')::int
        - (backup#>'{timestamp,start}')::int) AS "duration (sec)"
  FROM pgbackrest
 CROSS JOIN LATERAL jsonb_array_elements(info->'backup') AS backup
 ORDER BY (backup#>'{timestamp,start}')::int DESC
 LIMIT 7;
In PostgreSQL, the CROSS JOIN LATERAL is implicit for set-returning functions when they are referenced in the FROM clause:
FROM pgbackrest, jsonb_array_elements(info->'backup') AS backup
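The row-per-element expansion performed by jsonb_array_elements can be sketched in Python. The sample backup records below are made up, but they follow the same field paths the query above extracts (type, info.delta, timestamp.start, timestamp.stop):

```python
# Hypothetical backup entries mirroring the JSON paths used in the query.
info = {
    "backup": [
        {"type": "full", "info": {"delta": 1048576},
         "timestamp": {"start": 1000, "stop": 1060}},
        {"type": "incr", "info": {"delta": 2048},
         "timestamp": {"start": 2000, "stop": 2005}},
    ]
}

# jsonb_array_elements(info->'backup') yields one row per array element;
# ORDER BY ... DESC puts the most recent backup first.
newest_first = sorted(info["backup"],
                      key=lambda b: b["timestamp"]["start"], reverse=True)

# The SELECT list then pulls individual fields out of each element.
rows = [(b["type"],
         b["info"]["delta"],
         b["timestamp"]["stop"] - b["timestamp"]["start"])
        for b in newest_first]
print(rows)
```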
Object Processing
Some JSON functions return a recordset with columns named key and value:
SELECT key, value FROM pgbackrest, LATERAL jsonb_each(info->'status')
In this way it is possible to iterate over keys as if they were array elements.
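The key/value recordset behaves much like iterating over a dictionary's items. A sketch with a hypothetical status object (real pgbackrest fields may differ):

```python
# A hypothetical status object standing in for info->'status'.
status = {"code": 0, "message": "ok"}

# jsonb_each returns one (key, value) row per object member,
# comparable to dict.items() in Python.
rows = list(status.items())
for key, value in rows:
    print(key, value)
```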
Validation
JSON may be pretty-printed or validated using the Python 3 json library:
$ python3 -m json.tool < info.json
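The same validation can be done programmatically with json.loads, which raises json.JSONDecodeError on malformed input; a minimal sketch:

```python
import json

def is_valid_json(text):
    """Return True if text parses as JSON, False otherwise."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

print(is_valid_json('{"status": {"code": 0}}'))  # True
print(is_valid_json('{"status": '))              # False: truncated object
```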