I love the idea of "JSON Lines", which is implicit here in jq. You can convert a giant JSON array of objects to JSON Lines and back with jq.
A lot of tools can produce it, and then you can mix and match jq with plain Unix grep, head, etc., because each line of your stream is guaranteed to be a complete, parseable JSON value.
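A minimal sketch of that round trip (the sample input is made up; `jq -c '.[]'` explodes an array into one compact object per line, and `jq -s` slurps lines back into an array):

```shell
# Array -> JSON Lines: one compact, complete JSON object per line
echo '[{"id":1},{"id":2},{"id":3}]' | jq -c '.[]'
# {"id":1}
# {"id":2}
# {"id":3}

# Each line parses on its own, so plain Unix tools work per record
echo '[{"id":1},{"id":2},{"id":3}]' | jq -c '.[]' | head -2

# JSON Lines -> array: slurp the stream back into a single value
echo '[{"id":1},{"id":2},{"id":3}]' | jq -c '.[]' | jq -s -c '.'
# [{"id":1},{"id":2},{"id":3}]
```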
Back in the day I loaded a lot of data into PostgreSQL and Elasticsearch after preprocessing it with a very simple but powerful chain of CSV parsers (I used CSVfix), jq, sort, grep, etc.