jq

because grep doesn't understand JSON.

The jq man page is 500 lines of filter expressions and type theory. You need about ten patterns and you can parse anything.

An API returned JSON. 400 lines of it. One massive blob with no line breaks because the server doesn’t care about your feelings. You need one value — a status field, a URL, an ID buried three levels deep. So you copied the entire response, pasted it into some website called “JSON Formatter Online,” waited for it to render, squinted at the tree view, clicked through nested objects, found the field, and copied the value. For one field. From one API call. That you’re going to make forty more times today.

Or maybe you tried grep. grep "status" and got back the line with the field, plus the comma, plus the quotes, plus three other lines that also contain the word “status.” Because grep searches text. JSON is structured data. Searching structured data with a text tool is like using a metal detector to find your car in a parking lot. Technically it’ll beep when you get close but there are better ways.

jq parses JSON. It understands objects, arrays, nested structures, types. You tell it what you want and it gives you exactly that — no quotes, no commas, no surrounding noise. It’s grep for JSON, except it actually understands what it’s looking at.

Unless you’re running Windows, in which case, wtf, none of this applies to you. But hey, come to the dark side: install WSL2 and you can follow along. We’ll wait. Impatiently.

This page has distro-specific commands. Pick your poison:

If you’re lazy like me (all sysadmins are!) then click here for the jq cheat sheet.


Install it first

jq isn’t pre-installed on most systems. Fix that:

sudo apt install jq     # Debian/Ubuntu
sudo dnf install jq     # Fedora/RHEL


Pretty-print JSON (the thing you Googled)

curl -s https://api.example.com/data | jq .

The . means “the whole thing.” jq parses it and outputs it with proper indentation and syntax coloring. No website. No copy-paste. One pipe.

cat response.json | jq .

Same thing for a file. Or:

jq . response.json

jq can read files directly. No cat needed.


Get a specific field

echo '{"name": "nginx", "status": "running", "pid": 1842}' | jq '.status'
"running"

The .field syntax accesses an object key. That’s JSON path navigation. Dot, field name, done.

Remove the quotes

echo '{"name": "nginx", "status": "running"}' | jq -r '.status'
running

-r is “raw output” — no JSON quoting. This is what you want when you’re assigning the value to a variable or piping it to another command.

STATUS=$(curl -s https://api.example.com/health | jq -r '.status')
echo "$STATUS"

Get nested fields

echo '{"server": {"host": "web01", "port": 8080}}' | jq '.server.host'
"web01"

Dot notation, just like every other language. Chain as deep as you need: .level1.level2.level3.value.
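For instance, with a made-up config three levels deep:

```shell
# Chain dots as far down as the structure goes
echo '{"app": {"db": {"primary": {"host": "db01"}}}}' | jq -r '.app.db.primary.host'
# db01
```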


Access array elements

echo '[{"name": "nginx"}, {"name": "postgres"}, {"name": "redis"}]' | jq '.[0]'
{
  "name": "nginx"
}

.[0] is the first element. .[1] is the second. .[-1] is the last.
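Negative indexing in action, with the same made-up list:

```shell
# .[-1] counts from the end: last element, no length arithmetic
echo '[{"name": "nginx"}, {"name": "postgres"}, {"name": "redis"}]' | jq -r '.[-1].name'
# redis
```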

Get all elements

echo '[{"name": "nginx"}, {"name": "postgres"}]' | jq '.[]'

Outputs each element as a separate JSON value. Useful for piping into loops.
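A sketch of that loop pattern, with made-up service names. Adding -c keeps each element on a single line so while read can consume it:

```shell
# jq -c emits one compact JSON object per line, which "while read" can consume
echo '[{"name": "nginx"}, {"name": "postgres"}]' | jq -c '.[]' |
while read -r svc; do
  # Each $svc is a complete JSON object; parse it again to pull fields out
  echo "checking $(echo "$svc" | jq -r '.name')..."
done
```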

Get a field from every element

echo '[{"name": "nginx", "pid": 1842}, {"name": "postgres", "pid": 2101}]' | jq '.[].name'
"nginx"
"postgres"

.[] iterates the array, .name grabs the field from each element. One expression, every name.


Filter arrays

echo '[{"name": "nginx", "status": "running"}, {"name": "cron", "status": "stopped"}]' | jq '.[] | select(.status == "running")'
{
  "name": "nginx",
  "status": "running"
}

select() filters. Only elements where the condition is true pass through. This is WHERE in SQL, but for JSON.

Multiple conditions

jq '.[] | select(.status == "running" and .pid > 1000)' data.json

Combine with and, or, not. Standard boolean logic.
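A sketch with or, using made-up service data: match anything stopped or failed.

```shell
# or: match any of several statuses (not inverts a condition the same way)
echo '[{"name": "nginx", "status": "running"}, {"name": "cron", "status": "stopped"}]' \
  | jq -r '.[] | select(.status == "stopped" or .status == "failed") | .name'
# cron
```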


Build new objects

echo '[{"name": "nginx", "pid": 1842, "memory": "115M"}]' | jq '.[] | {service: .name, mem: .memory}'
{
  "service": "nginx",
  "mem": "115M"
}

Reshape the output. Pick the fields you want, rename them, restructure. Transform an API response into exactly the format you need.
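One caveat: .[] | {...} emits separate objects, not an array. If the downstream tool wants a single JSON array, map() does the same reshape and keeps the brackets (same kind of made-up data):

```shell
# map() reshapes every element and returns a proper array
echo '[{"name": "nginx", "pid": 1842}, {"name": "postgres", "pid": 2101}]' \
  | jq -c 'map({service: .name})'
# [{"service":"nginx"},{"service":"postgres"}]
```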


Count things

echo '[1, 2, 3, 4, 5]' | jq 'length'
5
jq '.users | length' data.json

How many users? One expression. No wc -l, no grep, no counting.
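To count only the elements that match a condition, collect the select() results into an array first, then take its length (hypothetical data):

```shell
# [ ... ] gathers the filtered stream back into an array so length can count it
echo '[{"status": "running"}, {"status": "stopped"}, {"status": "running"}]' \
  | jq '[.[] | select(.status == "running")] | length'
# 2
```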


Pipe filters together

jq filters chain with |, just like shell pipes:

jq '.data.servers[] | select(.region == "us-east") | .hostname' config.json

Start with the servers array, filter to US East, extract hostnames. Each | feeds the output of one filter into the next.


Real-world examples

Parse docker output

docker inspect nginx | jq -r '.[0].NetworkSettings.IPAddress'

Get a container’s IP address without scrolling through 200 lines of inspect output.

Parse GitHub API

curl -s https://api.github.com/repos/jqlang/jq/releases/latest | jq -r '.tag_name'

Get the latest release version. No browser, no clicking, one command.

Parse AWS CLI output

aws ec2 describe-instances | jq -r '.Reservations[].Instances[] | select(.State.Name == "running") | .InstanceId'

List all running instance IDs. The AWS CLI returns deeply nested JSON. jq makes it usable.

Build a CSV from JSON

jq -r '.[] | [.name, .email, .role] | @csv' users.json
"alice","alice@example.com","admin"
"bob","bob@example.com","user"

@csv formats an array as a CSV line. Export JSON data to a spreadsheet without writing a script.


The flags that actually matter

Flag              What it does
-r                Raw output — no JSON quotes around strings.
-e                Set exit status based on output (null/false → exit 1).
-c                Compact output — one line, no pretty-printing.
-s                Slurp — read entire input as one array.
-n                Null input — don’t read stdin (use with --arg).
--arg NAME VALUE  Pass shell variables into jq filters.
-S                Sort object keys alphabetically.
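The table’s two stars, --arg and -n, deserve a sketch (made-up variable names). --arg is the safe way to get shell data into a filter; interpolating "$VAR" straight into the jq string breaks the moment the value contains a quote:

```shell
# --arg binds a shell value to a jq variable: no quoting hazards
WANTED="nginx"
echo '[{"name": "nginx"}, {"name": "postgres"}]' \
  | jq -r --arg n "$WANTED" '.[] | select(.name == $n) | .name'
# nginx

# -n skips stdin entirely; with --arg you can build JSON from scratch
jq -n --arg host "web01" '{host: $host}'
# {
#   "host": "web01"
# }
```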

“But I just—”

We both know where this is going.

“I use Python for JSON parsing.” You wrote a six-line Python script with import json, open(), json.load(), a key access, and a print(). jq -r '.key' file.json does the same thing. One command. No script file. No interpreter startup. For quick extraction, jq is faster to type and faster to run.

“I use an online JSON formatter.” You pasted production API data into a website you found on Google. That data might contain tokens, user emails, internal IPs, or API keys. The website’s privacy policy is “trust us.” jq . formats JSON locally. On your machine. Where the data already is.

“I grep for the field I need.” And you get the line with the field, plus the surrounding syntax, plus every other line that matches that string. grep "id" matches "id", "provider_id", "valid", and the string "this is ridiculous". jq understands structure. grep understands text. They’re not the same thing.

“My IDE has a JSON viewer.” Your IDE’s JSON viewer is great for files you’re editing. It’s useless for piped API responses, streaming data, or shell scripts that need to extract values automatically. jq lives in the pipeline. It takes input from any command and passes output to any command. Your IDE is an island.


jq cheat sheet

You made it. Or you skipped straight here. Either way, no judgment. Copy and paste these. Pin them. Tattoo them on your forearm. Whatever works.

What you’re doing             Command
Pretty-print JSON             jq . file.json
Get a field                   jq '.fieldname' file.json
Get nested field              jq '.parent.child' file.json
Get raw string (no quotes)    jq -r '.field' file.json
First array element           jq '.[0]' file.json
Field from every element      jq '.[].name' file.json
Filter array                  jq '.[] | select(.key == "val")' file.json
Count elements                jq '.array | length' file.json
Build new object              jq '.[] | {newkey: .oldkey}' file.json
Export as CSV                 jq -r '.[] | [.a, .b] | @csv' file.json

The one command: curl -s url | jq . — pipe any API response through jq and see it formatted instantly. Add .fieldname to grab exactly what you need.

Back to the top, you overachiever.