14 essential CLI tools for working with APIs

2022-08-16 - 10 min read

This article will explore how the command line can be a powerful tool for working with APIs. We'll cover some basic commands for making HTTP requests and manipulating data.

CLI commands for API development

A terminal is a powerful tool for developers, and there are many different ways to use it when working with APIs. We'll show you the tools listed below.

  • cURL: the go-to tool for making HTTP requests from the command line. It has many options, including changing the request method, HTTP headers, and post body, which makes it a handy tool for testing APIs.
  • jq and xmllint: jq is a command line tool for formatting, parsing, and manipulating JSON data. xmllint is, roughly, its XML counterpart.
  • HTTPie: another command line tool for making HTTP requests. It has a more user-friendly interface than cURL and can make the same types of requests.
  • find, xargs, grep, sed, awk: a set of command line tools that give you great powers when combined.
  • diff: see the difference between files.
  • Apache Bench (ab) and Siege: tools for testing the performance of APIs.
  • pbcopy / clip: copy output to the clipboard.

Requesting data from an API

Before we can do anything, we'll want to get some data from an API using cURL. We'll use the Star Wars API as an example. It has an endpoint at /api/people/1. Since it doesn't need any authentication or other data, we can just write:

curl https://swapi.dev/api/people/1

And it comes back with some data:

{"name":"Luke Skywalker","height":"172","mass":"77","hair_color":"blond","sk…

The response is a long unfriendly JSON string. We can use jq to make it readable:

curl -s https://swapi.dev/api/people/1 | jq .
{
  "name": "Luke Skywalker",
  "height": "172",
  "mass": "77",
  "hair_color": "blond",
  "skin_color": "fair",
  "eye_color": "blue",
  "birth_year": "19BBY",
  ...
}

This is as clean as it gets. The -s flag puts curl in silent mode; otherwise, it emits a progress indicator above our JSON. Then we pipe the results to the jq tool. Piping means sending the output of one command to the input of another. The dot after jq is specific to that tool and means that we want to print the root of the JSON object. You can specify a path as well, for example:

curl -s https://swapi.dev/api/people/1 | jq -r .name
Luke Skywalker

Here, we also included the -r flag, which stands for "raw output". With it, jq prints the bare string instead of a quoted JSON string.
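You can experiment with jq paths without calling an API at all by piping a JSON literal into it. Note that the nested homeworld object below is made up for the example (the real SWAPI returns homeworld as a URL string):

```shell
# Extract a nested value from an inline JSON string (no network needed)
echo '{"name": "Luke Skywalker", "homeworld": {"name": "Tatooine"}}' | jq -r .homeworld.name
```

This prints Tatooine: the path walks into the homeworld object and picks its name field.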

When working with XML, we can use xmllint to format the response:

curl -s \
-H 'Accept: application/xml' \
'https://petstore.swagger.io/v2/pet/findByStatus?status=available' | xmllint --format -

It's also possible to get specific elements using XPath, for example, with xmllint --format --xpath '//Pet/name' -

Authenticating cURL requests

There are several ways to authenticate API requests, and most APIs only accept one of them. The typical flavors are:

  • Using Basic authentication: this sends a header in the form Authorization: Basic base64("username:password")
  • Using an API key: this is a fixed key, but the header it is sent in can differ. Authorization: Bearer api_key or a custom header name such as "Api-Key" is commonly seen.
  • Using OAuth: this also uses the Authorization: Bearer key form, but the key is temporary and obtained in an authorization flow.

In cURL, you can add basic authorization by using curl -u username:password. There is no need to do the base64 encoding yourself. Adding a header is also easy. For example:

curl -H 'api_key: SuperSecret' \

In this case, SuperSecret is the API key. The -H flag is used to send a header (and can be repeated). Also, note that you can end lines with a backslash to write commands over multiple lines.
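To see what curl -u sends under the hood, you can produce the Basic header value yourself with the base64 tool (the credentials here are placeholders):

```shell
# This is the value curl -u username:password puts after "Authorization: Basic"
printf 'username:password' | base64
# prints dXNlcm5hbWU6cGFzc3dvcmQ=
```

We use printf instead of echo so no trailing newline sneaks into the encoded value.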

It is possible to execute the OAuth2 flow using cURL, but it is more involved. Sometimes you can grab the authentication token from the network tab in the browser's developer tools.

Posting data using cURL

Now that we know how to retrieve data, we also need to know how to post. Posting data with cURL can be done with the --data option or the shorter form -d. An example:

curl -H 'Content-Type: application/json' \
-d '{
  "category": {
    "name": "domestic"
  },
  "name": "Mrs Kitty",
  "status": "available"
}' -X POST https://petstore.swagger.io/v2/pet

Note that you need to specify the content type header. For most APIs, this will be application/json, but some APIs mandate a different content type. An example of this is the Contentful API, which requires application/vnd.contentful.management.v1+json.

The example above posts JSON data. However, some APIs require form encoding. That type of encoding looks like the URL parameters, for example, firstname=John&lastname=Doe, and is the default encoding used when submitting an HTML form. An example is:

curl -d secret=TopSecret -d response=Token https://www.google.com/recaptcha/api/siteverify

This is the call to verify a reCAPTCHA v2 response in curl. We've left out the -X POST part because POST is the default method when data is added. curl automatically sets the Content-Type header for form data. You can see the posted headers by adding the -v flag to the command above.
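When a value contains spaces or special characters, it must be percent-encoded before it can go into form data. jq's @uri filter is a quick way to do that from the shell (the sample value is made up):

```shell
# Percent-encode a value for use in form-encoded data
printf '%s' 'John Doe & Sons' | jq -rR @uri
# prints John%20Doe%20%26%20Sons
```

Alternatively, curl's --data-urlencode option does the encoding for you when posting.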

Posting data using HTTPie

HTTPie is an alternative to curl that comes with JSON support and syntax highlighting in the terminal. In HTTPie, JSON is the default format, so we can reduce the call above to just:

https POST petstore.swagger.io/v2/pet \
'category[name]=domestic' \
'name=Mrs Kitty' \
'status=available'
The response comes back nicely formatted and colored, even without using jq. Adding the -v flag also works for HTTPie, which displays the posted JSON data.

Combining CLI commands

One of the great things about working in the terminal is that you can combine various CLI commands. We've already seen an example where we used jq to extract the name property from a JSON object, but there is much more we can do. Just one more example of what you can do with jq:

curl -s 'https://datausa.io/api/data?drilldowns=Nation&measures=Population' | jq '[.data[] | {year: (.Year|tonumber), population: .Population}]'

This CLI command transforms the JSON from the API into a cleaner list, with our own property names. You might want to try jq play online to test out transformations using jq.
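You can try this kind of transformation locally on a small sample before pointing it at the real API. The sample below mimics the shape of the response with made-up values:

```shell
# Map each record to an object with our own property names
echo '{"data": [{"Year": "2020", "Population": 326569308}]}' \
  | jq '[.data[] | {year: (.Year|tonumber), population: .Population}]'
```

The tonumber filter converts the Year string into a proper number in the output.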

Want to know if some text is in the response? We can pipe the result to the grep utility to check. Example:

curl 'http://universities.hipolabs.com/search?country=United+States' | jq . | grep -i yale

The -i flag makes the match case-insensitive. Note that we also format the output here. This is required because the whole response comes back as a single line of JSON, while grep returns the lines that match. Without formatting first, a single match would return the complete response.
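The line-based behavior is easy to see with a small example (the sample data is made up): on single-line JSON, a match returns everything, while on formatted JSON only the matching line comes back:

```shell
# Single line: the match returns the entire response
printf '{"name":"Yale University","country":"United States"}\n' | grep -i yale

# Formatted: only the line containing the match comes back
printf '{\n  "name": "Yale University",\n  "country": "United States"\n}\n' | grep -i yale
```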

Sometimes you want to replace text in the output. The sed utility can do this. This CLI tool is already available on most systems.

echo 'fixme' | sed 's/xm/x m/g'

Here we used a simple echo for demonstration, but you can do this equally well with a curl request. The parts between the slashes are the text to find and its replacement. You can use other characters instead of slashes, such as hashes or dollar signs, which is convenient when your find or replace string contains a slash. Like jq, sed has many more possibilities, but this is what I use it for most often.
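Alternative delimiters are especially handy with URLs and paths. For example, using # instead of / to rewrite an API path (the paths are made up):

```shell
# Replace a path segment without escaping every slash
echo 'https://example.com/api/v1/pets' | sed 's#/api/v1/#/api/v2/#'
```

With slashes as delimiters, the same command would need every slash in the pattern escaped.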

Parsing tabular data

We've worked with JSON and a bit of XML, but sometimes your data is just plain text. We can parse text using the awk tool. Check out this example:

$ free -m
              total        used        free      shared  buff/cache   available
Mem:           3944        1026         347          44        2570        2613
Swap:             0           0           0
$ free -m | grep Mem | awk '{print $4}'
347
$ free -m | grep Mem | awk '{print "Total: " $2 ", free: " $4}'
Total: 3944, free: 347
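awk splits each line on whitespace and exposes the pieces as $1, $2, and so on. You can see this with any text, not just the output of free (the sample line below is made up):

```shell
# $2 is the second whitespace-separated field, $3 the third
echo 'Luke Skywalker 172cm' | awk '{print $2 ", height: " $3}'
# prints Skywalker, height: 172cm
```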

Working with files

In the CLI, you can write API output to files using the > symbol:

curl https://catfact.ninja/fact > catfact.json

And append data using >>. Example:

curl https://catfact.ninja/fact | jq -r .fact >> catfacts.txt

Running that CLI example a few times gives you a text file with a cat fact on each line!

If you have multiple files on disk, you can list them using the find . command. On its own that's not very useful, but you can combine it with grep to find text in files:

find . | xargs grep -lsi domestic

And here is the next top hat magic trick: xargs. This tool feeds the output of one command as arguments to the next command (grep in this case). That's different from a plain pipe, which would send the output to the next command's standard input instead.
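The difference is easy to demonstrate with echo, which ignores its standard input but does print its arguments:

```shell
# Piped input is ignored by echo, so this prints an empty line...
printf 'one two\n' | echo

# ...but xargs turns that input into arguments, so this prints "one two"
printf 'one two\n' | xargs echo
```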

Back to requesting APIs: how can we use JSON stored in a file? With curl, you use the @ sign.

curl -X POST -d @path/to/file.json -H 'Content-Type: application/json' http://example.com/

Difference between files

Sometimes we store responses in files to inspect the differences between them, for example, to compare an API response across two environments. This can be done using the diff command:

diff -up old.txt new.txt

Make sure to format JSON and XML data first because diff works line-by-line, like grep.

Performance testing

The de facto performance testing standard in the CLI is Apache Bench (or just: ab).

ab -c 10 -n 20 http://example.com/

This command sends 20 requests with a concurrency of 10. It is handy for seeing how fast an API is and for testing server-side caching. The example above is a simple GET request, but we can post data as well:

ab -c 10 -n 20 -p input.json -T application/json http://example.com/

While ab is perfect for simple tests, its traffic is far from realistic. For testing user scenarios, you might check out Siege. For example, you can use a URL list to test with multiple endpoints by using its -f flag. With all this know-how, we can compose a sufficient but hacky way of listing all URLs in a sitemap and then requesting all URLs at random. Using two subsequent commands:

curl -s https://example.com/sitemap.xml | xmllint --format - | grep '<loc>' | sed 's/    <loc>//g' | sed 's#</loc>##g' > urls.txt
siege -f urls.txt

These CLI commands are suitable for quick checks, but we recommend tools like JMeter and hosted tools for accurate testing.


Copying to the clipboard

The terminal is nice, but I bet you're not doing everything in the terminal. Therefore, you'll want a quick way to get data from the terminal into other editors and vice versa. macOS has the pbcopy and pbpaste commands for that; on Linux, utilities like xclip or xsel play the same role. Their usage is straightforward. Get the current bitcoin price in your clipboard with the following command:

curl https://api.coindesk.com/v1/bpi/currentprice.json | jq . | pbcopy

And use pbpaste for the other way around:

pbpaste | jq .

Windows users can use the clip command instead of pbcopy.

Get work done

Working with APIs can be a bit daunting, but with a little command line mastery, it's easy to get jobs done. When you've solved your problem on the command line, you can use Flowlet to automate your task.

Want to create your API using Flowlet?