
Bash: call a URL with lots of query parameters


I make a GET call to an API in my bash script, which lets me specify an array of query parameters (e.g. items). So far I have successfully used curl as follows:

curl "https://my-fancy-api.com/api/query-items?items=abc1&items=def2&items=ghi3" -H "Accept: application/json" -H "Authorization: Bearer ${SOME_TOKEN}"

This outputs some JSON which I continue to process. So far, all works out well, as I only pass three values for the items parameter.

Now my problem: I want to scale this up and process a large number of items at once. I am talking ~500,000 items which I would need to specify as query parameters, and unfortunately the API does not provide any other method of passing them in. So I am stuck with a curl call that contains about 500k variants of &items=foo.

While building the curl command is no problem, its size now causes the (expected) failure

/usr/bin/curl: Argument list too long

I am looking for either a way to circumvent this problem with curl, or an easy tool that comes preinstalled on Ubuntu to switch to. Should there really be nothing, I would also consider breaking this up into batches. Any suggestions on how to achieve any of these without too much overhead?

Dan
The "Argument list too long" error is not due to curl but to a kernel limit (you can query it with `getconf ARG_MAX`; it defaults to about 2 MB on Ubuntu 22.04). Any command you attempt to use will hit the same issue if you must pass all those arguments. Besides, even if you got past that limit, you would likely get a "414 Request-URI Too Long" from the server, since URI limits are usually a few KB, which is far smaller than you need.
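A quick back-of-the-envelope check of both numbers (the ~12 bytes per `&items=foo` is just a rough estimate):

```shell
# Kernel limit on the combined size of argv + environment, in bytes:
getconf ARG_MAX

# Rough size of a 500,000-item query string at ~12 bytes per "&items=foo":
echo $((500000 * 12))    # ~6 MB, well past a 2 MB ARG_MAX
```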
Do you get the same error if you write the long URL into a file (`url = https://my-fancy-api.com/api/query-items?items=abc1&items=def2&items=ghi3&...`) and pass the filename to curl with the `-K` / `--config` option?
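A sketch of what that could look like, using the placeholder endpoint and items from the question; the generated `curl.cfg` can grow arbitrarily long without ever touching the argv limit:

```shell
# Write the full URL and headers into a curl config file instead of
# passing them on the command line (endpoint and items are placeholders).
{
  printf 'url = "https://my-fancy-api.com/api/query-items?items=abc1'
  printf '&items=%s' def2 ghi3          # printf repeats the format per item
  printf '"\n'
  printf 'header = "Accept: application/json"\n'
} > curl.cfg

# Then run: curl -K curl.cfg -H "Authorization: Bearer ${SOME_TOKEN}"
```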
waltinator
You'll gain more information from `xargs --show-limits`. Can you break your 500,000 parameters up and do a few (as many as will fit) at a time? Read `man xargs`.
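For the batching route, `xargs -n` can chop a one-item-per-line list into fixed-size groups; here is a sketch with a made-up `items.txt` of 250 dummy items split into batches of 100:

```shell
# Generate a fake item list, one per line (stand-in for the real data).
seq 1 250 | sed 's/^/item/' > items.txt

# xargs -n 100 hands each batch of 100 items to a small shell snippet
# that turns them into one query string per line.
batches=$(xargs -n 100 sh -c 'printf "items=%s&" "$@"; echo' _ < items.txt)

# Each line of $batches can now be appended to the base URL (after
# trimming the trailing '&') and sent as one request.
```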
Dennis
I will have to break them up. Passing the URL via the `curl` config file worked, but then the API itself gives up. So batched requests are the way to go anyway :/
Andrej Podzimek
I would probably (1) figure out what the `argv` length limit is, (2) put all the items in an array and then (3) iterate over the array, accumulating items into a URL (with the full `items=...` syntax) as long as its length is acceptable, then (4) call `curl`, process the output, empty my URL accumulator, and move on to the next batch of items. This only works under the assumption that grouping the items yields the same result as calling the REST API (or whatever it is) on each one in isolation.
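A minimal sketch of that accumulate-and-flush loop. The endpoint, the length cap, and the item list are placeholders, and the actual `curl` call is left as a comment; the cap should really reflect what the server accepts (often ~8 KB), not `ARG_MAX`:

```shell
#!/usr/bin/env bash
# Batch items into URLs no longer than a cap, one request per batch.
base="https://my-fancy-api.com/api/query-items"
max_url_len=8000

items=(abc1 def2 ghi3)           # stand-in for the real ~500k items

requests=()                      # collected here instead of calling curl
url="$base?"

flush() {
  local u="${url%&}"             # drop the trailing '&'
  # Real call would be:
  # curl "$u" -H "Accept: application/json" -H "Authorization: Bearer ${SOME_TOKEN}"
  requests+=("$u")
  url="$base?"                   # reset the accumulator
}

for it in "${items[@]}"; do
  next="items=$it&"
  # Flush the current batch before it would exceed the cap.
  if (( ${#url} + ${#next} > max_url_len )); then
    flush
  fi
  url+="$next"
done
[[ $url != "$base?" ]] && flush  # send the final partial batch
```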