I need to retrieve a large number (100K+) of responses from an API via POST requests, each with a unique JSON payload. Rather than making these calls one at a time, I'm trying to work out how to use curl -Z (--parallel), GNU parallel, xargs, a loop, or anything else to make it easier and faster.
An example of a single request is below. I know I can omit the --request POST part, since --data already makes curl send a POST.
curl --request POST \
--url https://apiendpoint.someurl.com/ \
--header 'Content-Type: application/json' \
--data '{"key1":123,"key2":"1234","key3":12345}'
The --data payload has three parameters:
key1: an integer anywhere from 0 to 999
key2: a 4-digit year (e.g. 2021)
key3: an integer anywhere from 0 to 99999
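For reference, a single payload with those types can be built with printf, assuming key1 and key3 go out as bare integers and key2 as a quoted string, like in the example above:
# prints {"key1":123,"key2":"2021","key3":12345}
printf '{"key1":%d,"key2":"%d","key3":%d}\n' 123 2021 12345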
I've tried a couple of things like:
a = {123..130}
b = {2020,2021}
c = {1..1001}
curl -Z --request POST \
--url https://apiendpoint.someurl.com/ \
--header 'Content-Type: application/json' \
--data "{\"key1\":"${a}\",\"key2\":"${b}\",\"key3\":\"${c}\"}"
and
for a in {123..130}; do for b in {2020,2021}; do for c in {1..1001}; do \
curl -X POST -H 'Content-Type: application/json' \
-d '{"key1":"'$a'","key2":"'$b'","key3":"'$c'"}' https://apiendpoint.someurl.com/ \
-o "$a-$b-$c.json"; \
done; done; done
but have had no luck getting them to work.
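I think a corrected serial version of the loop would look something like this (with the quoting fixed so key1 and key3 go out as bare integers and key2 as a quoted string, matching the example payload), but one request at a time is far too slow for 100K+ calls:
# serial version: works, but makes one request at a time
for a in {123..130}; do
  for b in {2020,2021}; do
    for c in {1..1001}; do
      curl --silent \
        --header 'Content-Type: application/json' \
        --data '{"key1":'"$a"',"key2":"'"$b"'","key3":'"$c"'}' \
        --output "$a-$b-$c.json" \
        https://apiendpoint.someurl.com/
    done
  done
done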
Is there a way to pull the --data payloads from a single txt file with one payload per line?
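If so, I imagine generating that file with nested loops, something like this (payloads.txt is just a working name, and these are the smaller test ranges from my attempts above):
# one JSON payload per line
for a in {123..130}; do
  for b in {2020,2021}; do
    for c in {1..1001}; do
      printf '{"key1":%d,"key2":"%d","key3":%d}\n' "$a" "$b" "$c"
    done
  done
done > payloads.txt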
I also need to save each response to its own file, named a-b-c.json.
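From the curl man page it looks like a config file (-K/--config) with next separating the requests might cover both needs at once, so each request carries its own data and output line, and the whole file runs with --parallel. Something like this is what I have in mind (untested; requests.txt and the --parallel-max value are just examples):
# build a curl config file: one request block per payload, separated by "next"
for a in {123..130}; do
  for b in {2020,2021}; do
    for c in {1..1001}; do
      printf 'url = "https://apiendpoint.someurl.com/"\n'
      printf 'header = "Content-Type: application/json"\n'
      printf 'data = "{\\"key1\\":%d,\\"key2\\":\\"%d\\",\\"key3\\":%d}"\n' "$a" "$b" "$c"
      printf 'output = "%d-%d-%d.json"\n' "$a" "$b" "$c"
      printf 'next\n'
    done
  done
done > requests.txt
# run all requests concurrently, up to 50 at a time
curl --parallel --parallel-max 50 --config requests.txt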
Any help is greatly appreciated. I'm a relative novice and learning.
Thanks.