Score:2

How to log a URL to a file only if its HTTP status code is 200?

ro flag

I want to make a script that checks whether I can connect to several URLs.

For example, I do curl -i -L --silent URL | grep "HTTP/2 200" >> *.txt

I want to combine it with an if statement, along the lines of: if grep "HTTP/2 200" >> *.txt succeeds, then also do grep "Location" >> *.txt

So if the grep for "HTTP/2 200" matches, it should grep the "Location" (URL) as well. I can't get it to work though.

Also, I want to echo "-------------------------------------" between the output for each URL, so it shows up like this in the file:

#URL1
HTTP(1)
URL(1)

#URL2
HTTP(2)
URL(2)

and so on.
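
To illustrate, this is roughly the structure I'm aiming for (just a sketch; urls.txt and result.txt are placeholder names I made up, and the if part is what I can't get right):

#!/bin/bash
# Read each URL from a list, fetch it, and log it only on HTTP/2 200.
while read -r url; do
    response=$(curl -i -L --silent "$url")
    if printf '%s\n' "$response" | grep -q "HTTP/2 200"; then
        echo "-------------------------------------" >> result.txt
        # Log the status line and the Location header (HTTP/2 lowercases header names).
        printf '%s\n' "$response" | grep "HTTP/2 200" >> result.txt
        printf '%s\n' "$response" | grep -i "location" >> result.txt
    fi
done < urls.txt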

Thanks for any help.

Raffa avatar
jp flag
Yep, possible ... `curl -i -L --silent URL | if grep -q "HTTP/2 200"; then grep "Location"; fi` ... But, please give examples to make your question clearer so we can help you better ... i.e. Be specific please.
dummyuser avatar
uy flag
Did you think about `curl -i -L --silent URL | grep "HTTP/2 200" | grep "Location" >> *.txt`? This may solve the first part of your question.
cc flag
Check the bash man pages for the `&&` construct: it executes what follows only if the previous command succeeded, e.g. `command1 && command2`.
Bodo avatar
pt flag
Please [edit] your question and show some example output of `curl -i -L --silent URL` and the corresponding expected result in your output file.
Score:2
jp flag

If you want the HTTP response/status and the URL saved to a file in the format you desire, then curl alone can do that. It has the option -w or --write-out, which defines what to display on stdout after a completed and successful operation. The format is a string that may contain plain text mixed with any number of variables, e.g. -w "-----------\nStatus: %{http_code}\nURL: %{url}\n". If you then redirect the rest of the output to e.g. /dev/null, you end up with a clean result of just what you want, and you can redirect that result and append it to a file called e.g. url.txt like so:

curl -o /dev/null -s -w "-----------\nStatus: %{http_code}\nURL: %{url}\n" https://askubuntu.com >> url.txt

If you have a list of URLs (each on a new line) in a file called e.g. list.txt like so:

$ cat list.txt
https://askubuntu.com/
https://unix.stackexchange.com/
https://notworkingurlblabla.com/

Then you can check all the URLs in that file, filter out non-200 status URLs, and append the result to url.txt all at once, e.g. like so:

xargs -n 1 <list.txt curl -o /dev/null -s -w "%{http_code} %{url}\n" | \
awk '{if ($1 == "200") printf "------------\nStatus: "$1"\nURL: "$2"\n"}' >> url.txt

You can, of course, also use awk to filter the output for a single URL, like so:

curl -o /dev/null -s -w "%{http_code} %{url}\n" https://askubuntu.com/ | \
awk '{if ($1 == "200") printf "------------\nStatus: HTTP "$1"\nURL: "$2"\n"}' >> url.txt
bahsnub avatar
ro flag
How can I output the HTTP 200 and Location to a file, though?
bahsnub avatar
ro flag
@dummyuser Sadly no. I don't want the HTTP status + the Location saved to the file unconditionally; I only want them saved if there is an HTTP/2 200. With your command, the Location would be saved anyway, even if there is no HTTP 200.
Raffa avatar
jp flag
@user2315 `curl -o /dev/null -s -w "%{http_code} %{url}\n" https://askubuntu.com/ | awk '{if ($1 == "200") printf "------------\nStatus: HTTP "$1"\nURL: "$2"\n"}' >> url.txt`
bahsnub avatar
ro flag
Thank you for your fast answer. I need a little more help though: I tried the second thing you wrote, with xargs and the list.txt file. I copied everything exactly as you wrote it, but it doesn't work for me. Do I have to change anything from your original post?
Raffa avatar
jp flag
@user2315 Do you have a file called `list.txt` that contains URLs like the example I show in my answer? ... It needs to be in the same working directory where you run the command.
Raffa avatar
jp flag
@user2315 Oh, probably obvious, but you need to copy the two lines (xargs and awk) together and paste them into the terminal at once ... It's just one line broken into two by the backslash "\" for readability purposes. :-)
bahsnub avatar
ro flag
Oh okay, thank you. I have my script like this now: `xargs -n 1 <url.txt curl -o /dev/null -s -w "%{http_code} %{url_effective}\n" | awk '{if ($1 == "200") printf "------------\nStatus: "$1"\nURL: "$2"\n"}' >> test.txt` It doesn't work for me with "url" though, so I tried "url_effective", but it still doesn't copy the output to the test.txt file. My file with all the URLs is url.txt, so I changed that as well.
bahsnub avatar
ro flag
I was missing the -i -L because I get 3 HTTP status codes because of nginx, but now it works. Thank you.
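
For reference, the full pipeline I ended up with looks roughly like this (a sketch using my file names from above; -L follows redirects, so %{http_code} reports the final status):

# check every URL in url.txt and keep only the ones whose final status is 200
xargs -n 1 <url.txt curl -i -L -o /dev/null -s -w "%{http_code} %{url_effective}\n" | \
awk '{if ($1 == "200") printf "------------\nStatus: "$1"\nURL: "$2"\n"}' >> test.txt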