Score:0

Cron job to visit each URL in a list but not save them

cz flag

I have a text file, list.txt, that has a bunch of URLs, one per line. I want to set up a cron job that wgets/curls each URL in the file once a day but does not save anything to the computer.

I tried running this in the terminal first.

wget -i /root/list.txt -O /dev/null

Understandably, the command doesn't work: it saves list.txt itself to /dev/null rather than the files from the URLs inside list.txt, and then it says "No URLs found".

So how do I do this properly: wget each URL from a list, but don't save anything on the computer?

Romeo Ninov avatar
in flag
What is the content of `/root/list.txt`? As which user do you run the cron job?
diya avatar
la flag
In general, `wget` is specifically designed to download content from the web, whereas `curl` is much more versatile and, for example, allows you to use the [`HEAD` method](https://www.rfc-editor.org/rfc/rfc9110.html#name-methods), which may save you bandwidth and time when you're discarding the output anyway.
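As an illustration of that comment, a minimal sketch (the URL is a placeholder): `--head` makes curl send a HEAD request, so only the response headers travel over the wire, and even those are discarded here.

curl --head --silent --output /dev/null https://example.com/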
Sahriar Saikat avatar
cz flag
@RomeoNinov "bunch of URLs, one every line"
Score:3
in flag

If you don't want to download the URLs, suppress the download by using --spider. You can also remove the clutter with -q, which has the additional benefit that actual errors are still handled by crond and forwarded, if set up properly.

wget -i /root/list.txt --spider -q
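To run it once a day, a crontab entry along these lines would work; the 03:00 run time is just an example:

0 3 * * * wget -i /root/list.txt --spider -q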
Sahriar Saikat avatar
cz flag
Thanks, that worked!
Sahriar Saikat avatar
cz flag
How do I make it work for URLs? This only works for local files. When I try `wget -i https://example.com/list.txt --spider`, it shows `example.txt.tmp: No such file or directory No URLs found in https://example.com/list.txt.`
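One possible workaround, not from the thread itself: fetch the list separately and feed it to wget on standard input, since `-i -` reads the URL list from stdin:

wget -qO- https://example.com/list.txt | wget --spider -q -i -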
Score:1
co flag
SBO

Not sure why you're using wget here, as the primary goal of that tool is to download file(s).

With curl and a simple loop it should work; something like this:

for i in $(cat list.txt); do curl "$i"; done

And nothing will be downloaded, just a hit on the targeted websites in your text list.

Sahriar Saikat avatar
cz flag
What if my list.txt has to be fetched over HTTP, like https://example.com/list.txt?
in flag
This will actually download the URLs AND print them (which will then be forwarded by cron), just like wget. You want to add `--silent --output /dev/null` to prevent that.
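Putting that correction together, a sketch of the loop with the output discarded, reading the list over HTTP as asked in the comment above (the URL is a placeholder):

curl -s https://example.com/list.txt | while read -r url; do curl --silent --output /dev/null "$url"; done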