That's WordPress you're using there. I'm sure there are better ways to warm up the cache: https://de.wordpress.org/plugins/search/cache+warm/
Create a file like warmup.sh with the following code:
#!/bin/bash
# URL of the main sitemap index
sitemap_url="https://www.vgopromo.com/wp-sitemap.xml"
# Extract the URLs of all sub-sitemaps
sitemap_urls=$(curl -s "$sitemap_url" | grep -oP '(?<=<loc>)[^<]+')
# Loop over the sub-sitemaps and request each page URL
for sitemap in $sitemap_urls; do
    urls=$(curl -s "$sitemap" | grep -oP '(?<=<loc>)[^<]+')
    for url in $urls; do
        curl -IL "$url"
    done
done
This will do what you requested.
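To run it manually, make the script executable first (assuming it is saved as /root/warmup.sh, matching the cron example below):

chmod +x /root/warmup.sh
/root/warmup.sh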
You can also set up a cron job that runs this file:
# Example: At minute 15 past every hour.
15 */1 * * * /bin/bash /root/warmup.sh
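You can add that line with crontab -e; a non-interactive alternative (a sketch that appends to the current user's existing crontab) would be:

(crontab -l 2>/dev/null; echo '15 */1 * * * /bin/bash /root/warmup.sh') | crontab -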
EDIT
This modified version adds the option to define subdomains as well.
#!/bin/bash
# Array of subdomains; extend it following the same principle
subdomains=("www" "subdomain_2")
# Loop over the subdomains and retrieve their URLs
for subdomain in "${subdomains[@]}"; do
    sitemap_url="https://$subdomain.vgopromo.com/wp-sitemap.xml"
    sitemap_urls=$(curl -s "$sitemap_url" | grep -oP '(?<=<loc>)[^<]+')
    for sitemap in $sitemap_urls; do
        urls=$(curl -s "$sitemap" | grep -oP '(?<=<loc>)[^<]+')
        for url in $urls; do
            curl -IL "$url"
        done
    done
done
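If a site has many URLs, the sequential loop can take a while. As a sketch, assuming GNU xargs and grep (the script already relies on GNU grep for -P, and the parallelism level of 4 is arbitrary), you could warm several URLs at once:

#!/bin/bash
# Sketch: warm all URLs from one sitemap index, 4 requests in parallel.
sitemap_url="https://www.vgopromo.com/wp-sitemap.xml"
curl -s "$sitemap_url" \
  | grep -oP '(?<=<loc>)[^<]+' \
  | xargs -I{} curl -s {} \
  | grep -oP '(?<=<loc>)[^<]+' \
  | xargs -P 4 -n 1 curl -sIL -o /dev/null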
For multiple projects, I'd personally suggest the following variant, since it works across any set of domains:
#!/bin/bash
# Array of sitemap URLs, one per project; extend it following the same principle
domains=("https://www.vgopromo.com/wp-sitemap.xml" "https://example.vgopromo.com/wp-sitemap.xml")
# Loop over the projects and retrieve their URLs
for domain in "${domains[@]}"; do
    sitemap_urls=$(curl -s "$domain" | grep -oP '(?<=<loc>)[^<]+')
    for sitemap in $sitemap_urls; do
        urls=$(curl -s "$sitemap" | grep -oP '(?<=<loc>)[^<]+')
        for url in $urls; do
            curl -IL "$url"
        done
    done
done
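One caveat that applies to all three variants: curl -I sends a HEAD request, and some cache layers only store a page on a full GET. If your caching plugin behaves that way (worth verifying for your setup), fetch the body and discard it instead:

curl -sL -o /dev/null "$url"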