
Error 503 service unavailable - varnish,drupal & nginx


We have a Varnish server running on port 6081 and an Nginx server hosting a Drupal website on another machine. The VCL file is as follows:

# Default backend definition. Set this to point to your content server.
backend default {
.host = "192.168.xyz.ab"; // Nginx server IP
.port = "80";
.connect_timeout = 60s;
.between_bytes_timeout = 60s;
}

We can connect directly to port 80 of the Nginx server and the page loads. But when we access http://varnishserverip:6081, we get:

Error 503 Backend fetch failed
Backend fetch failed

Guru Meditation:
XID: 65539

Varnish cache server

We see "service unavailable" in the varnishncsa output, but we're not sure where the actual issue lies. Any help would be really appreciated.

Update after the first response

I don't see any failure here, but please guide me with your thoughts. FYI: under "Host" in the health probe, I have changed localhost to the destination server IP (192.168.xyz.ab).

Update after curl

root@ip-192-168-xyz-ab:~# curl -I http://3.110.xyz.ab
HTTP/1.1 200 OK
Server: nginx/1.18.0 (Ubuntu)
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Set-Cookie: SESSb33ae58c46135429be459dc6c2c59eae=XOoT6GqWSjeL2aLcFuw11CsaLFFVrygrZXJ1cpGGkafyMf9u; expires=Thu, 01-Dec-2022 14:06:47 GMT; Max-Age=2000000; path=/; HttpOnly
Cache-Control: must-revalidate, no-cache, private
Date: Tue, 08 Nov 2022 10:33:26 GMT
X-Drupal-Dynamic-Cache: UNCACHEABLE
Link: <http://3.110.xyz.ab/>; rel="canonical", <http://3.110.xyz.ab/>; rel="shortlink"
X-UA-Compatible: IE=edge
Content-language: en
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Expires: Sun, 19 Nov 1978 05:00:00 GMT
X-Generator: Drupal 9 (https://www.drupal.org)
X-XSS-Protection: 1; mode=block

See https://www.varnish-software.com/developers/tutorials/troubleshooting-varnish/#backend-health-monitoring to learn how to debug backend errors.

Long story short:

Register a health probe

To monitor backend health in real time, it is advised to register a health probe.

Here's how you do this in VCL:

vcl 4.1;

probe health {
    .request =
        "HEAD / HTTP/1.1"
        "Host: localhost"
        "Connection: close"
        "User-Agent: Varnish Health Probe";
    .interval  = 10s;
    .timeout   = 5s;
    .window    = 5;
    .threshold = 3;
}

backend default {
    .host = "192.168.xyz.ab";
    .port = "80";
    .probe = health;
}
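
If you don't need a fully custom probe request, VCL also supports a simpler `.url` form. The sketch below is a variant of the config above (the IP is still a placeholder); it also sets `.host_header`, which Varnish adds as the Host header to `.url`-style probes and to backend requests that lack one:

```vcl
vcl 4.1;

probe health {
    .url       = "/";
    .interval  = 10s;
    .timeout   = 5s;
    .window    = 5;
    .threshold = 3;
}

backend default {
    .host        = "192.168.xyz.ab";
    .port        = "80";
    .host_header = "192.168.xyz.ab";
    .probe       = health;
}
```

This matters for Drupal sites that respond differently (redirects, 404s) depending on the Host header: a probe that sends the wrong Host can report the backend as sick even though it is reachable.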

Use varnishlog to monitor the backend health

Once the probe is configured, run the following command to monitor the health of your backend:

sudo varnishlog -g raw -i Backend_health

The output should already give an indication of what's wrong. If you want to go further, trigger the page yourself and run the following command to get an in-depth view of the failing request:

sudo varnishlog -g request -q "VCL_call eq 'BACKEND_ERROR'"
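
If the logs point at connection problems rather than HTTP errors, a quick TCP reachability check from the Varnish server helps rule out network issues. A minimal bash sketch (the host and port in the final line are placeholders for your backend's values):

```shell
#!/usr/bin/env bash
# check_backend HOST PORT -> prints "reachable" or "unreachable"
# Uses bash's /dev/tcp pseudo-device, so no extra tools are required.
check_backend() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}

check_backend 192.168.xyz.ab 80   # replace with your backend IP and port
```

If this prints "unreachable" from the Varnish server while the site works locally on the backend, the problem is a firewall or routing issue, not Varnish.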

Next steps

Hopefully the logs will give you the answers you're looking for. If the output confuses you, don't hesitate to add the full varnishlog output to your question so I can help you examine it.

serverstackqns: Thanks Thijs for the detailed explanation. Please find my update in the question.
Thijs Feryn: Unfortunately the error message is rather vague. Can you run `curl -I http://192.168.xyz.ab` from the command line of your Varnish server? We need to figure out whether the Nginx webserver is reachable from the Varnish server.
serverstackqns: Please see the update.
Thijs Feryn: @serverstackqns I'm a bit confused. You're doing a `curl` request to `3.110.xyz.ab` from server `ip-192-168-xyz-ab`, yet your VCL backend points to `192.168.xyz.ab`. Are you sure `.host = "192.168.xyz.ab"` is the right value?