
Nginx Reverse Proxy 403 Errors on POST Requests


I'm trying to set up a stack of services in Docker: Unifi, PHP, Nginx, and Certbot, where Unifi and PHP are the backend services and Nginx serves them in reverse proxy mode, while Certbot runs periodically to get SSL certs for Nginx.

I have it mostly working: all GET requests succeed and I can view the page that Unifi serves. However, every POST request made via AJAX throws a 403 error due to CORS.

Now, I'm not very familiar with manipulating CORS headers, and I'm not sure what's actually causing the error: the browser, Nginx, or Unifi? Nginx is the only one whose configuration I can change.

Here's the error I get on every AJAX POST request, taken from the browser inspector/network monitor:

POST
scheme                    https
host                      example.com:8443
filename                  /api/stat/device
Address                   (server_ip_address):8443
Status                    403 Forbidden
Version                   HTTP/2
Transferred               141 B (0 B size)
Referrer Policy           strict-origin-when-cross-origin
    
    
RESPONSE HEADERS    
content-length            0
content-type              text/plain
date                      Fri, 17 Sep 2021 00:59:09 GMT
server                    nginx
X-Firefox-Spdy            h2
    
    
REQUEST HEADERS 
Accept                    application/json, text/plain, */*
Accept-Encoding           gzip, deflate, br
Accept-Language           en-US,en;q=0.5
Connection                keep-alive
Content-Length            0
Cookie                    unifises=(random token here); csrf_token=(random token here)
DNT                       1
Host                      example.com:8443
Origin                    https://example.com:8443
Referer                   https://example.com:8443/setup/configure/controller-name
Sec-Fetch-Dest            empty
Sec-Fetch-Mode            cors
Sec-Fetch-Site            same-origin
TE                        trailers
User-Agent                Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:92.0) Gecko/20100101 Firefox/92.0
X-Csrf-Token              (random token here)

Here's the Nginx config:

# enables GZIP compression
gzip on;

# compression level (1-9)
# 6 is a good compromise between CPU usage and file size
gzip_comp_level 6;

# minimum file size limit in bytes to avoid negative compression
gzip_min_length 256;

# compress data for clients connecting via proxies
gzip_proxied any;

# directs proxies to cache both the regular and GZIP versions of an asset
gzip_vary on;

# disables GZIP compression for ancient browsers
gzip_disable "msie6";

server {
    listen 80;
    listen [::]:80;

    server_name example.com;

    location ~ /.well-known/acme-challenge {
        allow all;
        root /var/www/certbot/;
    }

    # Redirect remaining Unifi paths to the Unifi address and port
    location / {
        rewrite ^ https://$host:8443$request_uri?;
    }
}
server {
    listen 8443 ssl http2;
    listen [::]:8443 ssl http2;

    server_name example.com;

    server_tokens off;

    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    ssl_buffer_size 8k;

    ssl_dhparam /etc/ssl/certs/dhparam-2048.pem;

    ssl_protocols TLSv1.2 TLSv1.1 TLSv1;
    ssl_prefer_server_ciphers on;

    ssl_ciphers ECDH+AESGCM:ECDH+AES256:ECDH+AES128:DH+3DES:!ADH:!AECDH:!MD5;

    ssl_ecdh_curve secp384r1;
    ssl_session_tickets off;

    ssl_stapling on;
    ssl_stapling_verify on;
    resolver 1.1.1.1 1.0.0.1 208.67.222.222 208.67.220.220;


    location / {

        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Credentials' 'true';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type';

        proxy_pass https://unifi:8443/;
        proxy_set_header Authorization "";
        proxy_pass_request_headers on;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Host $remote_addr;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-Ssl on;
        proxy_http_version 1.1;
        proxy_buffering off;
        proxy_redirect off;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        auth_basic "Restricted";
        proxy_set_header Referer "";

    }
}

I've tried more guides on and off Stack Exchange than I can keep track of, which is why my config is now so messy.

So, how do I modify Nginx so that it serves these XHR requests without them failing due to CORS?
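
From what I've read, two details might matter here: Nginx's add_header only attaches headers to success responses unless the always flag is given (which would explain why the 403 above has no Access-Control-* headers at all), and a wildcard Access-Control-Allow-Origin can't be combined with Access-Control-Allow-Credentials: true. Is something along these lines the right direction? This is an untested sketch that echoes the request's own Origin back:

    location / {
        # Hypothetical: answer CORS preflights directly instead of proxying them
        if ($request_method = OPTIONS) {
            add_header 'Access-Control-Allow-Origin' "$http_origin" always;
            add_header 'Access-Control-Allow-Credentials' 'true' always;
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always;
            add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type' always;
            return 204;
        }

        # "always" so the headers are also attached to 4xx/5xx responses
        add_header 'Access-Control-Allow-Origin' "$http_origin" always;
        add_header 'Access-Control-Allow-Credentials' 'true' always;
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS' always;
        add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type' always;

        proxy_pass https://unifi:8443/;
        # ...rest of the proxy_* directives as in the config above...
    }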

Edit 1: I added port 443 to the Nginx listen ports alongside 8443. If I access Unifi over 443 and proxy it to unifi:8443, everything works as expected, but I need it to work transparently on 8443.
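
Concretely, Edit 1 just adds two more listen directives to the existing TLS server block, roughly like this:

    # Edit 1: added alongside the existing 8443 listeners
    listen 443 ssl http2;
    listen [::]:443 ssl http2;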

Edit 2: I tried adding another "middle-man" Nginx container with a slightly modified config: requests to port 8443 on the original Nginx container were proxied to the second container on port 443, which in turn reverse-proxied to Unifi on 8443. Same result as without the man-in-the-middle proxy: Web -> Nginx on 8443 -> Nginx on 443 -> Unifi on 8443. I removed this config since it didn't work, and it's inefficient anyway.
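
For completeness, the removed middle-hop setup looked roughly like this (nginx-middle is just a placeholder name for the second container):

# original Nginx container: 8443 -> second Nginx container on 443
server {
    listen 8443 ssl http2;
    server_name example.com;
    # ...same ssl_* settings as above...
    location / {
        proxy_pass https://nginx-middle:443/;
    }
}

# second Nginx container: 443 -> Unifi on 8443
server {
    listen 443 ssl http2;
    server_name example.com;
    location / {
        proxy_pass https://unifi:8443/;
    }
}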

Comment from Petr 'PePa' Pavel: Just wondering, did you happen to find a solution? If so, could you please answer your own question to help others?