
Reuse source ports when connecting to multiple upstream servers?


I am trying to set up a reverse proxy to serve multiple STUN clients and forward their traffic to multiple upstream servers. I have increased the file descriptor limit for the service, so that shouldn't be an issue. My nginx.conf looks more or less like this:

user root;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;
include /etc/nginx/sites-enabled/*.conf;

events {
    worker_connections 1000000;
    multi_accept on;
}

stream {

    upstream stun_backend {
        server 192.168.1.10:3478 max_fails=0;
        server 192.168.1.11:3478 max_fails=0;
    }
    server {
        proxy_timeout 30m;
        listen 3478 udp reuseport;
        proxy_pass stun_backend;
    }
}

Everything works fine until I reach around 28232 "udp ESTAB" connections to the upstream servers (my ephemeral port range is exactly this size). At that point I start getting "Resource temporarily unavailable" errors and clients lose the ability to contact the STUN servers.
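For reference, that 28232 figure matches the default Linux ephemeral port range of 32768–60999 (an assumed distribution default; the actual range on a given box is in /proc/sys/net/ipv4/ip_local_port_range):

```python
# Size of the default Linux ephemeral port range (assumed defaults;
# verify with: cat /proc/sys/net/ipv4/ip_local_port_range).
low, high = 32768, 60999
ports = high - low + 1
print(ports)  # 28232 usable source ports
```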

I was under the impression that the "IP_BIND_ADDRESS_NO_PORT" socket option, which NGINX has used since version 1.11.2, would help here by allowing a source port to be reused as long as the destination differs, keeping each 4-tuple unique. If that were the case, I should realistically be able to serve 28232 * number_of_upstream_hosts connections, but that doesn't seem to happen.
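To make my expectation concrete, here is a minimal sketch of how I understand the option to work, outside NGINX (assumed Linux; the numeric value 24 is a fallback for Python builds that don't expose socket.IP_BIND_ADDRESS_NO_PORT, and the loopback destinations are just placeholders):

```python
import socket

# IP_BIND_ADDRESS_NO_PORT is meant to defer local port selection from
# bind() to connect(), so the port can be scoped to the full 4-tuple.
# 24 is its numeric value on Linux (assumption: fallback constant).
IP_BIND_ADDRESS_NO_PORT = getattr(socket, "IP_BIND_ADDRESS_NO_PORT", 24)

def connected_udp_socket(dest):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, IP_BIND_ADDRESS_NO_PORT, 1)
    s.bind(("0.0.0.0", 0))  # with the option set, no port should be reserved yet
    s.connect(dest)         # local port chosen here, for this destination
    return s

a = connected_udp_socket(("127.0.0.1", 3478))
b = connected_udp_socket(("127.0.0.2", 3478))
print(a.getsockname(), "->", a.getpeername())
print(b.getsockname(), "->", b.getpeername())
```

Whether the kernel actually lets two connected UDP sockets share a source port this way seems to depend on the kernel version; printing the chosen local ports makes it easy to check on a given system.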

Am I mistaken about what this option is meant to do? Is what I'm trying to do even achievable? If so, how?

A.B
I'm surprised that your setup works. STUN is supposed to receive unaltered connections so it can figure out [what kind of network alteration](https://en.wikipedia.org/wiki/STUN#/media/File:STUN_Algorithm3.svg) the client is behind. With a reverse proxy between, this probably makes STUN's work harder.
Dawid T.
@A.B well, in this case STUN's only task is to ensure there's a way for the server to send a payload to the client when needed (it's an app-specific feature). It works fine, but unfortunately the NGINX node eventually runs out of free ports to the upstream servers. There will be tens of thousands of concurrent connections from different clients, so I need each NGINX node to support as many clients as possible.