Score:1

Is it possible to run a mail server on an nginx layer 4 server?

cn flag

I've just created a layer-4 nginx server, which is used as a load balancer for a Kubernetes cluster.

I know it is a very basic question (and I'll probably get a lot of downvotes for it), but is it possible to install and run a mail server on the same machine? Does the mail server traffic just run on different ports, so it shouldn't be a problem?

My nginx config looks like this:

user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
    worker_connections 768;
}

stream {
    upstream http {
        server 123.45.67.170:80;
        server 123.45.67.171:80;
        server 123.45.67.172:80;
    }
    upstream ssl {
        server 123.45.67.170:443;
        server 123.45.67.171:443;
        server 123.45.67.172:443;
    }
    server {
        listen 80;
        proxy_pass http;
    }
    server {
        listen 443;
        proxy_pass ssl;
    }
}

http {
    sendfile on;
    tcp_nopush on;
    types_hash_max_size 2048;

    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;
    ssl_prefer_server_ciphers on;

    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    gzip on;

    include /etc/nginx/conf.d/*.conf;
}

and I think I have to add something like this:

mail {
    server_name mail.test.com;
    auth_http   http://127.0.0.1:8000;
    xclient off;
    server {
        listen     3333;
        protocol   smtp;
        smtp_auth  none;
    }
}
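
As far as I understand, the mail block above would also require nginx to be built with the mail module and a working authentication service behind auth_http. Since the mail server would really just be another TCP backend, maybe I could instead skip the mail module entirely and add something like this to the existing stream block (the backend addresses are just placeholders):

    # hypothetical mail backends - replace with the real mail server addresses
    upstream smtp {
        server 123.45.67.170:25;
        server 123.45.67.171:25;
    }
    server {
        listen 25;
        proxy_pass smtp;
    }

(Obviously nginx could only listen on port 25 itself if no local mail server is bound to that port.)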

But if this is correct, should I install the mail server on this machine itself, or should I use a mail server running in the k8s cluster?

Score:1
ua flag

Your assumptions are correct: as long as those ports are not in use, you can run whatever you like alongside it. Whether it is a clever thing to do is another question, but yes, it is absolutely possible.

I would highly recommend running the mail server on a different machine/instance for various reasons. For production purposes you might also want to consider running it on a different IP, which makes it easier to deal with blocklist issues.

No worries, this platform is for learning, and this is exactly the kind of question that is not simply googlable. The mods will probably disagree :)

user3142695
cn flag
Thanks for your kind reply. I was thinking that since this machine is only used as a load balancer, there is a lot of unused capacity left. That's why I thought I could run a small mail server to use the machine a bit more effectively. Could you please comment on whether the mail server should run on this machine or as a Kubernetes pod?
proxx
ua flag
If this is going to run in a production environment, then I would certainly run the mail server in a dedicated pod. Security alone would be plenty of reason, but also imagine that when you want to move or upgrade different parts of the system you might run into problems: let's say the load balancer requires higher availability in the future, what would you do with that mail server? Again, would it work? Yes! Is it clever from an engineering standpoint? Definitely not! Besides, if your load balancer has that much overcapacity, then you might be doing something wrong :)