Score:0

How do I run two Nginx Web Servers on the same machine?


I want to host a website and run a microservices project on my own server.

  1. The website will be served by the Nginx web server. The domain will look like sampleapp.com, and the site will use FreeSSL.

  2. One of the services of the microservices project will be served by Nginx running in a Docker container. This service uses subdomains of sampleapp.com, such as api-dev.sampleapp.com, and those subdomains should also work with SSL.

When I try to deploy the services with docker-compose, I get the following error:

[warn] 1#1: conflicting server name "api-dev.sample.com" on 0.0.0.0:80, ignored

My main concern is how to set up SSL inside Docker, since 443 is the default port for HTTPS.

The Nginx config file of my microservice is shown below:

worker_processes auto;

events {
  worker_connections 1024;
}

http {

  server {
    listen 80 default_server;
    server_name "";
    return 444;
  }

  server {
    server_name game-dev.sampleapp.com;

    location / {
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $http_host;
      proxy_set_header X-NginX-Proxy true;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection "upgrade";

      proxy_pass http://game_nodes;
      proxy_redirect off;
    }
  }
  server {
    if ($host = game-dev.sampleapp.com) {
      return 301 https://$host$request_uri;
    }


    listen 80;
    listen [::]:80;
    server_name game-dev.sampleapp.com;
    return 404;
  }

  upstream game_nodes {
    # enable sticky session
    # ip_hash;
    server game-alpha:3000;
    keepalive 8;
  }

  server {
    server_name api-dev.sampleapp.com;

    location / {
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $http_host;
      proxy_set_header X-NginX-Proxy true;

      proxy_pass http://main_nodes;
      proxy_redirect off;

    }
  }

  server {
   # if ($host = api-dev.sampleapp.com) {
    #  return 301 https://$host$request_uri;
    #}

    listen 80;
    listen [::]:80;
    server_name api-dev.sampleapp.com;
    return 404;
  }

  upstream main_nodes {
    server main-alpha:8000;
    server main-beta:8000;
    keepalive 8;
  }
}

The Nginx config file of the website is shown below:

server {
    listen 8080;
    listen [::]:8080;
    listen 8443 ssl http2;
    listen [::]:8443 ssl http2;

    server_name  sampleapp.com www.sampleapp.com;
    root /var/www/sampleapp.com;
    index index.html;

    ssl_certificate /etc/ssl/certs/sampleapp.com.pem;
    ssl_certificate_key /etc/ssl/private/sampleapp.com.key;
    ssl_client_certificate /etc/ssl/certs/origin-pull-ca.pem;
    ssl_verify_client on;

    client_max_body_size 100M;
  
    autoindex off;

    location / {
        try_files $uri $uri/ =404;

    }

}

I'm a developer, not a sysadmin, so I'm having a hard time figuring out the best way to do this.

djdomi:
nginx commonly provides sites-available and sites-enabled directories for adding vhosts to the service.
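To illustrate the comment above: on Debian/Ubuntu-style layouts, a vhost file lives in sites-available and is activated by symlinking it into sites-enabled. A minimal sketch, assuming the stock package paths and a vhost file named sampleapp.com (adjust both to your system):

```shell
# Enable the sampleapp.com vhost by symlinking it into sites-enabled.
sudo ln -s /etc/nginx/sites-available/sampleapp.com /etc/nginx/sites-enabled/sampleapp.com

# Validate the configuration before applying it.
sudo nginx -t

# Reload nginx to pick up the new vhost.
sudo systemctl reload nginx
```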
Score:1

One possible solution is to run three Docker containers with Nginx:

  1. one that listens on ports 80 and 443, offloads SSL, and acts as a reverse proxy for the other two containers;
  2. one that serves the website;
  3. one that serves the microservice.
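A minimal sketch of the first container's nginx.conf, assuming the other two containers are reachable on the compose network under the hypothetical service names `website` and `microservice`, and that your certificate covers both the apex domain and the subdomains (a wildcard or SAN cert):

```nginx
events {}

http {
  # TLS is terminated here; the backends only ever see plain HTTP.
  server {
    listen 443 ssl;
    server_name sampleapp.com www.sampleapp.com;

    ssl_certificate     /etc/ssl/certs/sampleapp.com.pem;
    ssl_certificate_key /etc/ssl/private/sampleapp.com.key;

    location / {
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_pass http://website:8080;   # website container, hypothetical name/port
    }
  }

  server {
    listen 443 ssl;
    server_name api-dev.sampleapp.com;

    # Reuses the same cert; adjust if the subdomains have their own.
    ssl_certificate     /etc/ssl/certs/sampleapp.com.pem;
    ssl_certificate_key /etc/ssl/private/sampleapp.com.key;

    location / {
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_pass http://microservice:80;   # microservice container, hypothetical name/port
    }
  }

  # Redirect all plain-HTTP traffic to HTTPS.
  server {
    listen 80;
    server_name sampleapp.com www.sampleapp.com api-dev.sampleapp.com;
    return 301 https://$host$request_uri;
  }
}
```

With this layout only the proxy container publishes ports 80 and 443 to the host; the website and microservice containers expose their ports on the internal compose network only, so there is no port conflict and no need to handle SSL inside the backend containers.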

