Score:1

Nginx + Socket.io + Nodejs: How to configure socket.io on custom path

Ubuntu 22.04
Nginx: 1.23.2
Nodejs: 12.22.9
Socket.io: 2.0.3

I had a basic Node.js chat room running behind Apache2, but I migrated everything over to Nginx and I'm having trouble getting the chat room working again. I am trying to serve the chat app under the /cnode2/ path rather than at the root URL (as I have other apps).

When I try loading the page, I am getting this error:

WebSocket connection to 'wss://www.*****.com/socket.io/?EIO=3&transport=websocket' failed: 
r.doOpen @ websocket.js:112
r.open @ transport.js:80
r.open @ socket.js:245
r @ socket.js:119
r @ socket.js:28
r.open.r.connect @ manager.js:226
r @ manager.js:69
r @ manager.js:37
r @ index.js:60
(anonymous) @ (index):539
cmain.js:160 connect_error: {"type":"TransportError","description":{"isTrusted":true}}

It seems like this should be a really simple fix; maybe it should be connecting to "wss://www.*****.com/cnode2/socket.io/" instead (or maybe not)? I am not quite seeing how to get everything configured correctly, or where to configure it. Here are my configs:

package.json:

{
  "name": "chat",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "nodejs app"
  },
  "dependencies": {
    "cluster": "^0.7.7",
    "domain": "0.0.1",
    "express": "^4.16.3",
    "helmet": "^3.12.0",
    "mongoose": "^5.0.0",
    "mysql": "^2.15.0",
    "os": "^0.1.1",
    "package.json": "^2.0.1",
    "redis": "^2.8.0",
    "socket.io": "^2.0.3",
    "socket.io-redis": "^5.2.0",
    "sticky-session": "^1.1.2"
  }
}

Nginx config:

server {
    server_name www.********.com;
    root /home/********/********.com;
    charset utf-8;

    #listen [::]:443 http2 ssl; # managed by Certbot

    rewrite ^([^.]*[^/])$ $1/ permanent;

    location ~ \.php$ {
      fastcgi_pass unix:/var/run/php/php8.1-fpm-******.sock;
      fastcgi_index index.php;
      fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
      include fastcgi_params;
    }

    location /cnode2/ {
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $host;

      proxy_pass http://localhost:3000;

      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection "upgrade";
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate ***; # managed by Certbot
    ssl_certificate_key ***; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

Client-side connection to IO - code in the webpage:

<script src="https://www.*******.com/cnode2/socket.io.js"></script>
<script>
  var socket = io('https://www.*********.com/cnode2', {resources: "/cnode2/socket.io", transports: ['websocket','polling']});
</script>

Server-side app.js setup:

var http = require('http'),
  express = require('express'),
  cluster = require('cluster'),
  net = require('net'),
  io = require('socket.io'),
  io_redis = require('socket.io-redis'),
  sticky = require('sticky-session'),
  os = require('os'),
  helmet = require('helmet'),
  app = new express;

app.use(helmet());

var server = http.createServer(app, function(req, res) {
  console.log('launch server');
  res.end('worker: '+cluster.worker.id);
});

var numCPUs = os.cpus().length;
console.log('num CPUs: '+numCPUs);

if(!sticky.listen(server, 3000)) {
  for(var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  server.once('listening', function() {
    console.log('server started on 3000 port');
  });
}
else {
  console.log('spawn worker');
  var io = io(server, {path: '/cnode2', transports: ['websocket', 'polling']});
....
}
Score:0

After banging my head against the wall and a bit of trial-and-error luck, I got everything working by changing the client-side script to:

<script>
  var socket = io('https://www.*************.com', {path: "/cnode2", transports: ['websocket','polling']});
</script>

It seems that the client-side "path" option has to match the "path" set in the server-side socket.io config (path: '/cnode2' in app.js above).
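
For anyone hitting the same mismatch, here is a minimal sketch of the two sides lining up. This is illustrative only, assuming the elided parts of app.js above; example.com and port 3000 are placeholders taken from the question:

// --- server side (app.js) ---
var express = require('express');
var app = express();
var server = require('http').createServer(app);

// socket.io serves its endpoint at /cnode2/ and the client bundle at
// /cnode2/socket.io.js, which nginx passes through via the /cnode2/ location.
var io = require('socket.io')(server, {
  path: '/cnode2',
  transports: ['websocket', 'polling']
});

io.on('connection', function(socket) {
  console.log('client connected: ' + socket.id);
});

server.listen(3000);

// --- client side, for comparison ---
// var socket = io('https://example.com', {path: '/cnode2'});
// Note: a path segment in the URL argument, e.g. io('https://example.com/cnode2'),
// is treated as a namespace, not an HTTP path; the HTTP path only comes from
// the "path" option, which defaults to '/socket.io' (hence the original error
// showing wss://.../socket.io/).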
