Score:0

Best way to handle directory/path traversal attacks on a Nginx http site


I have a Node.js-driven site running in a Docker container, and there's a public-facing proxy site driven by an Nginx server that forwards traffic to the dockerized Node.js site. Studying the Nginx logs, I see a lot of directory/path traversal attacks against all kinds of paths:

GET /.env
GET /phpmyadmin/index.php
GET /owa/auth/logon.aspx
GET /+CSCOE+/logon.html
GET /ecp/Current/exporttool/microsoft.exchange.ediscovery.exporttool.application
GET /owa/auth/logon.aspx?url=https%3a%2f%2f1%2fecp%2f
GET /core/.env
GET /.vscode/sftp.json
GET /.git/config
GET /info.php
GET /config.json
etc.

Currently all of those attempts are duly processed and return an HTTP 404 response. However, I don't want to bother the dockerized site with all those fake requests, so I have started adding a long list of location directives to my proxy site's Nginx config:

    location = /phpmyadmin/index.php {
        return 404;
    }
    location = /.env {
        return 404;
    }

But actually, isn't serving them a proper 404 response too great an honor? Perhaps they deserve a more devious response, like one that is never properly finished, or something else of that nature. Also, it's kind of tiresome to keep that site config updated with new kinds of paths. Using regular expressions can shorten it somewhat, but not by much.
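As a rough sketch, the regex variant I mean would collapse many of those probes into a single location block; the path prefixes below are just examples taken from my logs:

    # Collapse common probe paths into one case-insensitive regex location
    # (the listed prefixes are only examples from the logs above)
    location ~* ^/(\.env|\.git|\.vscode|phpmyadmin|owa|ecp|\+CSCOE\+)(/|$) {
        return 404;
    }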

What is considered the most appropriate way to handle these kinds of attacks?

Score:2

This isn't an attack, just a (likely automated) search for common files that might reside in the web root, similar to how a port scan is used to check for open ports. This is neither unusual nor inherently dangerous.

If you want to reject such requests early, the best approach is a whitelist of paths that you want to be processed by the Node.js server. For example, if all valid paths start with /api, create a location block for that prefix and reject all other requests with a 404 response. Additionally, as a precaution, you could blacklist specific paths like .git to prevent sensitive files from being served if they accidentally end up in the web root.
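A minimal sketch of that idea, assuming the Node.js upstream listens on 127.0.0.1:3000 and /api is the only valid prefix (adjust both to your setup):

    # Whitelist: only /api/... is forwarded to the Node.js upstream
    location /api/ {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    # Extra precaution: never serve dotfile paths such as .git or .env
    location ~ /\.(git|env|vscode) {
        return 404;
    }

    # Everything else is rejected without ever reaching the upstream
    location / {
        return 404;
    }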

Sending “devious” responses is not a good idea. First, invalid requests can be sent by perfectly legitimate clients, possibly even your own applications. There can always be bugs or misconfigurations, and whitelists or blacklists can fall out of sync after a while. In that case, you want a proper response so you can detect and fix the problem. Second, automated scanners look for interesting responses. If your server sends something other than a standard 4xx response, you may end up attracting more attention and provoking actual attacks.
