Nginx 413 - Client intended to send too large body - Node app

I am running a Node app behind nginx on Ubuntu and I am having problems uploading a 200 MB JSON file. On my local machine, where I run nginx on a Mac, I do not hit this limit.

I have set `client_max_body_size` in the `http` block of `nginx.conf` to 0 and to 1000M, but I am still getting the error.

/etc/nginx/nginx.conf

http {
   client_max_body_size 1000M;
...

In the Express app I have set these limits as well, but I am not sure whether the problem is only with nginx:

server.js

app.use(express.json({ limit: '1000mb' }))
app.use(express.urlencoded({ limit: '1000mb', extended: true }))

/var/log/nginx/error.log

2021/12/20 11:08:42 [error] 5451#5451: *4 client intended to send too large body:
You need to make sure that you restarted/reloaded `nginx` after changing its config.
Álvaro: I have done `systemctl restart nginx` and `nginx -s reload`, but nothing changed.
Then you need to check that you applied the changes to the correct `nginx` configuration file. You can check the current running configuration with `nginx -T`.
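For example, something like `sudo nginx -T | grep client_max_body_size` should list every place the directive is set in the configuration nginx actually loads, including any included site files.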
Álvaro: The `nginx.conf` is the global config file; I have no other settings in the local config. And `nginx -t` outputs: `nginx: the configuration file /etc/nginx/nginx.conf syntax is ok nginx: configuration file /etc/nginx/nginx.conf test is successful`
Álvaro: The problem was indeed a `client_max_body_size` set in the site configuration file rather than in `nginx.conf`.
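For reference, this is roughly what the fix can look like; a minimal sketch assuming the usual Debian/Ubuntu sites-enabled layout (the path and value are illustrative):

/etc/nginx/sites-enabled/default

server {
   # illustrative value; a server-level (or location-level) directive
   # overrides the http-level value from nginx.conf
   client_max_body_size 1000M;
...

The Express limits still matter once nginx accepts the body, but the 413 itself is produced by nginx before the request ever reaches Node, so they cannot fix this error on their own. Remember to reload nginx after editing the site config.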