Score:0

NGINX setup for REST API returning default configuration


I've rented a VPS to play with some personal projects. To get started, I'm trying to set it up to host a REST API using Node.js, as I've only ever used Spring Boot for that.

I implemented the solution by adapting the following guides:

https://www.robinwieruch.de/node-express-server-rest-api (most of the API code looks like this)

https://itnext.io/building-restful-api-with-node-js-express-js-and-postgresql-the-right-way-b2e718ad1c66 (but I'm slowly converting to use the standards from here)

For the actual deployment, I changed the setup to bundle the API with Babel, and I'm deploying it with PM2.

https://www.nginx.com/blog/deploying-nginx-plus-as-an-api-gateway-part-1/ (not Plus though)

(All those links are available via http://web.archive.org/ so they aren't going anywhere anytime soon)

I created another Node.js project that uses axios to test the REST requests. Running it on the same VPS works, but I had to change the API code to bind to localhost. Before, with the bind address unspecified, it was binding to the IPv6 loopback (if I understood correctly), and since my domain doesn't work with IPv6, I'll stick with IPv4.

On the NGINX side I've made the most changes, as I only have one API now and won't be using load balancing. I've also changed the naming policy: I'll be using example.com/app_or_project_name/api_or_web_or_other_kind_of_interface/project_specific_routes.

Here's what my NGINX setup looks like. I've anonymized it and changed it to use the same names as the NGINX example:

api_backends.conf

upstream warehouse {
    zone api 64k;
    server 127.0.0.1:some_port_number;
}

That's the REST API host and port number.

api_conf.d/warehouse_api.conf

# Warehouse API
#
location /warehouse/api/ {
    # Policy configuration here (authentication, rate limiting, logging, more...)
    #
    access_log /var/log/nginx/warehouse_api.log main;
    auth_request /_validate_apikey;

    # URI routing
    #
    proxy_pass http://warehouse;

    return 404; # Catch-all
}

api_gateway.conf

include api_backends.conf;
include api_keys.conf;

server {
    access_log /var/log/nginx/api_access.log main; # Each API may also log to a separate file

    listen 443 ssl;
    server_name my-domain.net;

    # TLS config
    ssl_certificate /etc/letsencrypt/live/my-domain.net/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/my-domain.net/privkey.pem; # managed by Certbot
    ssl_session_cache    shared:SSL:10m;
    ssl_session_timeout  5m;
    ssl_ciphers          HIGH:!aNULL:!MD5;
    ssl_protocols        TLSv1.2 TLSv1.3;

    # API definitions, one per file
    include api_conf.d/*.conf;

    # Error responses
    # error_page 404 = @400;         # Invalid paths are treated as bad requests
    proxy_intercept_errors on;     # Do not send backend errors to the client
    include api_json_errors.conf;  # API client friendly JSON error responses
    default_type application/json; # If no content-type then assume JSON

    # API key validation
    location = /_validate_apikey {
        internal;

        if ($http_apikey = "") {
            return 401; # Unauthorized
        }
        if ($api_client_name = "") {
            return 403; # Forbidden
        }

        return 204; # OK (no content)
    }

}

I might set proxy_intercept_errors to off once everything is working. I'll have to run some tests to see what changes in the responses.
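api_keys.conf isn't shown above. In the NGINX guide it's a map from API keys to client names, which is what the $api_client_name check in /_validate_apikey relies on; mine looks roughly like this (the key and client name here are placeholders):

map $http_apikey $api_client_name {
    default "";

    "placeholder_api_key" "client_one";
}

An unknown or missing key maps to the empty string, which /_validate_apikey turns into a 403 or 401.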

api_json_errors.conf

Same as example.
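For reference, the example's api_json_errors.conf pairs error_page directives with named locations that return JSON bodies, roughly like this (abbreviated from the NGINX blog post):

error_page 400 = @400;
location @400 { return 400 '{"status":400,"message":"Bad request"}\n'; }

error_page 404 = @404;
location @404 { return 404 '{"status":404,"message":"Resource not found"}\n'; }

That's where the @400 through @404 named locations tested in the debug output below come from.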

default.conf

server {
    server_name  www.my-domain.net;

    #access_log  /var/log/nginx/host.access.log  main;

    location / {
        root   /usr/share/nginx/html;
        index  index.html index.htm;
    }

    #error_page  404              /404.html;

    # redirect server error pages to the static page /50x.html
    #
    error_page   500 502 503 504  /50x.html;
    location = /50x.html {
        root   /usr/share/nginx/html;
    }

    # proxy the PHP scripts to Apache listening on 127.0.0.1:80
    #
    #location ~ \.php$ {
    #    proxy_pass   http://127.0.0.1;
    #}

    # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
    #
    #location ~ \.php$ {
    #    root           html;
    #    fastcgi_pass   127.0.0.1:9000;
    #    fastcgi_index  index.php;
    #    fastcgi_param  SCRIPT_FILENAME  /scripts$fastcgi_script_name;
    #    include        fastcgi_params;
    #}

    # deny access to .htaccess files, if Apache's document root
    # concurs with nginx's one
    #
    #location ~ /\.ht {
    #    deny  all;
    #}

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/my-domain.net/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/my-domain.net/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot

}

server {
    if ($host = www.my-domain.net) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    if ($host = my-domain.net) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen       80;
    server_name  my-domain.net www.my-domain.net;
    return 404; # managed by Certbot

}

I had to make some changes here because one combination of address and port had a duplicate server configuration.

nginx.conf

user  nginx;
worker_processes  auto;

error_log  /var/log/nginx/error.log info;
pid        /var/run/nginx.pid;

load_module /etc/nginx/modules/ngx_http_js_module.so;

events {
    worker_connections  1024;
}

http {
    include       /etc/nginx/mime.types;
    default_type  application/octet-stream;

    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    access_log  /var/log/nginx/access.log  main;

    sendfile        on;
    #tcp_nopush     on;

    keepalive_timeout  65;

    include /etc/nginx/api_gateway.conf; # All API gateway configuration
    include /etc/nginx/conf.d/*.conf;    # Regular web traffic
}

When I run the same test project outside the VPS, I get the following result:

{
  message: 'Request failed with status code 404',
  name: 'Error',
  description: undefined,
  number: undefined,
  fileName: undefined,
  lineNumber: undefined,
  columnNumber: undefined,
  stack: '...',
  config: {
    url: 'https://my-domain.net/warehouse/api/messages',
    method: 'get',
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Access-Control-Allow-Origin': '*',
      'User-Agent': 'axios/0.21.1'
    },
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    adapter: [Function: httpAdapter],
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    apikey: '...',
    data: undefined
  },
  code: undefined
}

One thing I managed to figure out is that the 404 error is returned by warehouse_api.conf, because if I change return 404; to another code, that's the code I get.

I've enabled debugging in NGINX, but I couldn't make sense of the output, even after some searching:

2021/07/22 11:54:17 [debug] nginx_pid#nginx_pid: *757 using location: @404 "/warehouse/api/messages?"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 http cl:-1 max:1048576
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 rewrite phase: 3
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 http finalize request: 404, "/warehouse/api/messages?" a:1, c:1
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 http special response: 404, "/warehouse/api/messages?"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 test location: "@400"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 test location: "@401"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 test location: "@403"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 test location: "@404"
2021/07/22 11:54:31 [debug] nginx_pid#nginx_pid: *758 using location: @404 "/warehouse/api/messages?"

I tried searching for all of this in a few different ways, but couldn't find any leads.

So, what's going on, what's wrong and how do I fix it?

Thanks in advance.

Update 2021-08-04

Following @jose-fernando-lopez-fernandez's answer, I changed api_conf.d/warehouse_api.conf to the following:

# Warehouse API
#
location /warehouse/api/ {
    # Policy configuration here (authentication, rate limiting, logging, more...)
    #
    access_log /var/log/nginx/warehouse_api.log main;
    auth_request /_validate_apikey;

    # URI routing
    #
    location /warehouse/api/ {
        proxy_pass http://warehouse;
    }

    return 404; # Catch-all
}

to keep it in line with the examples I'm following. I tested again and got a 401 error instead. I checked, and I was passing the apikey incorrectly.

I fixed it, tested again, and got a 404 again. But now I'm getting a lot more output from nginx-debug.

I've anonymized it, replacing the apikey with another one I generated and won't be using for anything.

I've also replaced the posix_memalign, http cleanup add, free, chain writer in and malloc addresses with random strings of the same length. I don't know whether they need to be anonymized; if they're required to solve this question, please ask and I'll add them back in.

Here goes:

2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http cl:-1 max:1048576
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 rewrite phase: 3
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 post rewrite phase: 4
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 generic phase: 5
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 generic phase: 6
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 generic phase: 7
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 access phase: 8
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 access phase: 9
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 access phase: 10
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 auth request handler
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http subrequest "/_validate_apikey?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http posted request: "/_validate_apikey?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 rewrite phase: 1
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "/warehouse/api/"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "/_validate_apikey"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 using configuration "=/_validate_apikey"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http cl:-1 max:1048576
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 rewrite phase: 3
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var: "o6ZlKSX24MCY/uPwCRl80WAS"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script value: ""
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script equal
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script equal: no
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script if
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script if: false
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http map started
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var: "o6ZlKSX24MCY/uPwCRl80WAS"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http map: "o6ZlKSX24MCY/uPwCRl80WAS" "client_one"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var: "client_one"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script value: ""
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script equal
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script equal: no
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script if
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script if: false
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http finalize request: 0, "/_validate_apikey?" a:1, c:2
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 auth request done s:204
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http wake parent request: "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http posted request: "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 access phase: 10
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 auth request handler
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 auth request set variables
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 post access phase: 11
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 generic phase: 12
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 generic phase: 13
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 posix_memalign: 218512C89A2ED401:4096 @16
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http init upstream, client timer: 0
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 epoll add event: fd:3 op:3 ev:80002005
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script copy: "Host"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script var: "warehouse"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script copy: "Connection"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script copy: "close"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script copy: ""
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http script copy: ""
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Accept: application/json, text/plain, */*"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Access-Control-Allow-Origin: *"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "apikey: o6ZlKSX24MCY/uPwCRl80WAS"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "User-Agent: axios/0.21.1"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header:
"GET /warehouse/api/messages HTTP/1.0
Host: warehouse
Connection: close
Accept: application/json, text/plain, */*
Access-Control-Allow-Origin: *
apikey: o6ZlKSX24MCY/uPwCRl80WAS
User-Agent: axios/0.21.1

"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http cleanup add: 90C9DA232086B6FA
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 get rr peer, try: 1
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 stream socket 15
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 epoll add connection: fd:15 ev:80002005
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 connect to 127.0.0.1:some_port_number, fd:15 #2
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream connect: -2
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 posix_memalign: A9E50626EC2A1D36:128 @16
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 event timer add: 15: 60000:878601635
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http finalize request: -4, "/warehouse/api/messages?" a:1, c:2
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http request count:2 blk:0
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http run request: "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream check client, write event:1, "/warehouse/api/messages"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream request: "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream send request handler
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream send request
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream send request body
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 chain writer buf fl:1 s:225
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 chain writer in: 4C4F626384F523C9
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 writev: 225 of 225
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 chain writer out: 0000000000000000
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 event timer del: 15: 878601635
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 event timer add: 15: 60000:878601636
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream request: "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http upstream process header
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 malloc: 1D36E73206B5EE11:4096
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 recv: eof:1, avail:-1
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 recv: fd:15 444 of 4096
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy status 404 "404 Not Found"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "X-Powered-By: Express"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Access-Control-Allow-Origin: *"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Content-Security-Policy: default-src 'none'"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "X-Content-Type-Options: nosniff"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Content-Type: text/html; charset=utf-8"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Content-Length: 168"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Date: Wed, 04 Aug 2021 17:43:08 GMT"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header: "Connection: close"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http proxy header done
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 finalize http upstream request: 404
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 finalize http proxy request
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 free rr peer 1 0
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 close http upstream connection: 15
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 free: A9E50626EC2A1D36, unused: 48
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 event timer del: 15: 878601636
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 reusable connection: 0
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http finalize request: 404, "/warehouse/api/messages?" a:1, c:1
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 http special response: 404, "/warehouse/api/messages?"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "@400"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "@401"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "@403"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 test location: "@404"
2021/08/04 17:43:08 [debug] nginx_pid#nginx_pid: *1 using location: @404 "/warehouse/api/messages?"

It seems weird to me that warehouse appears to be treated as a host name. On the other hand, NGINX does define some addresses with that name, so it might be related to that.
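Two things I noticed while rereading the log (assumptions on my part, since I haven't verified them yet): first, with proxy_pass pointing at an upstream group, nginx sends Host: warehouse by default (the value of $proxy_host), so the host name in the log may be expected rather than wrong. Second, the request line "GET /warehouse/api/messages HTTP/1.0" shows the backend receives the full original URI; if the Express app only registers routes like /messages, that would explain the backend's 404. Adding a URI part to proxy_pass would make nginx replace the matched prefix:

location /warehouse/api/ {
    # The trailing "/" replaces the matched "/warehouse/api/" prefix,
    # so the backend would see "GET /messages" instead.
    proxy_pass http://warehouse/;
}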

Score:1

I'm assuming you just stubbed out the proxy-related machinery in your warehouse API file to be handled at a later time, but that actually won't work, as you found out.

# Warehouse API
#
location /warehouse/api/ {
    # Policy configuration here (authentication, rate limiting, logging, more...)
    #
    access_log /var/log/nginx/warehouse_api.log main;
    auth_request /_validate_apikey;

    # URI routing
    #
    proxy_pass http://warehouse;

    return 404; # Catch-all
}

Compare that one line in charge of URI routing against the version in the example you linked.

# Warehouse API
#
location /api/warehouse/ {
    # Policy configuration here (authentication, rate limiting, logging, more...)
    #
    access_log /var/log/nginx/warehouse_api.log main;
    auth_request /_validate_apikey;

    # URI routing
    #
    location /api/warehouse/inventory {
        proxy_pass http://warehouse_inventory;
    }

    location /api/warehouse/pricing {
        proxy_pass http://warehouse_pricing;
    }

    return 404; # Catch-all
}

The reason the example works but yours doesn't has to do with the "precedence" ("immediacy" might be a better term) of the return directive.

As per the documentation, the return directive causes Nginx to stop processing the current request and respond right away. This means your proxy_pass directive never even gets the chance to execute.

In the example, however, there are two nested prefix-based locations within the block, and Nginx opts for the longest match. This is why the example request succeeds: the requested URI, https://api.example.com/api/warehouse/pricing/item001, matches the second of the two nested location blocks, so the request is proxied as expected.

In conclusion, you need to add a nested location block for the proxy_pass directive to execute within if you want to replicate the example. Otherwise, it looks like you can simply remove the return directive, and your configured upstream backend will be free to do its thing.

GuiRitter:
Thanks! However, now it's failing further down the line. I've updated the first post with more information.

Jose Fernando Lopez Fernandez:
I genuinely don't know if this is the source of the problem, but the Warehouse API is a Plus feature, isn't it? I honestly have zero experience with it, so I could be wrong, but a 401 sounds like it could be that, right?

GuiRitter:
I don't see how that could be the case. In my understanding, the Warehouse API is just an example: the "Warehouse" is a hypothetical app that accesses a hypothetical API, and the NGINX example shows how to expose that API.