nginx: Uncaught (in promise) ChunkLoadError: Loading chunk

I installed self-hosted Sentry 23.1.1 and configured nginx/1.22.1 as a frontend proxy. If I access the Docker port directly, Sentry loads fine; if I go through nginx, I get the error in the subject. It looks like an application error, but since everything works without nginx, the problem must be on the webserver side. I raised an issue with the Sentry project itself, but the configuration appears correct.

I tried clearing cookies, private browsing windows, different browsers... same result. It only worked the first time I loaded the app after installation; after that, never again.

The full exception from the browser's console is:

Loading failed for the <script> with source “https://logger.xx.com/_static/dist/sentry/chunks/app_actionCreators_organization_tsx-app_bootstrap_commonInitialization_tsx-app_bootstrap_init-73196b.1fa346ca59b89f776cae.js”. issues:1:1
Uncaught (in promise) ChunkLoadError: Loading chunk app_actionCreators_organization_tsx-app_bootstrap_commonInitialization_tsx-app_bootstrap_init-73196b failed.
(error: https://logger.xx.com/_static/dist/sentry/chunks/app_actionCreators_organization_tsx-app_bootstrap_commonInitialization_tsx-app_bootstrap_init-73196b.1fa346ca59b89f776cae.js)
    j jsonp chunk loading:27
    e ensure chunk:6
    e ensure chunk:5
    r initializeMain.tsx:14
    async* index.tsx:80
    async* index.tsx:83
    <anonymous> app.js:1
jsonp chunk loading:27:17 (webpack:///webpack/runtime/jsonp chunk loading)
    r initializeMain.tsx:15
    AsyncFunctionThrow self-hosted:811
    (Async: async)
    <anonymous> index.tsx:80
    AsyncFunctionNext self-hosted:807
    (Async: async)
    <anonymous> index.tsx:83
    <anonymous> app.js:1

The nginx config:

server {
        listen 443 ssl;
        listen [::]:443 ssl;
        server_name logger.xx.com;
        ssl_certificate /etc/letsencrypt/live/logger.xx.com/fullchain.pem; # managed by Certbot
        ssl_certificate_key /etc/letsencrypt/live/logger.xx.com/privkey.pem; # managed by Certbot

        # Some taken from https://github.com/mherrmann/sentry-self-hosted/blob/master/nginx-site
        # keepalive + raven.js is a disaster
        keepalive_timeout 0;

        gzip off;
        proxy_http_version 1.1;
        proxy_redirect off;
        proxy_buffering off;
        proxy_next_upstream error timeout invalid_header http_502 http_503 non_idempotent;
        proxy_next_upstream_tries 2;
        proxy_set_header Connection '';
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Request-Id $request_id;
        proxy_read_timeout 30s;
        proxy_send_timeout 5s;

        root /dati/www/logger.xx.com;
        index index.html;
        access_log /var/log/nginx/logger.xx.com.access.log;
        error_log /var/log/nginx/logger.xx.com.error.log error;

        location / {
                proxy_pass http://127.0.0.1:9012;
        }
}
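To narrow down whether nginx is truncating the response or returning a different status, one way (a diagnostic sketch, using the chunk URL from the console error above and the backend port from the proxy_pass line) is to fetch the same failing chunk both ways and compare status code and downloaded size:

```shell
# Failing chunk path, copied from the browser console error.
CHUNK="/_static/dist/sentry/chunks/app_actionCreators_organization_tsx-app_bootstrap_commonInitialization_tsx-app_bootstrap_init-73196b.1fa346ca59b89f776cae.js"

# Directly against the Docker port (the case that works):
curl -sS -o /dev/null -w 'direct: %{http_code} %{size_download} bytes\n' \
    "http://127.0.0.1:9012${CHUNK}"

# Through nginx (the case that fails):
curl -sS -o /dev/null -w 'nginx:  %{http_code} %{size_download} bytes\n' \
    "https://logger.xx.com${CHUNK}"
```

If the sizes differ or nginx returns a non-200 status, the nginx error log configured above (/var/log/nginx/logger.xx.com.error.log) should show why the proxied response was cut short or rejected.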

thanks
