
AWS S3 links return 403 when the application runs under Docker Swarm, but work fine under Docker Compose


I have a Django application on an EC2 (Ubuntu) instance that uses S3 for static & media files, currently deployed with Docker Compose. I'm trying to convert it to Docker Swarm, but after doing so all my S3 links return a 403 status.

Error message: Failed to load resource: the server responded with a status of 403 (Forbidden)

If I run the same stack using docker-compose, all the static files load properly. The only difference I've noticed is that the containers get different IP addresses under Swarm than under Compose, though I'm not sure whether that has anything to do with the issue.
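One way to confirm whether the egress IP really differs is to ask each running container for its public IP and compare. This is just a diagnostic sketch: the service name `web`, the stack name `mystack`, and the container ID are placeholders, and it assumes `curl` is available inside the image.

```shell
# Under Compose: ask the running service for its public egress IP.
docker-compose exec web curl -s https://checkip.amazonaws.com

# Under Swarm: locate the task's container first, then run the same check.
docker ps --filter "name=mystack_web" --format '{{.ID}}'
docker exec <container-id> curl -s https://checkip.amazonaws.com
```

If the two commands print different addresses and your S3 bucket policy (or security tooling in front of it) filters by source IP, that mismatch would explain the 403.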

Is there anything extra that I need to take care of for S3 to work in Docker Swarm?

djdomi
You've answered it yourself already: either change the egress IP back to one your S3 configuration expects, or allow access from the new IP.
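If the bucket policy is what restricts access by source IP, adding the Swarm hosts' egress IP to it is usually enough. A minimal sketch, assuming a bucket named `my-bucket` and an egress IP of `203.0.113.10` (both placeholders — substitute your own values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetFromKnownEgressIP",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": "203.0.113.10/32" }
      }
    }
  ]
}
```

Note that `aws:SourceIp` matches the address S3 sees, i.e. the NATed public IP of the host, not the container's internal overlay-network address.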

