
Cache Images centrally

Holy_diver

Application: serve responsive, optimized images on the fly by cropping/resizing/compressing master images.

Current load: 10k requests per minute, ~60 MB/s of traffic.

Current config: an NginxPlus load balancer sits at the top, in front of multiple app servers running nginx -> tomcat. On each app server, images are cached in nginx's proxy cache.

Current Problems:

  • Poor cache hit ratio: because the cache is decentralized, the probability of the same request hitting the same server is low.
  • Duplicate caching across servers, although this can be tolerated.
  • Cache purging is cumbersome: since a cached copy may exist on multiple servers, an object has to be purged from every app server.
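To illustrate why purging has to touch every server: open-source nginx stores each cached object in a file named after the MD5 of its cache key, under directories derived from the end of that hash (per the documented `proxy_cache_path ... levels=1:2` layout). A minimal sketch, with a hypothetical cache key, of where a given URL's cache file lives on each node:

```python
import hashlib

def nginx_cache_path(cache_dir: str, cache_key: str, levels=(1, 2)) -> str:
    """Reproduce nginx's on-disk cache file location for a proxy_cache_path
    configured with levels=1:2. The file name is the MD5 hex digest of the
    cache key; level directories are taken from the *end* of that digest."""
    h = hashlib.md5(cache_key.encode()).hexdigest()
    parts, end = [], len(h)
    for n in levels:
        parts.append(h[end - n:end])  # last n chars, then the n before those
        end -= n
    return "/".join([cache_dir, *parts, h])

# Default proxy_cache_key is "$scheme$proxy_host$request_uri";
# the key below is a hypothetical example of that pattern.
key = "httpbackend/img/master/photo.jpg?w=300"
print(nginx_cache_path("/var/cache/nginx", key))
```

Because every app server computes the same path independently, purging one URL means deleting (or invalidating) that file on every server that may have cached it.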

Potential Solutions

  • Consistent hashing at the nginx load balancer. The problem with this approach is that it may cause uneven traffic distribution.

  • Introduce a middle layer of a few nginx servers: a dedicated nginx caching layer between the LB and the app servers. But say I keep 3 servers in this layer; it still has the same duplicate-caching and purge headaches, although at a much smaller magnitude.

  • With a single caching nginx, disk I/O might become a bottleneck.
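The first two ideas can be combined so each image is cached on exactly one node: route by URI hash into a small dedicated cache tier. A rough sketch, assuming open-source nginx's `hash ... consistent` upstream directive; all hostnames, ports, and sizes below are placeholders:

```nginx
# --- On the load balancer ---
upstream image_cache_tier {
    hash $request_uri consistent;   # ketama-style: each URI maps to one node,
                                    # minimal reshuffling if a node drops out
    server cache1.internal:80;
    server cache2.internal:80;
    server cache3.internal:80;
}

server {
    listen 80;
    location /images/ {
        proxy_pass http://image_cache_tier;
    }
}

# --- On each cache-tier node ---
# (proxy_cache_path goes in the http{} context)
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=img:100m
                 max_size=50g inactive=7d use_temp_path=off;

upstream app_servers {              # hypothetical tomcat-backed app tier
    server app1.internal:8080;
    server app2.internal:8080;
}

server {
    listen 80;
    location /images/ {
        proxy_cache img;
        proxy_cache_key $request_uri;
        proxy_cache_valid 200 7d;
        proxy_pass http://app_servers;
    }
}
```

With this layout a purge only needs to hit the one cache node the URI hashes to, and duplicate caching disappears; the trade-off is the uneven-distribution concern raised above, since hot images pin their traffic to a single node.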

Does anyone have experience solving this use case, even if not with nginx? Feel free to share your thoughts.

anx
What makes you think uneven traffic distribution becomes a problem? Most setups large enough to justify many app servers also have diverse enough users that randomly selected but sticky upstream selection won't cause an imbalance that is significant compared to the relative capacity reserves you want on standby anyway. You may be trying to solve a problem that is barely measurable with common traffic patterns.
djdomi
I'm not sure which filesystem would be best, but you could use central storage for the cache; that would eliminate the duplicate-cache effect, since all servers would use the same cache.
TeroKilkanen
Using a shared file system for the cache might not be a supported scenario in nginx. If cache access isn't synchronized properly, all kinds of weird effects can happen; cache expiration events can also cause problems. How about implementing caching for the images on the NginxPlus LB itself?
Holy_diver
@TeroKilkanen, I get your point, but on the NginxPlus LB we may run into disk I/O problems, since all of the load goes through the LB. It might work, but it isn't very future-proof.
djdomi
@Holy_diver I believe an SSD can handle your scenario. Or do you exceed one million IOPS on NVMe-based storage? That works out to roughly 180k real IOPS.
mforsetti
A poor cache hit ratio at 10k requests per minute? What is your cache hit ratio right now, though? I tend to agree with @anx here about premature optimization.
Holy_diver
@mforsetti, the current hit ratio is 10%, and the LB policy is least-connection, so the cache is decentralized: for a hit, the next request for the same image has to land on the exact same app server.
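A toy simulation (not from the thread; unbounded per-server caches, uniform object popularity, made-up sizes) shows how routing policy alone moves the hit ratio when caches are per-server:

```python
import random

def simulate(n_servers: int, n_objects: int, n_requests: int, route) -> float:
    """Toy model: each server keeps an unbounded set of cached objects.
    `route(obj, n_servers)` picks which server handles the request."""
    caches = [set() for _ in range(n_servers)]
    hits = 0
    for _ in range(n_requests):
        obj = random.randrange(n_objects)          # uniform popularity
        s = route(obj, n_servers)
        if obj in caches[s]:
            hits += 1
        else:
            caches[s].add(obj)                     # cache miss: fill
    return hits / n_requests

random.seed(1)
# Spread routing (stand-in for least-connection): any server may get any object.
spread = simulate(10, 1000, 50_000, lambda o, n: random.randrange(n))
# Hash routing: each object always lands on the same server.
hashed = simulate(10, 1000, 50_000, lambda o, n: hash(o) % n)
print(f"spread routing hit ratio: {spread:.2f}")
print(f"hash routing hit ratio:   {hashed:.2f}")
```

With spread routing every object must be cached on (up to) every server before requests stop missing, so the hit ratio stays well below the hash-routed case; a bounded cache with eviction would widen the gap further.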