My company has a small, nice product on Google Cloud (a public web app using Cloud Functions and Cloud Storage).
I've been searching for hours for something like a kill switch for services in case of a threat or attack on any of them. I found that we can define max instances for each Cloud Function (the `--max-instances` deploy flag), which is somewhat reassuring and blocks many abuse scenarios.
But the main problem is Cloud Storage. The files in our buckets are public, and I don't know the best way to restrict download requests to the bucket. One option is to create a new function (with max instances defined) and stream downloads through it to the web app; the bucket could then be private. But after some cost estimation (a streaming function invocation for every single download), it appears this could cost a lot, since the product has a map with custom tiles and the web app requests around 16 image files at once on init.
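For reference, here is a minimal sketch of that proxy idea in Python, assuming a private bucket named `my-private-tiles` and the object path passed as a query parameter (both hypothetical); the function would be deployed with `--max-instances` to cap concurrency:

```python
# Hypothetical download proxy: serves objects from a private bucket so the
# bucket itself never has to be public. Deploy with --max-instances to cap
# how many copies can run at once.
import functions_framework
from google.cloud import storage

client = storage.Client()
BUCKET_NAME = "my-private-tiles"  # hypothetical bucket name

@functions_framework.http
def serve_tile(request):
    # Object path comes in as a query parameter, e.g. ?path=tiles/3/4/5.png
    path = request.args.get("path")
    if not path:
        return ("missing path", 400)
    # get_blob fetches metadata and returns None if the object doesn't exist
    blob = client.bucket(BUCKET_NAME).get_blob(path)
    if blob is None:
        return ("not found", 404)
    data = blob.download_as_bytes()
    headers = {
        "Content-Type": blob.content_type or "application/octet-stream",
        "Cache-Control": "public, max-age=86400",  # let browsers cache tiles
    }
    return (data, 200, headers)
```

With 16 tile requests per page init, each of those would be a separate invocation of this function, which is exactly where my cost concern comes from.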
After thinking about different solutions, the only way I can see to mitigate DDoS attacks is to monitor usage and shut down services (especially the storage service) immediately, to buy time to track down the issue and fix it.
But Google Cloud doesn't offer anything like this, or at least I can't find it: something that puts the whole project (or part of it) into maintenance mode and rejects all requests. I could add a config check to each function execution and storage download and turn it off whenever I want, but I think Google Cloud would still count those requests and bill my boss.
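To illustrate, this is roughly the kind of per-function config check I mean, assuming the flag is an environment variable called `MAINTENANCE_MODE` (hypothetical); the problem is that the invocation itself presumably still gets counted and billed:

```python
# Hypothetical maintenance-mode guard: every request checks a flag before
# doing any work. The function is still invoked (and, as far as I can tell,
# still billed) even when the flag is on.
import os
import functions_framework

@functions_framework.http
def handler(request):
    if os.environ.get("MAINTENANCE_MODE") == "on":
        # Reject everything while we investigate an attack
        return ("service temporarily unavailable", 503)
    # ...normal request handling would go here...
    return ("ok", 200)
```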
Does anybody know a good solution? The scenarios I have in mind are:
- multiple IPs continuously request file downloads from the bucket (~200 downloads/s) for hours
- multiple IPs invoke functions just to keep instances alive and disrupt real users' connections
I'm also wondering whether there is any built-in way (rather than a custom one) to restrict each IP to a maximum of 2000 requests per day.
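For comparison, the "custom way" I'd like to avoid would look something like this sketch, assuming one Firestore counter document per IP per day (collection and field names are hypothetical, and each check would itself be a billed Firestore write and read):

```python
# Hypothetical custom per-IP daily cap backed by Firestore: one counter
# document per IP per day. Approximate (the increment and the read are not
# transactional), and every check costs a Firestore write + read.
import datetime
import functions_framework
from google.cloud import firestore

db = firestore.Client()
DAILY_LIMIT = 2000  # the cap from the question

def over_limit(ip: str) -> bool:
    day = datetime.date.today().isoformat()
    ref = db.collection("ip_request_counts").document(f"{ip}_{day}")
    ref.set({"count": firestore.Increment(1)}, merge=True)
    return ref.get().get("count") > DAILY_LIMIT

@functions_framework.http
def guarded(request):
    # Behind Google's HTTP front end the client IP is in X-Forwarded-For
    forwarded = request.headers.get("X-Forwarded-For", "")
    ip = forwarded.split(",")[0].strip() or request.remote_addr or "unknown"
    if over_limit(ip):
        return ("daily request limit exceeded", 429)
    return ("ok", 200)
```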