I am thinking about how one would implement a good system for guaranteeing that some files are available for a certain period of time. In this particular case it is a backup catalog of roughly 150 MiB per file.
As part of my implementation I have a bucket with a retention policy set to 1 month. The system that writes this catalog file has a service account which it uses to upload the catalog to the bucket. All good, very simple.
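For context, a minimal sketch of what that setup looks like with the google-cloud-storage Python client follows. The project, bucket, and service account names are placeholders, and the IAM binding assumes the writer only needs permission to create objects:

```python
from google.cloud import storage

# Placeholder project/bucket/service-account names, for illustration only.
client = storage.Client(project="my-project")
bucket = client.get_bucket("backup-catalogs")

# 1-month retention policy: objects cannot be deleted or overwritten
# until they have been stored for at least this many seconds.
bucket.retention_period = 30 * 24 * 3600
bucket.patch()

# The writer service account only gets objectCreator, so it can upload
# catalog files but cannot delete objects or change bucket settings.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectCreator",
    "members": {"serviceAccount:catalog-writer@my-project.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)

# The backup system then simply uploads each ~150 MiB catalog file.
blob = bucket.blob("catalog-2024-05-01.db")
blob.upload_from_filename("/var/backups/catalog.db")
```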
However, when implementing this I started to wonder: "What happens if an attacker gains access to this service account and starts filling the bucket with loads and loads of data?" We would be forced to retain that data, and presumably pay for it. 1 PiB could probably be uploaded without too much fuss, and with Nearline in Finland that would be on the order of $10,000 USD per month. I would definitely want to ensure that does not happen.
Then I started to think one step further. Imagine a disgruntled employee who creates a bucket, uploads a few PiB, and sets the retention policy to 10 years before their last day. How would that be handled?
The documentation that I have read unsurprisingly emphasizes that objects under retention cannot be deleted, but it seems like there has to be a way to deal with accidental and malicious uploads. Especially since GCP does not have any form of cost cap, or even a per-bucket size limit setting.
What are your thoughts on dealing with this threat vector sensibly? Is the only option to rely on the hope that GCP billing support will forgive the charges from any mistakes or attacks?