I want to store cold backups cheaply. Every few months I prepare a new archive (10-100 GB) that I want to upload and forget about until something bad happens.
I'm thinking of using cloud providers for that.
Assumptions:
- Data import/export happens entirely over the network from my local PC (no shipped drives, no cloud-to-cloud transfer)
- Data retrieval time does not matter
Cloud providers offer cheap cold storage:
- AWS S3 Glacier Deep Archive [1]
- GCP Archive [2]
- Azure Archive [3]
Storage seems to be ~$1/TB/month everywhere.
Retrieval starts at ~$20/TB for AWS (not sure about that one: you first restore from Glacier Deep Archive to S3, then download) [4].
Retrieval for GCP is $51.20/TB [5].
Retrieval for Azure is $20.48/TB [6].
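A quick back-of-the-envelope in Python, using the per-TB prices above (assuming a single 100 GB archive kept for a year and restored in full once; request fees, minimum storage durations, and regional price differences are ignored):

    # Rough cost comparison using the per-TB prices cited above.
    # Assumptions: one 100 GB (0.1 TB) archive, stored 12 months,
    # retrieved in full exactly once.
    ARCHIVE_TB = 0.1
    MONTHS = 12

    providers = {
        # name: (storage $/TB/month, retrieval $/TB)
        "AWS S3 Glacier Deep Archive": (1.00, 20.00),
        "GCP Archive": (1.00, 51.20),
        "Azure Archive": (1.00, 20.48),
    }

    for name, (storage, retrieval) in providers.items():
        total = ARCHIVE_TB * (storage * MONTHS + retrieval)
        print(f"{name}: ${total:.2f} for {MONTHS} months + one restore")

That works out to roughly $3 to $6.50 per provider for the year, i.e. at this scale a single retrieval costs more than the storage itself.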
Questions:
1. Is there a GUI or CLI tool to manage my proposed workflow (infrequent cold backups to one or several cloud providers)? I don't want to use their web portals: I'm not a frequent cloud user, all three interfaces are rather confusing, and they change all the time. Uploading gigabytes from the browser also seems like a bad idea. (A sketch of what I have in mind is below the questions.)
2. Is there something better for secure cold backups? At $1/TB/month the storage price looks very good. Or am I barking up the wrong tree, and there are dedicated backup services with similar pricing?
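For reference on question 1, the closest I've gotten to scripting this myself is the provider SDKs. A minimal sketch with boto3 (the AWS Python SDK), assuming credentials are already configured and that the bucket and file names are placeholders; I haven't actually run a full restore cycle with this:

    import boto3

    s3 = boto3.client("s3")

    BUCKET = "cold-backups"             # placeholder: an existing bucket
    KEY = "2021-06-archive.tar.zst"     # placeholder object name

    # Upload straight into the Deep Archive storage class.
    # upload_file handles multipart uploads for large files automatically.
    s3.upload_file(
        "archive.tar.zst",
        BUCKET,
        KEY,
        ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},
    )

    # Later, when something bad happens: ask S3 to restore a temporary
    # copy (Bulk is the cheapest tier; Deep Archive restores can take
    # up to ~48 hours), then download once the restore completes.
    s3.restore_object(
        Bucket=BUCKET,
        Key=KEY,
        RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Bulk"}},
    )
    # ...poll head_object until the Restore header reports
    # ongoing-request="false", then:
    # s3.download_file(BUCKET, KEY, "restored-archive.tar.zst")

GCP and Azure expose the same two-step pattern (upload with an archive storage class, rehydrate, download) under different names, so I assume any multi-cloud tool is just wrapping these calls.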
[1] https://docs.aws.amazon.com/AmazonS3/latest/dev/storage-class-intro.html#sc-glacier
[2] https://cloud.google.com/storage/docs/storage-classes#archive
[3] https://azure.microsoft.com/en-us/services/storage/archive/
[4] https://aws.amazon.com/s3/pricing/
[5] https://cloud.google.com/storage/pricing#archival-pricing
[6] https://azure.microsoft.com/en-us/pricing/details/storage/blobs/
https://www.tarsnap.com/
Not everyone is a fan, and some are real fans, but you need to assess it for yourself. You may decide the additional features are not worth the additional cost, or you may find they are.
I can offer no advice on the specific question you are asking: having done the risk analysis for my own use cases, I do my backups on 4 TB disks that I handle myself.