Ask HN: Cloud providers as personal cold backups
3 points by jerjerjer on Dec 19, 2020 | 5 comments
I want to store cold backups cheaply. Every few months I prepare a new archive (10-100 GB) that I want to upload and forget until something bad happens.

I'm thinking of using cloud providers for that. Assumptions:

- Data import/export happens entirely from my local PC

- Data retrieval time does not matter

Cloud providers offer cheap cold storage:

- AWS S3 Glacier Deep Archive [1]

- GCP Archive [2]

- Azure Archive [3]

Storage seems to be ~$1/TB/month everywhere.

Retrieval starts at ~$20/TB for AWS (not sure about that one: you first restore from Glacier Deep Archive to S3, then download) [4].

Retrieval for GCP is $51.20/TB [5].

Retrieval in Azure is $20.48/TB [6].

Questions:

1. Is there a GUI or CLI tool to manage my proposed workflow (infrequent cold backups to one or several cloud providers)? I don't want to use their web portals: I'm not a frequent cloud user, all three interfaces are rather confusing and change all the time, and uploading gigabytes from the browser seems like a bad idea.

2. Is there something better for secure cold backups? At $1/TB/month the storage price looks very good. Or am I barking up the wrong tree, and are there dedicated backup services with similar pricing?

[1] https://docs.aws.amazon.com/AmazonS3/latest/dev/storage-class-intro.html#sc-glacier

[2] https://cloud.google.com/storage/docs/storage-classes#archive

[3] https://azure.microsoft.com/en-us/services/storage/archive/

[4] https://aws.amazon.com/s3/pricing/

[5] https://cloud.google.com/storage/pricing#archival-pricing

[6] https://azure.microsoft.com/en-us/pricing/details/storage/blobs/
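For question 1, the upload-and-forget workflow needs only a few AWS CLI commands. A sketch, assuming the `aws` CLI is installed and configured with credentials, and that the bucket and file names below are placeholders:

```shell
# Sketch only: "my-cold-backups" and the archive name are placeholders.
# Upload straight into the Deep Archive storage class:
aws s3 cp archive-2020-12.7z s3://my-cold-backups/ --storage-class DEEP_ARCHIVE

# Years later: request a bulk (cheapest) restore back into S3...
aws s3api restore-object --bucket my-cold-backups --key archive-2020-12.7z \
    --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Bulk"}}'

# ...poll until the restore finishes (look at the "Restore" field), then download:
aws s3api head-object --bucket my-cold-backups --key archive-2020-12.7z
aws s3 cp s3://my-cold-backups/archive-2020-12.7z .
```

The two-step restore (restore-object, then download) is what the "first restore from Glacier Deep Archive to S3 and then download" remark above refers to.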



A different usage model, but one that might actually be more suitable and do some of the work for you, and hence be worth the higher cost:

https://www.tarsnap.com/

Not everyone is a fan, and some are real fans, so you need to assess it for yourself. You may decide the additional features are not worth the additional cost, but they might be.

I can offer no advice on the specific question you are asking: having done my own risk analysis and looked at my usage cases, I do my own backups on 4 TB disks that I handle myself.


That's $256/TB/month for storage and $256/TB for retrieval.

I plan to do encryption on my end, as it's easy to encrypt with 7z using AES-256; it can also encrypt the filenames.
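That 7z step can be sketched as follows (the archive name and source folder are placeholders; assumes the `7z` binary is installed):

```shell
# Sketch: "backup-2020-12.7z" and "my_documents" are placeholder names.
# -p prompts for a password (the .7z format uses AES-256 for encryption);
# -mhe=on also encrypts the archive headers, i.e. the filenames.
7z a -p -mhe=on backup-2020-12.7z my_documents/

# Test the archive before uploading it (prompts for the same password):
7z t backup-2020-12.7z
```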

Thanks, but this is online backup for mutable data, and my use case is cold backup of immutable data.


I figured that your use case was sufficiently different that it wouldn't be your choice, but I thought it worth mentioning to give you something against which to compare other options.

Good luck with the search - do blog about the search and your chosen solution, then submit that here.


If your provider supports SFTP, you can use nFreezer (http://github.com/josephernest/nfreezer), which I built for exactly the same purpose as you. (I first thought about 7z too, but having to reupload the whole archive even if only a few files were modified made me look for another solution.)

In short, you can do

    nfreezer backup my_documents/ user@11.22.33.44:/backup/ 
and the backup will be created or updated (if only 1 MB has changed since the last backup, only 1 MB will be uploaded, not 100 GB).

Everything is encrypted locally (including the filenames) and never decrypted on the remote. You can't use the files on the remote server; it's cold storage only. The only thing you can do with it is restore to your local computer (if one day you need it) and decrypt it locally.
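The "only changed data gets uploaded" idea can be sketched in plain shell. This is an illustrative approximation, not nFreezer's actual mechanism; it hashes files with sha256sum, diffs against the manifest from the previous run, and assumes paths contain no whitespace:

```shell
# Illustrative approximation of "upload only what changed", NOT nFreezer's
# actual code. Sample data is created so the sketch runs end to end.
mkdir -p my_documents
echo "report" > my_documents/a.txt
echo "notes"  > my_documents/b.txt

# First run: hash every file, diff against an (empty) previous manifest.
find my_documents -type f -exec sha256sum {} + | sort > manifest.new
touch manifest.old
comm -13 manifest.old manifest.new | awk '{print $2}' > changed.txt  # both files are new
mv manifest.new manifest.old

# Second run after modifying one file: only that file is listed as changed,
# so only it would need to be re-encrypted and uploaded.
echo "notes v2" > my_documents/b.txt
find my_documents -type f -exec sha256sum {} + | sort > manifest.new
comm -13 manifest.old manifest.new | awk '{print $2}' > changed.txt
mv manifest.new manifest.old
```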

It's 250 lines of code in a single source file, so you can read it quickly to see whether it fits your use case.


You should look at second-tier object storage providers like B2 and Wasabi.

For software, look at Restic.
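Restic pairs naturally with B2. A sketch, assuming a B2 bucket named "my-cold-backups" and B2 credentials exported in the environment (all names here are placeholders; restic prompts for a repository password it uses to encrypt everything client-side):

```shell
# Sketch only: bucket name and credentials are placeholders.
export B2_ACCOUNT_ID=...
export B2_ACCOUNT_KEY=...

restic -r b2:my-cold-backups:/ init                       # one-time: create the encrypted repo
restic -r b2:my-cold-backups:/ backup my_documents/       # incremental, deduplicated backup
restic -r b2:my-cold-backups:/ restore latest --target restored/
```

Like nFreezer, restic uploads only changed data on repeat runs, which matters far less for immutable archives but costs nothing.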



