
My experience with dynamic provisioning has been that it is pretty inelastic, at least at the lower range of capacity. E.g. if you have a few read capacity units and then try to export the data using AWS's CLI client, you can pretty quickly hit the capacity limit and have to start the export over again. Last time, I ended up manually bumping the capacity way up, waiting a few minutes for the new capacity to kick in, and then exporting. Not what I had in mind when I wanted a serverless database!
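
For what it's worth, the dance I ended up doing looks roughly like this with boto3 (table name and capacity numbers are made up; untested sketch):

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Temporarily bump provisioned read capacity way up for the export.
    dynamodb.update_table(
        TableName="my-table",
        ProvisionedThroughput={
            "ReadCapacityUnits": 100,  # way above the usual handful
            "WriteCapacityUnits": 5,   # leave writes alone
        },
    )

    # Wait for the table to go back to ACTIVE with the new capacity.
    dynamodb.get_waiter("table_exists").wait(TableName="my-table")

    # Now scan the whole table, page by page.
    items = []
    for page in dynamodb.get_paginator("scan").paginate(TableName="my-table"):
        items.extend(page["Items"])

    # ...and remember to dial the capacity back down afterwards.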


I understand it's not really your point, but if you're actually looking to export all the data from the table, they've got an API call you can use to have DynamoDB write the whole table to S3. This doesn't consume any of your table's read capacity.

https://docs.aws.amazon.com/amazondynamodb/latest/developerg...
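
Kicking it off with boto3 looks roughly like this (the table ARN and bucket name are placeholders, and the table needs point-in-time recovery enabled; untested sketch):

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Starts an async export that reads from the point-in-time-recovery
    # snapshot rather than the live table, so it uses no read capacity.
    response = dynamodb.export_table_to_point_in_time(
        TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/my-table",
        S3Bucket="my-export-bucket",
        ExportFormat="DYNAMODB_JSON",  # or "ION"
    )

    # Poll describe_export() with this ARN to see when it finishes.
    print(response["ExportDescription"]["ExportArn"])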

Beyond that, though, DynamoDB really isn't designed for that kind of full-table-read use case.


Ah, fair point. Somehow I didn't encounter that when I was trying to export, even though it existed at the time. But it would have solved my problem.



