This repository was archived by the owner on Jun 18, 2020. It is now read-only.
I'd like to use this tool to migrate/backfill records from a number of large (~10–20 GB) DynamoDB tables. We're migrating from regular tables to global tables. Both source and destination use the On-Demand billing mode, and as such the tables do not have provisioned read and write capacity.
The code assumes that tables will have read and write provisioned capacity.
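To illustrate the issue: an On-Demand table's `DescribeTable` response reports `BillingMode` of `PAY_PER_REQUEST` and `ProvisionedThroughput` values of 0, so any code that divides by or rate-limits against the provisioned capacity breaks. A minimal sketch of a workaround (the function name and the fallback default are hypothetical, not part of this tool; the dict shapes mirror DynamoDB `DescribeTable` responses):

```python
def effective_read_capacity(table_desc, on_demand_default=100):
    """Return a read-capacity figure usable for rate limiting.

    On-Demand (PAY_PER_REQUEST) tables report ReadCapacityUnits of 0,
    so fall back to a caller-chosen default instead of dividing by zero.
    """
    billing = table_desc.get("BillingModeSummary", {}).get("BillingMode", "PROVISIONED")
    rcu = table_desc.get("ProvisionedThroughput", {}).get("ReadCapacityUnits", 0)
    if billing == "PAY_PER_REQUEST" or rcu == 0:
        return on_demand_default
    return rcu

# Example response fragments, as returned by DescribeTable:
on_demand = {
    "BillingModeSummary": {"BillingMode": "PAY_PER_REQUEST"},
    "ProvisionedThroughput": {"ReadCapacityUnits": 0, "WriteCapacityUnits": 0},
}
provisioned = {
    "ProvisionedThroughput": {"ReadCapacityUnits": 50, "WriteCapacityUnits": 50},
}
```

Something along these lines (treating On-Demand as a configurable synthetic capacity) is presumably what the open PR does.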
I see that there's a PR open to add this support. I've not tried the branch yet, but I shall. In the meantime, please consider this a 👍 for this feature.
I can confirm that the PR works as advertised. I've used it to migrate data from two smaller tables, and I have a process actively running for a larger ~12 GB table.