amazon web services - Export a DynamoDB table as CSV through AWS CLI (without using pipeline)


Tags : csv, amazon-dynamodb

Top 5 Answers for amazon web services - Export a DynamoDB table as CSV through AWS CLI (without using pipeline)

93 votes

If all items have the same attributes, e.g. id and name, both of which are strings, then run:

aws dynamodb scan \
    --table-name mytable \
    --query "Items[*].[id.S,name.S]" \
    --output text

That will give tab-separated output. You can redirect it to a file using > output.txt, and then convert the tabs into commas for CSV.
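For instance, a minimal sketch of that redirect-and-convert step with tr, assuming no value itself contains a tab or a comma (otherwise you need proper CSV quoting):

aws dynamodb scan \
    --table-name mytable \
    --query "Items[*].[id.S,name.S]" \
    --output text > output.txt

# naive tab-to-comma swap; only safe while no value contains a tab or comma
tr '\t' ',' < output.txt > output.csv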

Note that you may need to paginate per the scan documentation:

If the total number of scanned items exceeds the maximum dataset size limit of 1 MB, the scan stops and results are returned to the user as a LastEvaluatedKey value to continue the scan in a subsequent operation. The results also include the number of items exceeding the limit. A scan can result in no table data meeting the filter criteria.
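If you want to page the scan manually from the CLI, one rough sketch uses the CLI's generic pagination options --max-items and --starting-token (the token shown is a placeholder, not a real value; mytable is an assumed table name):

# first page; when more data remains, the JSON output contains a NextToken
aws dynamodb scan --table-name mytable --max-items 1000 --output json > page1.json

# feed that token back in to continue where the previous call stopped
aws dynamodb scan --table-name mytable --max-items 1000 \
    --starting-token <NextToken-from-page1.json> --output json > page2.json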

Another option is the DynamoDBtoCSV project on GitHub.

80 votes

For a local DynamoDB instance:

aws dynamodb scan --table-name AOP --region us-east-1 --endpoint-url http://localhost:8000 --output json > /home/ohelig/Desktop/a.json

For the AWS-hosted DynamoDB:

aws dynamodb scan --table-name AOP --region us-east-1 --output json > /home/ohelig/Desktop/a.json

Then convert the JSON to CSV in whatever way you prefer.

I have modified the above answer to make it clearer.
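As one rough sketch of that JSON-to-CSV step with jq, assuming the items carry string attributes id and name (placeholders here; substitute your own attribute names and output path):

# pick the two assumed string attributes out of each scanned item and emit CSV rows
jq -r '.Items[] | [.id.S, .name.S] | @csv' /home/ohelig/Desktop/a.json > /home/ohelig/Desktop/a.csv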

72 votes

A better way to do a full export of all columns, without listing them out, is described at Dynamo db export to csv.

Basically:

aws dynamodb scan --table-name my-table --select ALL_ATTRIBUTES \
    --page-size 500 --max-items 100000 --output json \
    | jq -r '.Items' \
    | jq -r '(.[0] | keys_unsorted) as $keys | $keys, map([.[ $keys[] ].S])[] | @csv' > export.my-table.csv
69 votes

You can use jq to convert the JSON output given by the AWS CLI to CSV:

aws dynamodb scan --table-name mytable --query "Items[*].[id.S,name.S]" --output json | jq -r '.[] | @csv' > dump.csv 
60 votes

You can use jq to convert JSON into CSV:

aws dynamodb query \
    --table-name <table-name> \
    --index-name <index-name> \
    --select SPECIFIC_ATTRIBUTES \
    --projection-expression "attributes1, attributes2,..." \
    --key-condition-expression "#index1 = :index1 AND #index2 = :index2" \
    --expression-attribute-names '{"#index1": "index1","#index2": "index2"}' \
    --expression-attribute-values '{":index1": {"S":"key1"},":index2": {"S":"key2"}}' \
    --output json | jq -r '.Items' | jq -r '(.[0] | keys_unsorted) as $keys | $keys, map([.[ $keys[] ][]?])[] | @csv' > output.csv

But be careful: if the items do not all have the same set of attributes, the columns can shift and the output will be wrong.
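One way to guard against that, sketched here under the assumption that every attribute value is a plain string (the S type), is to build the header from the union of attribute names across all items and emit an empty cell wherever an item is missing an attribute:

# header is the union of attribute names seen across items;
# rows stay aligned because missing attributes become ""
aws dynamodb scan --table-name my-table --select ALL_ATTRIBUTES --output json \
    | jq -r '.Items
        | (map(keys) | add | unique) as $keys
        | $keys, map([ .[ $keys[] ] | .S // "" ])[]
        | @csv' > output.csv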
