amazon-web-services, amazon-s3, amazon-dynamodb

Import CSV from S3 And Specify Data Types of Keys for DynamoDB?


I've got both a CSV and a JSON representation of my table. I can use the import-from-S3 feature with the CSV just fine, but I have a key called updatedAt that holds a Unix timestamp. The CSV import makes that key a string, so I can't create a GSI and build number-based query logic in my code. When I try the JSON import, I get the following message in CloudWatch:

"Unexpected token. Remainder of the file will not be processed."

Ultimately, my goal is to create the GSI so that I can use my query logic. I'm trying to see if there's a way to force the CSV import to enforce types. For example, is it possible to create the table first and then use the import only to populate the data, rather than having it both create and populate the table?
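(For what it's worth, a likely cause of that error: the import-from-S3 feature reads DynamoDB JSON, i.e. one object per line with an "Item" wrapper and typed attribute values, not a plain JSON dump of the table. A line for this table might look something like the sketch below, where `id` is a hypothetical partition key; with `"N"` as the type descriptor, updatedAt would import as a number.)

```json
{"Item": {"id": {"S": "abc123"}, "updatedAt": {"N": "1696000000"}}}
```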


Solution

  • I ended up writing a script that adds a key called updatedAtNumber set to the string-to-decimal conversion of updatedAt, deletes updatedAt, recreates updatedAt with the updatedAtNumber values, and finally deletes updatedAtNumber.
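A minimal sketch of that kind of migration script with boto3, assuming a table named `my-table` with partition key `pk` (both hypothetical names). Since non-key attributes in DynamoDB aren't schema-typed, a single `SET` overwriting updatedAt with a Decimal achieves the same end state as the add/delete/recreate dance:

```python
from decimal import Decimal


def to_number(value):
    """Convert a stringified Unix timestamp to a Decimal (DynamoDB number)."""
    return Decimal(str(value))


def migrate(table):
    """Scan every item and rewrite its string updatedAt as a number."""
    scan_kwargs = {}
    while True:
        page = table.scan(**scan_kwargs)
        for item in page["Items"]:
            table.update_item(
                Key={"pk": item["pk"]},  # assumption: 'pk' is the partition key
                UpdateExpression="SET updatedAt = :n",
                ExpressionAttributeValues={":n": to_number(item["updatedAt"])},
            )
        # Scan pages at 1 MB boundaries; keep going until there's no more data.
        if "LastEvaluatedKey" not in page:
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]


if __name__ == "__main__":
    import boto3  # imported here so the helpers above stay dependency-free

    migrate(boto3.resource("dynamodb").Table("my-table"))
```

With updatedAt stored as a number, the GSI can then be created on it and queried with numeric conditions.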