I have CSV data loaded into a stage. The first column contains a timestamp for customer survey responses. Here is what the data looks like:
If I run this command:
SELECT TO_TIMESTAMP( $1, 'MM/DD/YYYY HH12:MI:SS AM') from @CODE_TWO_STAGE;
I get this error:
Can't parse '"8/6/2025 3:27:55 PM"' as timestamp with format 'MM/DD/YYYY HH12:MI:SS AM'
However, if I copy and paste the timestamp data directly into the query like this:
SELECT TO_TIMESTAMP('8/6/2025 3:27:55 PM', 'MM/DD/YYYY HH12:MI:SS AM');
There is no error:
I'm trying to use this data in a Snowpipe, but this error causes the pipe to fail. I have tried modifying the `$1` in a couple of ways, such as `$1::string`, but that has not helped.
Any idea why this is happening?
Data loaded from a stage is delivered as raw text, which may include the enclosing double quotes (e.g. `"8/6/2025 3:27:55 PM"`), especially when the file is a CSV with quoted fields. When you select directly from a stage, you retrieve the value including its quotes; as your first screenshot shows, the values read from the CSV file still contain the double quotes, so the format string fails to match.
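You can confirm this by inspecting the raw staged values. A quick diagnostic sketch (stage name taken from your question):

```sql
-- If the quotes are part of the value, LENGTH($1) will be 2 characters
-- longer than the visible timestamp text.
SELECT $1, LENGTH($1)
FROM @CODE_TWO_STAGE
LIMIT 5;
```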
When you do `SELECT TO_TIMESTAMP('8/6/2025 3:27:55 PM', ...)`, there are no enclosing quotes in the string itself, so the parser matches the format as intended.
The command below worked for me:
SELECT TO_TIMESTAMP(TRIM($1,'"'), 'MM/DD/YYYY HH12:MI:SS AM')
FROM @CODE_TWO_STAGE;
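For a Snowpipe, a cleaner fix than trimming is usually to tell Snowflake that the fields are quoted, so the quotes are stripped during parsing. A sketch, assuming a file format you'd create yourself (`my_csv_format` is an example name):

```sql
-- FIELD_OPTIONALLY_ENCLOSED_BY = '"' strips the enclosing double quotes
-- before the value reaches TO_TIMESTAMP.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

SELECT TO_TIMESTAMP($1, 'MM/DD/YYYY HH12:MI:SS AM')
FROM @CODE_TWO_STAGE (FILE_FORMAT => 'my_csv_format');
```

The same file format can be attached to the stage or to the `COPY INTO` statement in the pipe definition, so the fix applies everywhere the file is read.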