I have an AWS account and I'm using Snowflake hosted on AWS. I'm looking to transfer files from my S3 buckets directly into Snowflake internal stages. I initially considered the COPY command, but it requires either the source or the target to be a table, so that approach won't work here.
If anyone has any suggestions or solutions, I would greatly appreciate your help. Thank you.
One way to solve this is with a Python stored procedure: point it at the staged file in S3 and have it copy the file to an internal stage:
create or replace procedure copy_bin(file_url string, dest_stage string, dest_filename string)
returns variant
language python
runtime_version = '3.8'
packages = ('snowflake-snowpark-python')
handler = 'x'
execute as caller
AS
$$
from snowflake.snowpark.files import SnowflakeFile
import io

def x(session, file_url, dest_stage, dest_filename):
    # Read the staged S3 file into memory via its scoped file URL
    file = io.BytesIO(SnowflakeFile.open(file_url, 'rb').read())
    # Push the bytes to the destination internal stage
    # (upload_stream is an internal connection method, not a public API)
    session._conn.upload_stream(file, dest_stage, dest_filename, compress_data=False)
    return 'file copied to stage [%s] with name [%s]' % (dest_stage, dest_filename)
$$;
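The example call below assumes an external stage over your S3 bucket and an internal stage to receive the copies already exist; the names @s3_stage, @for_images, and my_s3_integration are placeholders for your own objects. A minimal sketch:

create stage s3_stage
  url = 's3://my-bucket/path/'
  storage_integration = my_s3_integration;  -- or supply credentials instead

create stage for_images;  -- internal stage that will hold the copied files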
Call it like:
call copy_bin(
    build_scoped_file_url(@s3_stage, 'sql.like.replace.png'),
    '@for_images',
    'sql.like.replace.png'
);
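build_scoped_file_url generates a temporary scoped URL for the staged S3 file, which is what SnowflakeFile.open reads inside the procedure. If the call succeeds, you can confirm the file landed in the internal stage with:

list @for_images;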