Not sure why I can't find this answer anywhere.
Input CSV:
201800001779830000,"17798320181.pdf",,,159.0,5371.0,,,,,,2019,"{}",2022-08-25 12:58:20.928,2022-08-26 03:13:35.292,airflow,false,"2018-13-FILE-0000177983"
201800002481440000,"24814420181.pdf",,,180.0,7085.0,,,,,,2018,"{}",2022-08-25 12:57:08.403,2022-08-26 03:13:35.292,airflow,false,"2018-13-FILE-0000248144"
...
I want to run each row as a separate command, with certain columns used as path components, resulting in the following commands:
aws s3 cp s3://bucket1/prefix1/2019/17798320181.pdf s3://bucket1/prefix2/2019/17798320181.pdf
aws s3 cp s3://bucket1/prefix1/2018/24814420181.pdf s3://bucket1/prefix2/2018/24814420181.pdf
...
So I'd like to use columns 2 and 12 with awk (or similar) to generate these aws commands and run them in succession. Each row should run the command, not just print it. Thanks!
EDIT: If this can be done as a one-liner (using awk, sed, or whatever else), that would be preferred.
Taking column 2 as the filename and column 12 as the year:

awk -F, '{gsub(/"/, "", $2); print "s3://bucket1/prefix1/" $12 "/" $2 " s3://bucket1/prefix2/" $12 "/" $2}' FILE | xargs -tL1 aws s3 cp

The gsub strips the surrounding double quotes from the filename field, and xargs -tL1 runs aws s3 cp once per line, echoing each command before it executes.
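If you'd rather avoid the awk/xargs pipeline, here is a plain-shell sketch of the same idea. It assumes no field contains an embedded comma (true for the sample rows shown); the here-doc stands in for the real CSV, so replace it with `< yourfile.csv` for actual use, and drop the `echo` once the output looks right.

```shell
#!/bin/sh
# Read each CSV row, pull field 2 (quoted filename) and field 12 (year),
# and build the copy command. "echo" is a dry-run guard.
while IFS=, read -r _ file _ _ _ _ _ _ _ _ _ year _; do
  file=${file%\"}; file=${file#\"}   # strip the surrounding double quotes
  echo aws s3 cp "s3://bucket1/prefix1/$year/$file" \
                 "s3://bucket1/prefix2/$year/$file"
done <<'EOF'
201800001779830000,"17798320181.pdf",,,159.0,5371.0,,,,,,2019,"{}",2022-08-25 12:58:20.928,2022-08-26 03:13:35.292,airflow,false,"2018-13-FILE-0000177983"
201800002481440000,"24814420181.pdf",,,180.0,7085.0,,,,,,2018,"{}",2022-08-25 12:57:08.403,2022-08-26 03:13:35.292,airflow,false,"2018-13-FILE-0000248144"
EOF
```

Because IFS is set to a comma only, the spaces inside the timestamp fields don't split anything; everything after field 12 lands in the final throwaway variable.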