linux bash timeout sftp mget

mget in a Bash script fails on large files because of a timeout error


I am trying to execute a bash script that uses mget *.* to download all the files in a directory. It downloads a couple of files from File1, but it skips the File2 part, probably because of a timeout error. I believe it is a timeout error because:

1) I tried the same thing on other directories and it worked perfectly, probably because the files there are smaller than the ones in File2.

2) When I tried the same commands interactively on the command line:

sftp username@hostname 
cd file2
mget *.* 

it took 40 seconds to a minute to respond, but it did eventually download all the files.

So my guess is that, when the bash script runs, it stops because of a timeout. Please suggest a workaround. Below is my bash script.

#!/bin/bash
# test purpose only
export Src_Dir=/path

File1='/path/*.*'
File2='/path/Archive/*.*'
DATE=$(date +"%Y-%m-%d")
Pass_Pwd='password'
PORT=22

cd "$Src_Dir" || { echo 'Failed to chdir into $Src_Dir' ; exit 0; }

/usr/bin/expect <<EOD
spawn /usr/bin/sftp -o Port=${PORT} username@host
expect "password:"
send "$Pass_Pwd\r"
expect "sftp>"
send "lcd ${Src_Dir}\r"
expect "sftp>"
send "mget ${File1}\r"
expect "sftp>"
send "mget ${File2}\r"
expect "sftp>"
send "bye\r"
EOD
echo "Download done"

Solution

  • I added set timeout -1 just above the spawn command, and then it worked perfectly :) (the sketch below shows where it goes)

    Thank you guys :)
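
For reference, here is a minimal sketch of the expect block with that single fix applied; everything else is taken unchanged from the question's script (host, password variable, and paths are the question's placeholders). By default, expect waits at most 10 seconds for each expect pattern, so while a long mget is still transferring, the next expect "sftp>" gives up, the script races through the remaining send commands, and expect exits before the transfer has finished. set timeout -1 makes expect wait indefinitely for each prompt.

/usr/bin/expect <<EOD
# wait indefinitely for each expected prompt instead of expect's 10-second default
set timeout -1
spawn /usr/bin/sftp -o Port=${PORT} username@host
expect "password:"
send "$Pass_Pwd\r"
expect "sftp>"
send "lcd ${Src_Dir}\r"
expect "sftp>"
send "mget ${File1}\r"
expect "sftp>"
send "mget ${File2}\r"
expect "sftp>"
send "bye\r"
EOD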