I want to push files to a Windows EC2 instance and also pull folders from my EC2 instances and store them in S3. As I have many EC2 instances, I wanted to automate this. I currently have a Node.js server on ECS that uses the ssh2-sftp-client library. With the following piece of code I'm able to push files to an instance (and, similarly, I'm able to extract files from it and upload them to S3):
const Client = require('ssh2-sftp-client');
const sftp = new Client(newClientID);

sftp.connect({
    host: host,
    username: 'username',
    password: 'password',
    port: 22,
    tryKeyboard: true
}).then(async () => {
    try {
        // Log and close on connection-level errors
        sftp.on('error', error => {
            console.log(error);
            sftp.end();
        });
        if (file.filename && file.file) {
            await sftp.put(file.file, `C:/Users/user/Desktop/${file.filename}`);
            console.log(`Successfully pushed ${file.filename}`);
        }
        resolve(); // resolve() comes from the enclosing Promise wrapper (not shown here)
    } catch (ex) {
        console.log("SFTP EXCEPTION PUSHING FILES TO INSTANCE", ex);
    } finally {
        sftp.end();
    }
}).catch(err => {
    console.log("SFTP CONNECTION ERROR", err);
});
But this is not a robust solution: even with just 5 or 6 users pushing at the same time, the server errors out because it has too many active SSH connections.
Is there a better way to do this? All I want to do is upload/download specific directories using Node.js (for a Windows EC2).
One good way to automate running scripts on EC2 is SSM Run Command. If you set up each EC2 instance correctly (SSM Agent running and an instance profile that grants the SSM permissions), it becomes a managed instance, and you can trigger scripts across a whole fleet of instances, selected by tags, for example. Because the commands go through the SSM API rather than SSH, your Node.js server never has to hold a connection open per instance or per user.
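As a minimal sketch of how that could look from your existing Node.js service, assuming the AWS SDK for JavaScript v3 (@aws-sdk/client-ssm), that the instances are already SSM managed instances with an instance profile that also allows S3 access, and that the AWS Tools for PowerShell are available on them (they ship with Amazon's Windows AMIs). The region, tag key, bucket and folder names below are placeholders:

const { SSMClient, SendCommandCommand } = require('@aws-sdk/client-ssm');

const ssm = new SSMClient({ region: 'us-east-1' });

// Ask every instance carrying the given tag to upload a local folder to S3.
async function copyFolderToS3(tagValue, localDir, bucket, keyPrefix) {
    const command = new SendCommandCommand({
        // AWS-RunPowerShellScript is the built-in SSM document for Windows targets.
        DocumentName: 'AWS-RunPowerShellScript',
        Targets: [{ Key: 'tag:Environment', Values: [tagValue] }],
        Parameters: {
            commands: [
                // Write-S3Object (AWS Tools for PowerShell) uploads a whole folder;
                // Read-S3Object does the reverse if you need to push files onto the instance.
                `Write-S3Object -BucketName ${bucket} -Folder "${localDir}" -KeyPrefix ${keyPrefix} -Recurse`
            ]
        }
    });
    const response = await ssm.send(command);
    console.log('Sent SSM command', response.Command.CommandId);
    return response.Command.CommandId;
}

The call returns immediately with a command ID; you can poll GetCommandInvocation to find out when each instance has finished. Since the transfer runs on the instance itself and talks to S3 directly, your ECS service never touches the files and never opens an SFTP connection at all.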