I have a Java application that needs the ability to upload and download large files to and from an Amazon S3 storage area.
I've been pleasantly surprised at how quickly large files can be uploaded: really just a matter of seconds.
I've also been pretty happy with how quickly it can download those same files and convert them to a byte array.
What is way too slow, though, is reading the byte array into an actual file.
Any suggestions for how to get this last part to go faster?
Here is my code:
// Get the response - this is actually quite fast
ResponseInputStream<GetObjectResponse> getResponse = s3Client.getObject(request);
byte[] responseBytes = getResponse.readAllBytes();
// Download to a file - this is extremely slow
File outputFile = new File(downloadPath);
try (FileOutputStream fileOutputStream = new FileOutputStream(outputFile)) {
    for (int ii = 0; ii < responseBytes.length; ii++) {
        fileOutputStream.write(responseBytes, ii, 1);
    }
}
Writing the file byte by byte incurs the overhead of a system call for every single byte. Fortunately, there's an overload of write that takes an entire byte[] and writes it out with far fewer system calls:
try (FileOutputStream fileOutputStream = new FileOutputStream(outputFile)) {
    fileOutputStream.write(responseBytes);
}
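If memory use matters for truly large objects, you can also skip the intermediate byte array entirely and stream the response straight to disk. Here's a minimal, self-contained sketch of that idea using plain java.io streams (a ByteArrayInputStream stands in for the S3 response stream, since ResponseInputStream is itself just an InputStream; transferTo is available since Java 9):

```java
import java.io.ByteArrayInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamToFile {
    public static void main(String[] args) throws IOException {
        // Stand-in for the S3 response stream; any InputStream behaves the same here.
        byte[] payload = new byte[1 << 20]; // 1 MiB of dummy data
        InputStream response = new ByteArrayInputStream(payload);

        Path outputFile = Files.createTempFile("download", ".bin");
        try (InputStream in = response;
             FileOutputStream out = new FileOutputStream(outputFile.toFile())) {
            // transferTo copies through an internal buffer in large chunks,
            // so there is no per-byte system call and no need to hold the
            // whole object in memory first.
            long copied = in.transferTo(out);
            System.out.println("copied " + copied + " bytes");
        }
        Files.delete(outputFile);
    }
}
```

With the AWS SDK for Java v2 there's an even shorter route, if it fits your setup: the getObject overload that takes a destination Path, e.g. s3Client.getObject(request, outputFile.toPath()), downloads directly to the file in one call.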