I'm using this function to convert a file size in bytes to a human-readable file size:
function getReadableFileSizeString(fileSizeInBytes) {
    var i = -1;
    var byteUnits = [' kB', ' MB', ' GB', ' TB', ' PB', ' EB', ' ZB', ' YB'];
    do {
        // Step up one unit each time the value is still 1024 or more.
        fileSizeInBytes /= 1024;
        i++;
    } while (fileSizeInBytes >= 1024 && i < byteUnits.length - 1);
    // Clamp tiny values so anything under ~102 bytes still prints as "0.1 kB".
    return Math.max(fileSizeInBytes, 0.1).toFixed(1) + byteUnits[i];
}
console.log(getReadableFileSizeString(1551859712)); // output is "1.4 GB"
However, it seems like this isn't 100% accurate. For example:
getReadableFileSizeString(1551859712); // output is "1.4 GB"
Shouldn't this be "1.5 GB"? It seems like the division by 1024 is losing precision. Am I totally misunderstanding something, or is there a better way to do this?
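For reference, here is the raw arithmetic behind the two figures (just a quick check in the console):

console.log(1551859712 / Math.pow(1024, 3)); // 1.4452... — what the function computes
console.log(1551859712 / 1e9);               // 1.5518... — where my "1.5 GB" expectation comes from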
It depends on whether you want to use the binary or decimal convention.
RAM, for instance, is always measured in binary, and 1551859712 / 1024³ ≈ 1.445, so expressing it as ~1.4 GiB would be correct.
On the other hand, hard disk manufacturers like to use decimal, where 1551859712 / 1000³ ≈ 1.552, so they would call it ~1.6 GB.
And just to be confusing, floppy disks use a mixture of the two systems: their "1 MB" is actually 1,024,000 bytes (1000 × 1024).
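For what it's worth, here is a sketch of one way to support both conventions in a single function; the humanFileSize name and the si parameter are placeholders of my own, not any standard API:

function humanFileSize(bytes, si) {
    // si === true -> decimal (1000-based, kB/MB/GB);
    // si === false -> binary (1024-based, KiB/MiB/GiB).
    var thresh = si ? 1000 : 1024;
    if (Math.abs(bytes) < thresh) {
        return bytes + ' B';
    }
    var units = si
        ? ['kB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB']
        : ['KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB', 'YiB'];
    var u = -1;
    do {
        bytes /= thresh;
        u++;
    } while (Math.abs(bytes) >= thresh && u < units.length - 1);
    return bytes.toFixed(1) + ' ' + units[u];
}

console.log(humanFileSize(1551859712, true));  // "1.6 GB"  (decimal convention)
console.log(humanFileSize(1551859712, false)); // "1.4 GiB" (binary convention)

The binary branch deliberately uses the IEC unit names (KiB, MiB, ...) so the output makes the chosen convention explicit.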