node.js, express, encoding, opentype, knox-amazon-s3-client

OTF files retrieved from AWS S3 are broken


I can successfully retrieve my OTF file from AWS using Knox, but when I go to use the file it is broken. I believe this is an encoding issue, but honestly I'm not sure.

The saved file ends up larger than the file's actual size.

Below is a simplified example:

var fs   = require('fs');
var knox = require('knox');

var client = knox.createClient({
    key:    '************', 
    secret: '************',
    bucket: '************'
});
client.get(otfFile).on('response', function(res){
    var file = "";
    res.setEncoding("binary");
    res.on('data', function(chunk){  

        file += chunk;
    });
    res.on('end', function() { 

        // Save File
        fs.writeFile( filepath, file, function(err) {

            if (err) console.error(err);
        }); 
    });
}).end();

Do you know how to fix it or have an idea as to what is wrong?


Solution

  • The short answer: OTFs require ISO-8859-1 encoding. :)

    It seems that the issue is that OTF files are encoded in ISO-8859-1, but Node doesn't provide that encoding out of the box. You can simply GET the file and then convert it afterwards with a package like Iconv: https://github.com/bnoordhuis/node-iconv

    var fs    = require('fs');
    var knox  = require('knox');
    var Iconv = require('iconv').Iconv;

    var client = knox.createClient({
        key:    '************', 
        secret: '************',
        bucket: '************'
    });
    client.get(otfFile).on('response', function(res){
        var file = "";
        res.setEncoding("utf8");
        res.on('data', function(chunk){  
    
            file += chunk;
        });
        res.on('end', function() { 
    
            // Encode: convert the collected UTF-8 string to ISO-8859-1 before writing
            var iconv = new Iconv('UTF-8', 'ISO-8859-1');
            file = iconv.convert(file);
    
            // Save File
            fs.writeFile( filepath, file, function(err) {
    
                if (err) console.error(err);
            }); 
        });
    }).end();
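
    If you want to confirm the fix, one quick sanity check is to compare the size of the file you wrote against the Content-Length header S3 sent: with the broken version the file on disk comes out larger, and after the conversion the two should match. Below is a minimal illustrative sketch, not part of the original answer; it assumes it runs somewhere where `res` (the Knox response) and `filepath` are still in scope, e.g. inside the fs.writeFile callback, and the `expected` variable name is just my own.

    // Sanity check (illustrative): compare bytes written against S3's Content-Length.
    // Assumes `res` is the Knox/HTTP response and `filepath` is the path written above.
    var expected = parseInt(res.headers['content-length'], 10);

    fs.stat(filepath, function(err, stats) {

        if (err) return console.error(err);
        console.log('expected', expected, 'bytes, wrote', stats.size, 'bytes');
    });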