What is the best way to feed a huge CSV file to the LZ4 compression API? The following reads the whole file into memory at once, which isn't optimal for very large files:
byte[] data = file.getBytes();
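A streaming alternative would avoid holding the whole file in memory. As a minimal sketch, assuming the lz4-java binding (org.lz4:lz4-java) and its net.jpountz.lz4.LZ4FrameOutputStream class; adjust to whichever LZ4 binding you actually use:

// Sketch: stream-compress a large CSV with lz4-java, so only a small
// chunk of the file is in memory at any time (assumes org.lz4:lz4-java).
import net.jpountz.lz4.LZ4FrameOutputStream;

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Lz4StreamCompress {
    public static void main(String[] args) throws IOException {
        try (OutputStream out = new LZ4FrameOutputStream(
                Files.newOutputStream(Paths.get("myFile.csv.lz4")))) {
            // Files.copy streams the source through a fixed-size buffer
            // instead of loading the entire file into a byte[].
            Files.copy(Paths.get("myFile.csv"), out);
        }
    }
}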
You can try https://github.com/flanglet/kanzi. The block compressor takes an InputStream and splits it into blocks (you specify the block size on the command line).
From the Wiki, you can do something like this:
java -cp kanzi.jar kanzi.app.BlockCompressor --input=myFile.csv --output=myFile.knz --overwrite --block=8M --transform=lz --entropy=none
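If you need to run this from inside a Java program rather than a shell, one option is simply launching that same command with ProcessBuilder. A minimal sketch reusing the flags above; the jar and file paths are placeholders:

import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class RunKanziBlockCompressor {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Same invocation as the command line above; adjust the jar/file paths.
        List<String> cmd = Arrays.asList(
                "java", "-cp", "kanzi.jar", "kanzi.app.BlockCompressor",
                "--input=myFile.csv", "--output=myFile.knz", "--overwrite",
                "--block=8M", "--transform=lz", "--entropy=none");
        Process p = new ProcessBuilder(cmd)
                .inheritIO()   // show the compressor's console output
                .start();
        int exit = p.waitFor();
        System.out.println("BlockCompressor exited with code " + exit);
    }
}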