The memory allocation error seems to occur once the code hits the for loop, and only for very large storesLists. I'm only able to hit the memory allocation error when storesList.size() is around 2 million. I've pasted a snippet of the for loop, which is supposed to create a new row from each object in the list.
The function takes in a ZipOutputStream and a list of objects.
public ZipOutputStream setStream(ZipOutputStream zos, List<Stores> storesList) throws IOException {
    zos.setLevel(1);
    StringBuilder result = new StringBuilder();
    StringBuilder newRow = new StringBuilder();
    for (Stores s : storesList) {
        // clearing the StringBuilder each iteration
        newRow.setLength(0);
        newRow.append(s.getId());
        newRow.append(",")
              .append(s.getTitle())
              .append(s.getDescription());
        newRow.append("\n");
        // every row is accumulated into one large buffer
        result.append(newRow);
    }
    byte[] data = result.toString().getBytes();
    zos.write(data, 0, data.length);
    zos.closeEntry();
    zos.close();
    return zos;
}
What should I be changing so that I am able to process very large lists?
Don't build an in-memory structure when you can stream the same data directly: a huge structure may hit an OutOfMemoryError either while it grows in the loop or on the toString() or getBytes() calls.
If you want the platform's line separator between rows, you can write them out like this:
PrintWriter w = new PrintWriter(zos /*, false, charset (Java 10+) */);
for (Stores s : storesList) {
    w.println(s.getId() + "," + s.getTitle() + s.getDescription());
}
w.flush();
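Note that PrintWriter swallows IOExceptions; if you need to detect write failures, you have to poll w.checkError() after flushing.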
But if you want \n line separators on all platforms, the rows can be written using:
OutputStreamWriter w = new OutputStreamWriter(zos /*, charset */);
for (Stores s : storesList) {
    w.write(s.getId() + "," + s.getTitle() + s.getDescription() + "\n");
}
w.flush();
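With around 2 million rows it is also worth wrapping the writer in a BufferedWriter to avoid frequent converter invocations, as the OutputStreamWriter Javadoc recommends; the sketch at the end does this.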
The writers must be flushed to ensure any buffered data is written to the ZipOutputStream before the zip entry or stream is closed. Don't close() the writer itself, though: that would also close the underlying ZipOutputStream before closeEntry() could run.
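Putting it together, and assuming, as in your original method, that the caller has already opened the zip entry with putNextEntry(), a minimal sketch of the streamed version could look like:

import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.List;
import java.util.zip.ZipOutputStream;

public ZipOutputStream setStream(ZipOutputStream zos, List<Stores> storesList) throws IOException {
    zos.setLevel(1);
    // Stream each row straight into the zip entry instead of
    // accumulating millions of rows in one StringBuilder.
    Writer w = new BufferedWriter(new OutputStreamWriter(zos /*, charset */));
    for (Stores s : storesList) {
        w.write(s.getId() + "," + s.getTitle() + s.getDescription() + "\n");
    }
    // Flush, don't close: closing the writer would also close zos
    // before closeEntry() runs.
    w.flush();
    zos.closeEntry();
    zos.close();
    return zos;
}

This way memory use stays roughly constant no matter how large storesList gets.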