I have a situation in my program where I need to access a certain number of images saved on my hard drive. I could either load them only once I really need them, or load them all at startup.
Out of curiosity I tried to read all (around 11k) images from several folders and store them in several Lists. I wondered whether this would take too long, but instead I got an OutOfMemoryError after reading about 9k images.
My JRE has a heap size of 1g (-Xmx1g).
Why does this happen with my code? Am I producing a memory leak? Do you have any suggestions on what to change to avoid this, or is the solution simply to read the files only when I need them? The overall size of all the files is only around 40 MB, so I thought it would be okay to keep them in memory (40 MB doesn't seem like much out of my 1 GB heap). Or does Java perform some crazy magic that multiplies the file size by a large factor once loaded?
I have done some research on Stack Overflow but couldn't really apply the given answers (like from here or here) to my case. So if any of you has an idea, I would be really happy :)
My code is here, sorry if it's a bit ugly - I was just messing around:
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import javax.imageio.ImageIO;

public class FatfontsManager {

    private List<BufferedImage> until9;
    private List<BufferedImage> until99;
    private List<BufferedImage> until999;
    private List<BufferedImage> until9999;

    // loads and keeps all images
    public FatfontsManager() {
        until9 = new ArrayList<>();
        until99 = new ArrayList<>();
        until999 = new ArrayList<>();
        until9999 = new ArrayList<>();

        String absolutPathToThisProject = new java.io.File("")
                .getAbsolutePath();

        // load Images
        for (int i = 1; i < 10; i++) {
            until9.add(loadImage(absolutPathToThisProject
                    + "\\latexDir\\0..9\\" + i + ".png"));
        }
        System.out.println("1 - 9 loaded");

        for (int i = 1; i < 100; i++) {
            until99.add(loadImage(absolutPathToThisProject
                    + "\\latexDir\\0..99\\" + i + ".png"));
        }
        System.out.println("1 - 99 loaded");

        for (int i = 1; i < 1000; i++) {
            until999.add(loadImage(absolutPathToThisProject
                    + "\\latexDir\\0..999\\" + i + ".png"));
        }
        System.out.println("1 - 999 loaded");

        for (int i = 1; i < 10000; i++) {
            until9999.add(loadImage(absolutPathToThisProject
                    + "\\latexDir\\0..9999\\" + i + ".png"));
        }
        System.out.println("1 - 9999 loaded");
    }

    public static void main(String[] args) {
        new FatfontsManager();
    }

    public static BufferedImage loadImage(String ref) {
        BufferedImage bimg = null;
        try {
            bimg = ImageIO.read(new File(ref));
        } catch (Exception e) {
            System.err.println("Error loading image file " + ref);
            e.printStackTrace();
        }
        return bimg;
    }
}
Thanks a lot for every answer.
The overall size of all files is only around 40mb ...
But when you load the image files into memory, the images will be decompressed and turned into arrays of pixels. Depending on the original image file format (and compression parameters), this can require an order of magnitude or more heap space than the files take on disk.
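To make that concrete, here is a rough back-of-the-envelope sketch. A BufferedImage typically stores pixels uncompressed at about 4 bytes each (e.g. TYPE_INT_ARGB), regardless of how small the PNG is on disk. The 200x200 dimensions below are purely an assumed example, not taken from the question:

```java
public class ImageHeapEstimate {

    // Rough heap cost of one decoded image: roughly 4 bytes per pixel
    // for a TYPE_INT_ARGB raster (ignores object headers and metadata).
    static long estimateBytes(int width, int height) {
        return (long) width * height * 4;
    }

    public static void main(String[] args) {
        // Hypothetical 200x200 image: a PNG of a few KB on disk
        // still needs 200 * 200 * 4 = 160,000 bytes once decoded.
        long perImage = estimateBytes(200, 200);

        // Scaled up to the ~11k images in the question, that is
        // 160,000 * 11,000 = 1,760,000,000 bytes -- already more
        // than the 1 GB (-Xmx1g) heap, matching the OutOfMemoryError.
        long allImages = perImage * 11_000;

        System.out.println(perImage);
        System.out.println(allImages);
    }
}
```

So a 40 MB collection of compressed PNGs can easily expand to well over a gigabyte of heap, depending on the images' actual dimensions.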
It is not entirely clear what you are trying to achieve, but maybe you should consider only caching these images in memory when they are actually going to be used.
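One way to do that is a lazy cache: load an image the first time it is requested and keep it for later calls. The sketch below is illustrative (the class and names are mine, not from the question); the loader function is passed in, so in your case it could simply be `FatfontsManager::loadImage`:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal lazy cache: the value for a key is computed on first access
// via the supplied loader and then reused on every later access.
public class LazyCache<K, V> {

    private final Map<K, V> cache = new HashMap<>();
    private final Function<K, V> loader;

    public LazyCache(Function<K, V> loader) {
        this.loader = loader;
    }

    // Returns the cached value, loading it first if necessary.
    public V get(K key) {
        return cache.computeIfAbsent(key, loader);
    }

    public int size() {
        return cache.size();
    }
}
```

Used as `LazyCache<String, BufferedImage> images = new LazyCache<>(FatfontsManager::loadImage);`, only the images you actually display ever occupy heap. If memory is still tight, you could bound the cache as well, e.g. by switching the backing map to a `LinkedHashMap` with an overridden `removeEldestEntry`.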