Tags: java, postgresql, out-of-memory, bufferedoutputstream

Buffered writing to database to avoid OutOfMemoryError


Let's assume you have the following code:

FooParser parser = new FooParser();
List<Foo> fooList = new ArrayList<Foo>();

// try-with-resources so the reader is closed even if reading fails
try (BufferedReader br = new BufferedReader(new FileReader("MyFile.txt"))) {
    String line = br.readLine();
    while (line != null && !line.isEmpty()) {
        Foo foo = parser.parse(line);
        fooList.add(foo);
        line = br.readLine();
    }
} catch (IOException ioe) {
    // handle it
}

and later on that list is used to write the objects into a database (PostgreSQL to be exact) with the following call:

dao.write(fooList);

Now, the code above may seem simple and easy to follow; however, the problem arises when there are too many Foo objects in the list, to the point that it causes an OutOfMemoryError.

So, I was wondering: is there a pattern or convention for doing 'buffered' writes to the database, to avoid holding large numbers of objects in memory before they are eventually stored?

I was thinking of going along these lines, inspired by BufferedOutputStream: keep a buffer (of configurable length) of objects and, once the buffer fills up, write and flush it to the database.
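Something like this minimal sketch, for illustration only (BufferedFooWriter and the FooDao interface are hypothetical names I made up; Foo and the dao.write(List<Foo>) call are taken from the code above):

import java.util.ArrayList;
import java.util.List;

// Assumed minimal DAO interface, matching the dao.write(fooList) call above.
interface FooDao {
    void write(List<Foo> foos);
}

// Hypothetical buffered writer in the spirit of BufferedOutputStream.
public class BufferedFooWriter {

    private final FooDao dao;
    private final int bufferSize;               // configurable buffer length
    private List<Foo> buffer = new ArrayList<Foo>();

    public BufferedFooWriter(FooDao dao, int bufferSize) {
        this.dao = dao;
        this.bufferSize = bufferSize;
    }

    // Buffer one object; write the whole buffer out once it is full.
    public void add(Foo foo) {
        buffer.add(foo);
        if (buffer.size() >= bufferSize) {
            flush();
        }
    }

    // Write whatever is buffered, then start over with an empty buffer.
    public void flush() {
        if (!buffer.isEmpty()) {
            dao.write(buffer);
            buffer = new ArrayList<Foo>();
        }
    }
}

The reading loop would then call add(foo) for each parsed line and flush() once after the loop, just as BufferedOutputStream flushes its remaining bytes on close().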

Since my hands are quite tied here (I am only allowed to do small bugfixes, not change the architecture), would this be a good enough approach, or are there better ones?

Thank you :D


Solution

  • You have to add some buffering logic. The buffer size below is 100: the list is written out and reset every 100 objects, and whatever is left over is written after the loop.

    String line = br.readLine();
    while (line != null && !line.isEmpty()) {
        Foo foo = parser.parse(line);
        fooList.add(foo);
        // buffer is full: write it out and continue with a fresh, empty list
        if (fooList.size() == 100) {
            dao.write(fooList);
            fooList = new ArrayList<Foo>();
        }
        line = br.readLine();
    }
    // write the remaining objects (fewer than 100) after the loop
    if (!fooList.isEmpty()) {
        dao.write(fooList);
    }
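
  • If the DAO itself is ever open to a small fix, a common companion to this buffering is a JDBC batch insert, so each buffer goes to PostgreSQL in one round trip instead of one per row. A hedged sketch of what dao.write(fooList) might look like internally, assuming plain JDBC, a foo table with a single value column, and a Foo.getValue() accessor (all of these are assumptions, not part of the question):

    // Hypothetical DAO internals: the foo table, its value column, and
    // Foo.getValue() are assumed. Needs java.sql.PreparedStatement and
    // java.sql.SQLException imports, plus a java.sql.Connection field
    // named connection.
    public void write(List<Foo> fooList) throws SQLException {
        String sql = "INSERT INTO foo (value) VALUES (?)";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            for (Foo foo : fooList) {
                ps.setString(1, foo.getValue());
                ps.addBatch();     // queue the row locally
            }
            ps.executeBatch();     // send the whole buffer in one round trip
        }
    }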