Tags: java, hashmap, heap-memory, g1gc

java.lang.OutOfMemoryError: GC overhead limit exceeded


I am getting this error in a program that creates several hundred thousand HashMap objects, each with a few (15-20) text entries. All of these Strings have to be collected (without breaking them up into smaller amounts) before being submitted to a database.

According to Sun, the error happens "if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown".

Apparently, one could use the command line to pass arguments to the JVM for

  • increasing the heap size, via -Xmx1024m
  • disabling the error check altogether, via -XX:-UseGCOverheadLimit

The first approach works fine; the second ends up in another java.lang.OutOfMemoryError, this time about the heap.
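For concreteness, the two command-line workarounds look like this (the jar name is a placeholder; substitute your own launch command):

```shell
# Option 1: raise the maximum heap size.
java -Xmx1024m -jar app.jar

# Option 2: disable the GC-overhead check. Note this usually just trades
# the "GC overhead limit exceeded" error for a plain heap OutOfMemoryError,
# as described above.
java -XX:-UseGCOverheadLimit -jar app.jar
```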

So, the question: is there any programmatic alternative to this, for the particular use case (i.e., several small HashMap objects)? If I use the HashMap clear() method, for instance, the problem goes away, but so does the data stored in the HashMap! :-)

The issue is also discussed in a related topic on Stack Overflow.


Solution

  • You're essentially running out of memory to run the process smoothly. Options that come to mind:

    1. Specify more memory, as you mentioned; try something in between, such as -Xmx512m, first
    2. Work with smaller batches of HashMap objects, processing and persisting each batch before building the next, if possible
    3. If you have a lot of duplicate strings, use String.intern() on them before putting them into the HashMap
    4. Use the HashMap(int initialCapacity, float loadFactor) constructor to tune for your case
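Options 2-4 can be combined in one collector: accumulate records in bounded batches, intern duplicate strings, and pre-size each HashMap for the 15-20 entries described in the question. This is a sketch, not the asker's code; BATCH_SIZE and the submitToDatabase() call are hypothetical placeholders you would replace with your own persistence logic.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchCollector {
    // Hypothetical batch size; tune it to your heap budget.
    private static final int BATCH_SIZE = 10_000;

    private final List<Map<String, String>> batch = new ArrayList<>();

    public void add(Map<String, String> record) {
        // Pre-size for ~20 entries: capacity 32 at the default 0.75 load
        // factor holds up to 24 entries without resizing.
        Map<String, String> compact = new HashMap<>(32, 0.75f);
        for (Map.Entry<String, String> e : record.entrySet()) {
            // intern() makes duplicate strings share one canonical instance.
            compact.put(e.getKey().intern(), e.getValue().intern());
        }
        batch.add(compact);
        if (batch.size() >= BATCH_SIZE) {
            flush();
        }
    }

    public void flush() {
        if (batch.isEmpty()) return;
        // submitToDatabase(batch); // hypothetical: persist before clearing
        batch.clear(); // now safe: the data has already been written out
    }

    public int pending() {
        return batch.size();
    }
}
```

Unlike calling clear() on the maps you still need, flush() only clears after the batch has been handed off, so nothing is lost while the live set stays bounded at BATCH_SIZE records.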