I'm getting the error below when trying to render a page using Velocity via the Turbine servlet. The thing is, I have tons of memory, and the servlet itself never crashes; it just fails on this request. The page it's trying to render is maybe 10 MB.
Anyone have any thoughts/suggestions?
java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:2271)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
    at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:221)
    at sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:282)
    at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:125)
    at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:135)
    at java.io.OutputStreamWriter.write(OutputStreamWriter.java:220)
    at java.io.Writer.write(Writer.java:157)
    at org.apache.velocity.runtime.parser.node.ASTReference.render(ASTReference.java:321)
    at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:94)
    at org.apache.velocity.runtime.parser.node.ASTIfStatement.render(ASTIfStatement.java:109)
    at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:94)
    at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:271)
    at org.apache.velocity.runtime.parser.node.ASTIfStatement.render(ASTIfStatement.java:128)
    at org.apache.velocity.runtime.parser.node.ASTBlock.render(ASTBlock.java:94)
    at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:271)
    at org.apache.velocity.runtime.parser.node.ASTIfStatement.render(ASTIfStatement.java:128)
    at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:271)
    at org.apache.velocity.Template.merge(Template.java:296)
    at org.apache.velocity.app.Velocity.mergeTemplate(Velocity.java:492)
    at org.apache.velocity.app.Velocity.mergeTemplate(Velocity.java:461)
    at org.apache.turbine.services.velocity.TurbineVelocityService.executeRequest(TurbineVelocityService.java:455)
    at org.apache.turbine.services.velocity.TurbineVelocityService.handleRequest(TurbineVelocityService.java:321)
    at org.apache.turbine.services.velocity.TurbineVelocity.handleRequest(TurbineVelocity.java:109)
    at org.apache.turbine.modules.layouts.VelocityOnlyLayout.doBuild(VelocityOnlyLayout.java:155)
    at org.apache.turbine.modules.Layout.build(Layout.java:91)
    at org.apache.turbine.modules.LayoutLoader.exec(LayoutLoader.java:138)
    at org.apache.turbine.modules.pages.DefaultPage.doBuild(DefaultPage.java:191)
    at org.apache.turbine.modules.Page.build(Page.java:91)
    at org.apache.turbine.modules.PageLoader.exec(PageLoader.java:136)
JAVA_OPTS= -Xms4096M -Xmn2048M -Xmx13128M
Memory usage under top never gets above 100M.
I suspect that your -Xmn setting is causing the problem, by reserving too large a chunk of the initial heap for the young generation. I would suggest running your server without it and seeing what happens.
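For example, something along these lines, keeping your other values and just dropping the explicit young-generation size (illustrative only, tune as needed):

JAVA_OPTS= -Xms4096M -Xmx13128M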
My reasoning is that the failure occurs in ByteArrayOutputStream.grow(), which allocates a new array some percentage larger than the existing one and copies the old contents into it. Large arrays (> 512M) get put directly into the tenured generation, so if too much space is reserved for the young generation, there may not be enough available in tenured.
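To make the reasoning concrete, here is a minimal sketch of that growth pattern. It only models the roughly-doubling strategy of the JDK's ByteArrayOutputStream; the class name, chunk size, and loop are made up for illustration and are not the real JDK source:

import java.util.Arrays;

// Illustrative only: a simplified model of how ByteArrayOutputStream
// expands its internal buffer while a large response is being written.
public class GrowSketch {
    public static void main(String[] args) {
        byte[] buf = new byte[32];
        long written = 0;
        int chunk = 8192;                       // pretend each write adds 8 KB
        while (written < 10L * 1024 * 1024) {   // simulate a ~10 MB render
            if (written + chunk > buf.length) {
                // Roughly double the capacity. The old and the new array are
                // both live during Arrays.copyOf, so a 10 MB render can briefly
                // need well over 10 MB of contiguous heap.
                int newCapacity = (int) Math.max((long) buf.length << 1, written + chunk);
                System.out.println("growing buffer to " + newCapacity + " bytes");
                buf = Arrays.copyOf(buf, newCapacity);
            }
            written += chunk;                   // the real class copies bytes here
        }
    }
}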
Another possibility is that your rendered template is far larger than you think it is. The most likely cause of that would be a loop, though, and I don't see one in the stack trace.
Finally, add the -XX:+HeapDumpOnOutOfMemoryError option when starting the JVM. If your output arrays are growing too large, you'll see that in the heap dump (use jhat to examine the dump).
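For example (the dump directory, file name, and PID below are illustrative; by default the dump is written as java_pid&lt;pid&gt;.hprof in the working directory):

JAVA_OPTS= -Xms4096M -Xmx13128M -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/dumps
jhat /var/dumps/java_pid12345.hprof

jhat then serves the dump over HTTP on port 7000 by default, where you can look for oversized byte[] instances reachable from a ByteArrayOutputStream.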