I'm trying to upload image files. When uploading files of about 40 to 200 KB everything is fine. But when I try to upload a 459 KB file, sometimes it gets uploaded and sometimes an OutOfMemoryError is thrown. I don't think that file size is large enough for this error to be thrown. I'm using Resin. Is it possible there's a Resin setting that prevents uploading of large files? Nowhere in the code do I set a maximum size for file uploads.
Does anyone have any ideas how to avoid this? I want users to be able to upload images of any size up to about 10 MB.
If you are running out of memory, the first thing to check is how much memory your JVM has allocated. You don't say how Resin is running, which JVM it is using, or the OS/host, but typically a JVM is governed by a default heap size and a maximum heap size. On the standard JVM these are set from the command line with the -Xms and -Xmx parameters. When the JVM starts it allocates the default heap and allows it to grow to the maximum. When an allocation would exceed the maximum heap, the JVM may try to release soft/weak references and other disposable resources and force a GC, but if it still doesn't have enough space it throws an OutOfMemoryError. Changing these values therefore changes when that happens.
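As a quick sanity check, here is a minimal sketch (the class name HeapInfo is just illustrative) that prints the limits the running JVM was actually given; you can query Runtime the same way from a servlet inside Resin:

```java
// Prints the JVM's heap limits so you can confirm what the startup
// flags actually produced. Run with, for example:
//   java -Xms64m -Xmx512m HeapInfo
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("Max heap (-Xmx): " + rt.maxMemory() / mb + " MB");
        System.out.println("Current heap:    " + rt.totalMemory() / mb + " MB");
        System.out.println("Free in heap:    " + rt.freeMemory() / mb + " MB");
    }
}
```

If the max heap turns out to be small (some defaults are as low as 64 MB), a few concurrent uploads buffered in memory could easily exhaust it.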
The second thing to look at is how you allocate memory in the server. For example, if your server holds a pile of objects in memory with strong references you can run out of memory; if so, consider using soft or weak references instead, or something like Ehcache to move data out to disk. Your servlet could also be allocating MB instead of KB due to some kind of calculation error.
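As a rough illustration of the caching point, here is a minimal sketch (the ImageCache class is hypothetical, not part of any library) using SoftReference, which the GC is allowed to clear under memory pressure, so a full heap evicts cache entries instead of throwing OutOfMemoryError:

```java
import java.lang.ref.SoftReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical image cache backed by soft references.
public class ImageCache {
    private final Map<String, SoftReference<byte[]>> cache = new ConcurrentHashMap<>();

    public void put(String key, byte[] imageBytes) {
        cache.put(key, new SoftReference<>(imageBytes));
    }

    public byte[] get(String key) {
        SoftReference<byte[]> ref = cache.get(key);
        byte[] value = (ref == null) ? null : ref.get();
        if (value == null) {
            cache.remove(key); // the GC cleared it; drop the stale entry
        }
        return value;
    }
}
```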
You shouldn't be trying to read the entire upload into memory anyway. Assuming you are implementing a servlet, the supplied ServletRequest / HttpServletRequest gives you access to the input stream of the POST data, so you can read it in chunks (e.g. 32 KB at a time) and write each chunk out before reading the next. As a general rule of thumb, reading input data in discrete chunks is better practice anyway, since you don't want untrusted data to be able to break your server by POSTing some massive file at you.
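Here is a minimal sketch of that pattern, assuming a plain javax.servlet container such as Resin; the UploadServlet name, the 10 MB cap, and the temp-file destination are all illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class UploadServlet extends HttpServlet {
    // Illustrative cap: reject anything over 10 MB rather than buffering it.
    private static final long MAX_UPLOAD_BYTES = 10L * 1024 * 1024;
    private static final int CHUNK_SIZE = 32 * 1024; // 32 KB at a time

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        Path target = Files.createTempFile("upload-", ".img");
        long total = 0;
        try (InputStream in = req.getInputStream();
             OutputStream out = Files.newOutputStream(target)) {
            byte[] buf = new byte[CHUNK_SIZE];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;
                if (total > MAX_UPLOAD_BYTES) {
                    // Stop early instead of letting a huge POST exhaust memory or disk.
                    resp.sendError(HttpServletResponse.SC_REQUEST_ENTITY_TOO_LARGE);
                    Files.deleteIfExists(target);
                    return;
                }
                out.write(buf, 0, n); // never hold more than one chunk in memory
            }
        }
        resp.setStatus(HttpServletResponse.SC_OK);
    }
}
```

Note that a real browser upload arrives as multipart/form-data, which needs parsing; a library like Apache Commons FileUpload can stream the parts in the same chunked fashion rather than buffering them.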