Tags: java, memory-mapped-files, mappedbytebuffer

MappedByteBuffer - BufferOverflowException


I am using MappedByteBuffer to write records to a file. Below is my code. It works fine for 10 million numberOfRows, but if I increase numberOfRows to 100 million it throws BufferOverflowException.

public static void writeOneFile() throws IOException{
     File file = File.createTempFile("outputfile", ".txt", new File("C:\\Data\\Output"));
     //f.delete();
     RandomAccessFile fileAccess = new RandomAccessFile(file, "rw");
     FileChannel fileChannel = fileAccess.getChannel();

     long bufferSize = (long) Math.pow(10240, 2); // ~104 MB; also tried 30720^2, 1024^2, and Integer.MAX_VALUE
     MappedByteBuffer mappedBuffer = fileChannel.map( FileChannel.MapMode.READ_WRITE, 0, bufferSize );

     long startPosMappedBuffer = 0;
     long million = 1000000; 
     long numberOfRows = million * 100; // 100 million; also tried 1000 and 200 million

     long startTime = System.currentTimeMillis();

     long counter = 1;
     //byte[] messageBytes = (counter+"").getBytes(Charset.forName("UTF-8"));
     //long bufferSize = (counter + "\n").getBytes(Charset.forName("UTF-8")).length * 1000;
     while(true)
     {         
         if( !mappedBuffer.hasRemaining() )
         {
             startPosMappedBuffer += mappedBuffer.position();
             mappedBuffer = fileChannel.map( FileChannel.MapMode.READ_WRITE, startPosMappedBuffer, bufferSize );
         }
         mappedBuffer.put( (counter + System.lineSeparator()).getBytes(Charset.forName("UTF-8")) );

         counter++;
         if( counter > numberOfRows )
             break; 
     }
     fileAccess.close();
     long endTime = System.currentTimeMillis();
     long actualTimeTaken = endTime - startTime;
     System.out.println( String.format("No Of Rows %s , Time(sec) %s ", numberOfRows, actualTimeTaken / 1000f) ) ;  
 }

Any hints on what the issue is?

Edit 1: The exception issue is resolved and answered below.

Edit 2: Regarding the best option for performance.

@EJP: Here is the code using a DataOutputStream wrapped around a BufferedOutputStream.

static void writeFileDataBuffered() throws IOException{
        File file = File.createTempFile("dbf", ".txt", new File("C:\\Output"));
        DataOutputStream out = new DataOutputStream(new BufferedOutputStream(new FileOutputStream( file )));
        long counter = 1;
        long million = 1000000;
        long numberOfRows = million * 100;
        long startTime = System.currentTimeMillis();
        while(true){
            out.writeBytes( counter + System.lineSeparator() );
            counter++;
            if ( counter > numberOfRows )
                break;
        }
        out.close();
        long endTime = System.currentTimeMillis();
        System.out.println("Number of Rows: "+ numberOfRows + ", Time(sec): " + (endTime - startTime)/1000f);
    }

.......... Thanks


Solution

  • After some background work, I found the root cause: the bufferSize I declared was less than the length of the content I am writing.

    The number of bytes required for 100 million records is 988888898 (with a two-byte line separator), while the bufferSize given by (long) Math.pow(10240, 2) is 104857600, a shortfall of 884031298 bytes. This was causing the issue, as the exception indicates.
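    For reference, that figure can be reproduced with a small digit-count calculation. This is a sketch; the method name requiredBytes and the two-byte "\r\n" separator length are my own illustrative choices:

    ```java
    public class SizeCalc {
        // Total bytes needed to write records 1..n, each as its decimal
        // digits followed by a separator of sepLen bytes.
        static long requiredBytes(long n, int sepLen) {
            long total = 0;
            int digits = 1;
            for (long lo = 1; lo <= n; lo *= 10, digits++) {
                long hi = Math.min(n, lo * 10 - 1);       // last value with this digit count
                total += (hi - lo + 1) * (digits + sepLen);
            }
            return total;
        }

        public static void main(String[] args) {
            // 100 million records with a 2-byte separator ("\r\n" on Windows)
            System.out.println(requiredBytes(100_000_000L, 2));
        }
    }
    ```

    Running this prints 988888898, matching the shortfall arithmetic above.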

    Alternatively, bufferSize can simply be set to Integer.MAX_VALUE instead of calculating the exact content size. Although this inflates the file size, it had no measurable impact on performance in my trial runs.
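    For completeness, the remapping approach can also avoid the exception without pre-sizing the buffer: ByteBuffer.put(byte[]) throws BufferOverflowException whenever fewer than length bytes remain, so the guard should compare remaining() against the record length rather than calling hasRemaining(). A sketch, with illustrative class and method names:

    ```java
    import java.io.File;
    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.charset.StandardCharsets;

    public class MappedWriteSketch {
        // Writes records 1..numberOfRows, remapping whenever the NEXT record
        // would not fit in the current mapping. hasRemaining() alone is not a
        // sufficient check, because put(byte[]) needs remaining() >= rec.length.
        static void write(File file, long numberOfRows, long bufferSize) throws IOException {
            try (RandomAccessFile raf = new RandomAccessFile(file, "rw");
                 FileChannel ch = raf.getChannel()) {
                long mapStart = 0;
                MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, mapStart, bufferSize);
                for (long counter = 1; counter <= numberOfRows; counter++) {
                    byte[] rec = (counter + System.lineSeparator()).getBytes(StandardCharsets.UTF_8);
                    if (buf.remaining() < rec.length) {   // would overflow: remap further along
                        mapStart += buf.position();
                        buf = ch.map(FileChannel.MapMode.READ_WRITE, mapStart, bufferSize);
                    }
                    buf.put(rec);
                }
                // Note: mapping extends the file to mapStart + bufferSize, so
                // trailing zero bytes remain after the last record (the same
                // file-size growth noted above).
            }
        }
    }
    ```

    With this guard, even a deliberately tiny bufferSize just triggers more remaps instead of throwing.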

    .........

    Thanks