Background
I have a Spring Batch job where:

- FlatFileItemReader - reads one row at a time from the file.
- ItemProcessor - transforms the row from the file into a List<MyObject> and returns the List. That is, each row in the file is broken down into a List<MyObject> (1 row in the file is transformed into many output rows).
- ItemWriter - writes the List<MyObject> to a database table. (I used this implementation to unpack the list received from the processor and delegate to a JdbcBatchItemWriter, roughly as sketched below.)
Question
Suppose a single row in the file translates into a List of 100000 MyObject instances. The JdbcBatchItemWriter will end up writing the entire List of 100000 objects to the database. My question is: the JdbcBatchItemWriter does not allow a custom batch size; for all practical purposes, the batch-size = commit-interval for the step. With this in mind, is there another implementation of an ItemWriter available in Spring Batch that writes to the database and allows a configurable batch size? If not, how do I go about writing a custom writer myself to achieve this?
I see no obvious way to set the batch size on the JdbcBatchItemWriter. However, you can extend the writer and use a custom BatchPreparedStatementSetter to specify the batch size. Here is a quick example:
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;

public class MyCustomWriter<T> extends JdbcBatchItemWriter<T> {

    @Override
    public void write(List<? extends T> items) throws Exception {
        // Bypass the parent's write() and issue the batch update directly,
        // so the batch size is whatever getBatchSize() returns.
        namedParameterJdbcTemplate.getJdbcOperations().batchUpdate("your sql", new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                // Bind the placeholders in "your sql" from items.get(i)
            }

            @Override
            public int getBatchSize() {
                return items.size(); // or any other value you want
            }
        });
    }
}
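One thing to keep in mind: the overridden write() still uses the namedParameterJdbcTemplate created by the parent class, so the writer has to be configured like any other JdbcBatchItemWriter. A minimal wiring sketch; the SQL statement and bean name are placeholders:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;

public class WriterConfig {

    @Bean
    public MyCustomWriter<MyObject> customWriter(DataSource dataSource) {
        MyCustomWriter<MyObject> writer = new MyCustomWriter<>();
        // Creates the namedParameterJdbcTemplate used in write()
        writer.setDataSource(dataSource);
        // The parent's afterPropertiesSet() still requires an SQL statement,
        // even though the overridden write() supplies its own
        writer.setSql("INSERT INTO my_table (col_a) VALUES (:colA)");
        return writer;
    }
}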
The StagingItemWriter in the Spring Batch samples is another example of how to use a custom BatchPreparedStatementSetter.