Tags: spring, spring-batch, itemwriter, itemprocessor

Pass filenames dynamically to FlatFileItemWriter through StepBuilderFactory stream() when using ClassifierCompositeItemProcessor in Spring Batch


I'm processing multiple input files with multi-format lines using a ClassifierCompositeItemProcessor. But when registering the writers via the step builder's stream() to write the files, I'm unable to pass the Resource filename dynamically. The filename should be the respective input file name. Any help would be much appreciated.

Input File 1 (data-111111-12323.txt)

1#9999999#00001#2#RecordType1
2#00002#June#Statement#2020#9#RecordType2
3#7777777#RecordType3

Input File 2 (data-22222-23244.txt)

1#435435#00002#2#RecordType1
2#345435#July#Statement#2021#9#RecordType2
3#645456#RecordType3

Expected output file 1 (data-111111-12323.txt)

1#9999999#00001#2#RecordType1#mobilenumber1
2#00002#June#Statement#2020#9#RecordType2#mobilenumber2
3#7777777#RecordType3#mobilenumber3

Expected output file 2 (data-22222-23244.txt)

1#435435#00002#2#RecordType1#mobilenumber1
2#345435#July#Statement#2021#9#RecordType2#mobilenumber2
3#645456#RecordType3#mobilenumber3

Step

    public Step partitionStep() throws Exception {
        ItemReader reader = context.getBean(FlatFileItemReader.class);
        ClassifierCompositeItemWriter writer = context.getBean(ClassifierCompositeItemWriter.class);
        return stepBuilderFactory.get("statementProcessingStep.slave")
                .<RecordType, RecordType>chunk(12)
                .reader(reader)
                .processor(processor())
                .writer(writer)
                // Register the delegate writers as streams so they are opened and closed properly
                .stream(recordType0FlatFileItemWriter())
                .stream(recordType1FlatFileItemWriter())
                .build();
    }
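
The step name statementProcessingStep.slave suggests this step runs as a partition worker, so something on the master side has to put each input file name into the step execution context. A minimal sketch, assuming a MultiResourcePartitioner; the input/data-*.txt pattern and the bean name are illustrative, not from the original post:

    @Bean
    public Partitioner partitioner() throws IOException {
        // One partition per input file; MultiResourcePartitioner stores each
        // resource's URL in the step execution context under the given key.
        MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
        Resource[] inputs = new PathMatchingResourcePatternResolver()
                .getResources("file:input/data-*.txt"); // hypothetical location
        partitioner.setResources(inputs);
        partitioner.setKeyName("fileName"); // key referenced by late binding below
        return partitioner;
    }

With this in place, step-scoped readers and writers in the worker step can resolve the current file name from the step execution context, as shown in the solution below.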

Processor

    @Bean
    @StepScope
    public ItemProcessor processor() {
        // Route each record to the delegate processor registered for its concrete type
        ClassifierCompositeItemProcessor<RecordType, RecordType> processor = new ClassifierCompositeItemProcessor<>();
        SubclassClassifier classifier = new SubclassClassifier();
        Map typeMap = new HashMap();
        typeMap.put(RecordType0.class, recordType0Processor);
        typeMap.put(RecordType1.class, recordType1Processor);
        classifier.setTypeMap(typeMap);
        processor.setClassifier(classifier);
        return processor;
    }

Writer

    @Bean
    public FlatFileItemWriter<RecordType1> recordType1FlatFileItemWriter() throws Exception {
        FlatFileItemWriter<RecordType1> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource("record1.txt")); // This filename should be dynamic
        writer.setAppendAllowed(true);
        writer.setLineAggregator(new DelimitedLineAggregator<RecordType1>() {{
            setDelimiter("#");
            setFieldExtractor(new BeanWrapperFieldExtractor<RecordType1>() {{
                setNames(new String[] { "RecordType", "ID1", "ID2", "ID3" });
            }});
        }});
        return writer;
    }

Solution

  • You can make your item reader/writer step-scoped and inject values from job parameters or the step/job execution context using late binding. For example:

    @StepScope
    @Bean
    public FlatFileItemReader<Foo> flatFileItemReader(@Value("#{jobParameters['input.file.name']}") String name) {
        return new FlatFileItemReaderBuilder<Foo>()
                .name("flatFileItemReader")
                .resource(new FileSystemResource(name))
                .build();
    }
    

    You can find more details in the Late Binding of Job and Step Attributes section of the reference documentation.
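
Applied to the writer from the question, the same late binding can pull the input file name from the step execution context (populated, for example, by a MultiResourcePartitioner as sketched above). A rough sketch; the output/ directory is a placeholder, and Spring's StringUtils.getFilename is used to strip the path prefix from the stored value:

    @Bean
    @StepScope
    public FlatFileItemWriter<RecordType1> recordType1FlatFileItemWriter(
            @Value("#{stepExecutionContext['fileName']}") String inputFile) {
        FlatFileItemWriter<RecordType1> writer = new FlatFileItemWriter<>();
        // Name the output file after the input file currently being processed
        writer.setResource(new FileSystemResource("output/" + StringUtils.getFilename(inputFile)));
        writer.setAppendAllowed(true);
        writer.setLineAggregator(new DelimitedLineAggregator<RecordType1>() {{
            setDelimiter("#");
            setFieldExtractor(new BeanWrapperFieldExtractor<RecordType1>() {{
                setNames(new String[] { "RecordType", "ID1", "ID2", "ID3" });
            }});
        }});
        return writer;
    }

Since the writer is now step-scoped, registering it with .stream(...) on the step builder, as in the question, still ensures its open/update/close callbacks run within each partition.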