Tag Archive for spring, spring-batch

org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to – in Spring Batch multithreading?

In my project I’ve used a FlatFileItemReader with a Classifier and a ClassifierCompositeItemWriter. We read a tab-delimited file, perform validation, enrichment, and similar processing, and write the output as CSV, producing four output files according to the logic in the Classifier implementation. So I am reading one flat file and also creating other flat files.
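The routing idea described above can be sketched in plain Java, without the Spring Batch classes themselves: a classifier function picks a destination writer per record, and each record is converted from tab-delimited to CSV on the way. The class and method names below are hypothetical stand-ins, and in-memory lists stand in for the four output files.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Plain-Java sketch of the classifier idea behind ClassifierCompositeItemWriter:
// a classifier maps each record to one of several writers (here, in-memory lists
// standing in for the CSV output files). All names are hypothetical.
public class ClassifierRoutingSketch {

    // Route each tab-delimited record to a bucket keyed by its first field,
    // converting it to CSV on the way (the "enrichment" step).
    static Map<String, List<String>> route(List<String> tabDelimitedRecords) {
        Map<String, List<String>> outputs = new HashMap<>();
        Function<String, String> classifier = record -> record.split("\t")[0];
        for (String record : tabDelimitedRecords) {
            String key = classifier.apply(record);
            String csv = record.replace('\t', ',');
            outputs.computeIfAbsent(key, k -> new ArrayList<>()).add(csv);
        }
        return outputs;
    }

    public static void main(String[] args) {
        Map<String, List<String>> outputs =
                route(List.of("A\tvalue1", "B\tvalue2", "A\tvalue3"));
        System.out.println(outputs.get("A")); // records routed to writer "A"
    }
}
```

In the real job the lists would be replaced by FlatFileItemWriter delegates registered with the ClassifierCompositeItemWriter.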

JdbcPagingItemReader starts reading from the beginning after successful job completion even though its state is being saved

After the job completes successfully for, say, the first 10 chunks/pages, I insert more rows and run the batch again, expecting it to start from the next chunk. Instead it starts again from chunk 0.
Yet when the job is aborted by manually updating the job and step execution contexts’ status to FAILED in the database (as suggested in other answers here), it does restart at the last successful record and resumes from the last chunk as expected.
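The behaviour described above matches how Spring Batch treats restart state: the reader's position is persisted in the step execution context, but that context is only reused when the previous execution failed or was stopped; a COMPLETED job instance cannot be restarted, and a new run with different parameters is a new instance with an empty context. A minimal illustrative sketch (the names here are illustrative, not the framework's internals):

```java
import java.util.Map;

// Sketch of why a COMPLETED job instance re-reads from the start: the saved
// reader position is only consulted on restart, i.e. when the last execution
// FAILED (or was STOPPED). A COMPLETED run, or a fresh run with new job
// parameters, starts with an empty execution context. Names are illustrative.
public class RestartStateSketch {

    static int startingPage(String lastStatus, Map<String, Integer> savedContext) {
        if ("FAILED".equals(lastStatus)) {
            // Restart: resume from the persisted reader position.
            return savedContext.getOrDefault("reader.page", 0);
        }
        // COMPLETED, or a brand-new job instance: fresh context, page 0.
        return 0;
    }

    public static void main(String[] args) {
        Map<String, Integer> ctx = Map.of("reader.page", 10);
        System.out.println(startingPage("COMPLETED", ctx)); // 0
        System.out.println(startingPage("FAILED", ctx));    // 10
    }
}
```

This is why the manual FAILED-status workaround resumes correctly while a normally completed run does not.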

How to count the number of files created by composite Item Writer using Spring Batch?

I am using Spring Batch in my project, with a ClassifierCompositeItemWriter to classify records and a MultiResourceItemWriterBuilder to create multiple files. More than one file is being created, but I’d like to understand whether there is any way to count the number of files created by each writer within the ClassifierCompositeItemWriter. How can we do that?
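One plausible approach is to wrap each delegate in a thin counter that increments whenever a new resource (file) would be opened, mimicking the roll-over that MultiResourceItemWriter performs via its itemCountLimitPerResource setting. The sketch below uses hypothetical class names, not Spring Batch API:

```java
// Hedged sketch of counting files per writer: a counter increments each time a
// new resource (file) would be opened, i.e. every itemCountLimitPerResource
// items. CountingWriter is a hypothetical stand-in, not a Spring Batch class.
public class FileCountingSketch {

    static class CountingWriter {
        private final int itemCountLimitPerResource;
        private long itemsWritten = 0;
        int filesCreated = 0;

        CountingWriter(int itemCountLimitPerResource) {
            this.itemCountLimitPerResource = itemCountLimitPerResource;
        }

        void write(String item) {
            if (itemsWritten % itemCountLimitPerResource == 0) {
                filesCreated++; // a new output file would be opened here
            }
            itemsWritten++;
        }
    }

    public static void main(String[] args) {
        CountingWriter writer = new CountingWriter(2); // roll over every 2 items
        for (String item : new String[] {"a", "b", "c", "d", "e"}) {
            writer.write(item);
        }
        System.out.println(writer.filesCreated); // 3 files for 5 items
    }
}
```

In a real job, one such counter per classifier key would give the per-writer file count; alternatively, counting the files on disk per output directory after the step completes achieves the same end.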

Spring Batch how to create fixed size file when reading multiple flat files?

I’d like to further extend “Reading data from multiple csv files and writing it into one csv file using Spring Batch”. In my case I have to read multiple files (all with the same filename) from different sub-folders, e.g. ../main-folder/year/Jan, ../main-folder/year/Feb, and so on for all years, and merge them into one file. While creating that single file, it must not exceed 3 GB or 3 million records (whichever comes first); if the size or record count would be exceeded, one more file should be created, and so on.
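The roll-over rule above can be sketched independently of Spring Batch. Note that MultiResourceItemWriter supports an item-count limit (itemCountLimitPerResource) but, to my knowledge, not a byte-size limit out of the box, so the combined "3 GB or 3 million records" check below is a custom, hypothetical one with illustrative names:

```java
// Sketch of "roll over at N bytes or M records, whichever comes first".
// RollingWriter is a hypothetical stand-in for a custom resource-rolling writer.
public class RollingFileSketch {

    static class RollingWriter {
        private final long maxBytes;
        private final long maxRecords;
        private long currentBytes = 0;
        private long currentRecords = 0;
        int filesOpened = 0;

        RollingWriter(long maxBytes, long maxRecords) {
            this.maxBytes = maxBytes;
            this.maxRecords = maxRecords;
        }

        // Returns the 1-based index of the file this record lands in.
        int write(String record) {
            long size = record.length() + 1; // +1 for the trailing newline
            boolean rollOver = filesOpened == 0
                    || currentRecords + 1 > maxRecords
                    || currentBytes + size > maxBytes;
            if (rollOver) {
                filesOpened++;      // a new output file would be opened here
                currentBytes = 0;
                currentRecords = 0;
            }
            currentBytes += size;
            currentRecords++;
            return filesOpened;
        }
    }

    public static void main(String[] args) {
        // Tiny limits for the demo; the real job would use 3 GB / 3 million.
        RollingWriter writer = new RollingWriter(1024, 3);
        for (int i = 0; i < 7; i++) {
            writer.write("record-" + i);
        }
        System.out.println(writer.filesOpened); // 3 files for 7 records
    }
}
```

In a real step this check would sit inside a custom ItemWriter that closes the current FlatFileItemWriter and opens the next resource whenever rollOver is true.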