java · batch-processing · websphere-liberty · jsr352 · java-batch

WebSphere Liberty - Batch reader/writer/etc. not showing updated values when job is re-run with different job parameter values


When I run a job multiple times on the Liberty server, it always uses the parameter values from my first job submission, even though I pass different values. So I can't run the job multiple times with different parameter values. Why? And what happens if I run several instances of the same job in parallel with different parameters?

My JSL looks like:

<job id="VerbrauchsfolgeExecutor" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">
  <properties>
    <property name="stichtag" value="#{jobParameters['stichtag']}" />
    <property name="filename" value="#{jobParameters['filename']}" />
    <property name="filetype" value="#{jobParameters['filetype']}" />
    <property name="groupSize" value="#{jobParameters['groupSize']}" />
    <property name="db2Umgebung" value="#{jobParameters['db2Umgebung']}" />
    <property name="loglevel" value="#{jobParameters['loglevel']}" />
  </properties>
  <step id="STEP1">
    <chunk item-count="100">
      <reader ref="VerbrauchsfolgeReader"></reader>
      <processor ref="VerbrauchsfolgeProcessor"></processor>
      <writer ref="VerbrauchsfolgeWriter"></writer>
    </chunk>
  </step>
</job>

Reading the properties:

...
@Inject
JobContext context;
...
Properties prop = context.getProperties();
String loglevel = prop.getProperty("loglevel");
...

Job call:

...
.\batchManager submit --jobXMLName=VerbrauchsfolgeExecutor --applicationName=zos-verbrauchsfolge-1.0.0 --user=bob --trustSslCertificates --batchManager=localhost:9082 --jobParameter=stichtag=25.03.2022 --jobParameter=filename=dataset.out --jobParameter=filetype=RDW --jobParameter=groupSize=10 --jobParameter=db2Umgebung=E11 --jobParameter=loglevel=INFO
...

This is not the behavior I expect when running a job multiple times in a row with different parameter values. Can someone help me understand what is going on?


Solution

  • ANSWER

    Don't use the @ApplicationScoped annotation on a batch artifact that needs per-job-instance data (if you are going to run more than one job per application startup). An alternative is to give the batch artifact @Dependent scope and move any per-application-lifecycle state into a separate @ApplicationScoped bean, which can then be injected into the batch artifact.
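    A rough sketch of that pattern, based on the reader from the question (the SharedResources bean and its contents are illustrative assumptions, not something from the original code; this is container-managed code, so it only runs inside a batch runtime):

    ```java
    import java.io.Serializable;
    import java.util.Properties;
    import javax.batch.api.chunk.AbstractItemReader;
    import javax.batch.runtime.context.JobContext;
    import javax.enterprise.context.ApplicationScoped;
    import javax.enterprise.context.Dependent;
    import javax.inject.Inject;
    import javax.inject.Named;

    // Truly application-wide state (e.g. caches, pooled resources)
    // lives in its own bean with application lifetime.
    @ApplicationScoped
    class SharedResources {
        // per-application-lifecycle state goes here
    }

    // The artifact itself is @Dependent: a fresh instance is created
    // for every job submission, so it sees that run's jobParameters.
    @Named("VerbrauchsfolgeReader")
    @Dependent
    public class VerbrauchsfolgeReader extends AbstractItemReader {

        @Inject
        JobContext context;

        @Inject
        SharedResources shared; // long-lived state, injected per job

        private String loglevel;

        @Override
        public void open(Serializable checkpoint) {
            Properties prop = context.getProperties();
            this.loglevel = prop.getProperty("loglevel"); // current run's value
        }

        @Override
        public Object readItem() {
            return null; // real reading logic elided
        }
    }
    ```
    
    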

    Explanation

    Batch artifacts can be loaded as CDI beans, and the instances are scoped accordingly: for a batch artifact annotated with @ApplicationScoped, the container loads only a single instance of that type per application start, so every job submission after the first reuses the instance (and thus the resolved property values) created for the first run. This is part of how Jakarta Batch and CDI integrate together within the Jakarta Platform.
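
    To see why this produces stale parameters without needing a container, here is a plain-Java sketch (no CDI involved; the class and method names are made up for illustration) that simulates the two lifetimes. An application-scoped artifact is instantiated once and keeps the first submission's parameters; a dependent-scoped artifact is re-created per submission:

    ```java
    import java.util.Properties;

    public class ScopeDemo {

        // Stand-in for a batch artifact that captures its job parameters
        // at instantiation time, like a CDI-managed reader would.
        static class Reader {
            final String stichtag;
            Reader(Properties jobParams) {
                this.stichtag = jobParams.getProperty("stichtag");
            }
        }

        // "@ApplicationScoped" simulation: the instance from run 1 is
        // reused, so run 2 still sees run 1's parameter value.
        static String applicationScopedRun2(Properties run1, Properties run2) {
            Reader singleton = new Reader(run1); // created once, at first use
            return singleton.stichtag;           // run 2 reuses the instance
        }

        // "@Dependent" simulation: a fresh instance per submission.
        static String dependentRun2(Properties run1, Properties run2) {
            new Reader(run1);                // run 1's instance, then discarded
            Reader fresh = new Reader(run2); // run 2 gets its own instance
            return fresh.stichtag;
        }

        public static void main(String[] args) {
            Properties run1 = new Properties();
            run1.setProperty("stichtag", "25.03.2022");
            Properties run2 = new Properties();
            run2.setProperty("stichtag", "26.03.2022");

            System.out.println("application-scoped, run 2 sees: "
                    + applicationScopedRun2(run1, run2));
            System.out.println("dependent-scoped, run 2 sees: "
                    + dependentRun2(run1, run2));
        }
    }
    ```

    The same reasoning applies to parallel submissions: a single shared instance would also be mutated concurrently by all running jobs, while @Dependent gives each job execution its own isolated artifact instance.
    
    
    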