Similar questions have been asked, but don't quite address what I'm trying to do. We have an older Seam 2.x-based application with a batch job framework that we are converting to CDI. The job framework uses the Seam Contexts object to initiate a conversation. The job framework also loads a job-specific data holder (basically a Map) that can then be accessed, via the Seam Contexts object, by any service down the chain, including from SLSBs. Some of these services can update the Map, so that job state can change and be detected from record to record.
It looks like in CDI, the job will @Inject a CDI Conversation object and manually begin/end the conversation. We would also define a new ConversationScoped bean that holds the Map (MapBean). What's not clear to me are two things:
First, the job needs to also @Inject the MapBean so that it can be loaded with job-specific data before the Conversation.begin() method is called. Would the container know to pass this instance to services down the call chain?
Related to that, according to this question Is it possible to @Inject a @RequestScoped bean into a @Stateless EJB? it should be possible to inject a ConversationScoped bean into a SLSB, but it seems a bit magical. If the SLSB is used by a different process (job, UI call, etc.), does it get a separate instance for each call?
Edits for clarification and a simplified class structure:
MapBean would need to be a ConversationScoped object, containing data for a specific instance/run of a job.
@ConversationScoped
public class MapBean implements Serializable {
    private Map<String, Object> data;

    // accessors
    public Object getData(String key) {
        return data.get(key);
    }

    public void setData(String key, Object value) {
        data.put(key, value);
    }
}
The job would be ConversationScoped:
@ConversationScoped
public class BatchJob {
    @Inject private MapBean mapBean;
    @Inject private Conversation conversation;
    @Inject private JobProcessingBean jobProcessingBean;

    public void runJob() {
        try {
            conversation.begin();
            mapBean.setData("key", "value"); // is this MapBean instance now bound to the conversation?
            jobProcessingBean.doWork();
        } catch (Exception e) {
            // catch something
        } finally {
            conversation.end();
        }
    }
}
The job might call a SLSB, and the current conversation-scoped instance of MapBean needs to be available:
@Stateless
public class JobProcessingBean {
    @Inject private MapBean mapBean;

    public void doWork() {
        // when this is called, is "mapBean" the current conversation instance?
        Object value = mapBean.getData("key");
    }
}
Our job and SLSB framework is quite complex: the SLSB can call numerous other services or locally instantiated business-logic classes, and each of these would need access to the conversation-scoped MapBean.
First, the job needs to also @Inject the MapBean so that it can be loaded with job-specific data before the Conversation.begin() method is called. Would the container know to pass this instance to services down the call chain?
Yes. Since MapBean is @ConversationScoped, it is tied to the call chain for the duration of the conversation, from conversation.begin() until conversation.end(). You can think of @ConversationScoped (and @RequestScoped and @SessionScoped) beans as behaving like ThreadLocal instances: an instance exists per active context, and each one is only visible from within its own context, much as each ThreadLocal value is only visible from its own thread.
Related to that, according to this question Is it possible to @Inject a @RequestScoped bean into a @Stateless EJB? it should be possible to inject a @ConversationScoped bean into a SLSB, but it seems a bit magical. If the SLSB is used by a different process (job, UI call, etc.), does it get a separate instance for each call?
It's not as magical as you think once you see that this is the same pattern I explained above. The SLSB indeed gets a separate instance, but not just any instance: it gets the one belonging to the scope from which the SLSB was called.
In addition to the link you posted, see also this answer.
I've tested code similar to what you posted and it works as expected: the same MapBean instance is injected throughout the call. Just be careful with two things:
- BatchJob is also @ConversationScoped but does not implement Serializable, which will not allow the bean to passivate.
- data is never initialized, so you will get an NPE in runJob().
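Both caveats are small fixes in MapBean (and BatchJob would need the same implements Serializable change). Here's a minimal sketch; the @ConversationScoped annotation is omitted so the snippet compiles without the CDI API on the classpath, but it would be present in the real bean:

```java
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// MapBean with the two caveats addressed: the class is Serializable (a
// passivation-capable conversation-scoped bean requires it) and the map
// is eagerly initialized, avoiding the NPE in runJob().
public class MapBean implements Serializable {
    private static final long serialVersionUID = 1L;

    private final Map<String, Object> data = new HashMap<>();

    public Object getData(String key) {
        return data.get(key);
    }

    public void setData(String key, Object value) {
        data.put(key, value);
    }
}
```

Note that HashMap is itself Serializable, so passivation works as long as the values you store in it are Serializable too.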