apache-camel, osgi, apache-karaf

Apache Karaf datasource as a service versus in blueprint


I am a bit confused about working with Blueprint, Camel and Apache Karaf. When I was developing my route, I was using this to connect to my MSSQL database:

<bean id="dbcp" destroy-method="close" class="org.apache.commons.dbcp2.BasicDataSource">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
    <property name="url" value="jdbc:sqlserver://server\instance;databaseName=xxx;" />
    <property name="username" value="xxxx" />
    <property name="password" value="xxx" />
</bean>

This was working flawlessly, and then I wanted to deploy it to Apache Karaf. I did so and ran into a lot of trouble because the SQL Server driver was not found. So I tried to handle this another way, by exposing the DataSource as a service on Apache Karaf. This works, and I get hold of the reference like so in my blueprint:

<reference id="dbcp" interface="javax.sql.DataSource" filter="(osgi.jndi.service.name=Name)" availability="mandatory" />

Now this works, but I don't exactly know what it does behind the scenes. I've read about services and references, and it seems that to make my first example work, people usually register the DataSource as a service and then use it in the bean.

Is there a right and a wrong way? On top of that, I've read that we should use a connection pool, but I have only seen an example of this in the first approach (1st code sample). I guess it amounts to the same thing when the DataSource is exposed as a service, since I can call it from multiple bundles.

Thanks for reading, best regards


Solution

  • In Apache Karaf versions 4.2.x - 4.4.x it's generally good practice to use OSGi services to share DataSource-type objects. This makes your bundles more loosely coupled, and when connection parameters change you only need to change them in the service instead of having to reconfigure every bundle that uses that DataSource.

    You can also create your own shared resources and expose them as services using Blueprint, Declarative Services annotations or the "hard way" using an activator and the bundle context (a sketch of the activator approach is included at the end of this answer).

    I also recommend checking out the pax-jdbc-config and pax-jms-config features, as they allow you to create DataSource and ConnectionFactory type services from plain config files. They pick up config files whose names start with the org.ops4j.datasource and org.ops4j.connectionfactory prefixes respectively, e.g. org.ops4j.datasource-Example.cfg.
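
    For illustration, a pax-jdbc-config file for the MSSQL DataSource from the question could look roughly like the sketch below. The property keys are the usual pax-jdbc-config ones, but the exact driver setup and the pool option depend on which pax-jdbc features you have installed, and all connection values are placeholders taken from the question.

    # etc/org.ops4j.datasource-Example.cfg
    osgi.jdbc.driver.class = com.microsoft.sqlserver.jdbc.SQLServerDriver
    dataSourceName = Example
    osgi.jndi.service.name = Example
    # note the escaped backslash for the named instance in a properties file
    url = jdbc:sqlserver://server\\instance;databaseName=xxx;
    user = xxxx
    password = xxx
    # with the pax-jdbc-pool-dbcp2 feature installed this also wraps the DataSource in a connection pool
    pool = dbcp2

    The resulting DataSource is registered as an OSGi service, so a blueprint reference with a matching osgi.jndi.service.name filter, like the one in the question, can pick it up.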

    The only downside of using services is that they're specific to Karaf and OSGi, so if you ever need to move your integrations to a non-OSGi environment you'll have to figure out another way to inject data sources into them.

    [edit]

    By shared resources I mean resources you might want to access from multiple bundles. These can be anything from objects that contain shared data, connection objects for cloud blob storages, data access objects, Slack or Discord bots, services for sending mail, etc.

    You can publish beans as services in a Blueprint using the service tag. Below is an example from the OSGi R7 Specification:

    <blueprint>
       <service id="echoService" 
                interface="com.acme.Echo" ref="echo"/>
       <bean id="echo" class="com.acme.EchoImpl">
         <property name="message" value="Echo: "/>
       </bean>
    </blueprint> 
    
    public interface Echo {
      public String echo(String m);
    }

    public class EchoImpl implements Echo {
      String message;
      public void setMessage(String m) {
        this.message = m;
      }
      public String echo(String s) { return message + s; }
    } 
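
    Applied to the question, the same service tag could be used to publish the BasicDataSource from one bundle so that the reference shown in the question finds it. This is only a sketch; the osgi.jndi.service.name value simply mirrors the filter used in the question.

    <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
       <!-- the pooled DataSource from the question -->
       <bean id="dbcp" destroy-method="close" class="org.apache.commons.dbcp2.BasicDataSource">
         <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
         <property name="url" value="jdbc:sqlserver://server\instance;databaseName=xxx;"/>
         <property name="username" value="xxxx"/>
         <property name="password" value="xxx"/>
       </bean>
       <!-- publish it as a javax.sql.DataSource service other bundles can look up -->
       <service ref="dbcp" interface="javax.sql.DataSource">
         <service-properties>
           <entry key="osgi.jndi.service.name" value="Name"/>
         </service-properties>
       </service>
    </blueprint>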
    

    With OSGi Declarative Services (DS) / Service Component Runtime (SCR) annotations you can publish new services from Java:

    import org.osgi.service.component.annotations.Component;

    // Registers EchoImpl as an OSGi service under the Echo interface it implements
    @Component
    public class EchoImpl implements Echo {
      String message;
      public void setMessage(String m) {
        this.message = m;
      }
      public String echo(String s) { return message + s; }
    } 
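
    For completeness, here is what the "hard way" mentioned at the start (registering the service manually with the bundle context from a bundle activator) could roughly look like. This is only a sketch; the class name and the service property value are illustrative and simply mirror the filter from the question.

    import java.util.Hashtable;

    import javax.sql.DataSource;

    import org.apache.commons.dbcp2.BasicDataSource;
    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceRegistration;

    public class DataSourceActivator implements BundleActivator {
      private ServiceRegistration<DataSource> registration;
      private BasicDataSource dataSource;

      public void start(BundleContext context) {
        // build the pooled DataSource (same settings as the blueprint bean in the question)
        dataSource = new BasicDataSource();
        dataSource.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        dataSource.setUrl("jdbc:sqlserver://server\\instance;databaseName=xxx;");
        dataSource.setUsername("xxxx");
        dataSource.setPassword("xxx");

        // register it in the service registry; the property matches the <reference> filter
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("osgi.jndi.service.name", "Name");
        registration = context.registerService(DataSource.class, dataSource, props);
      }

      public void stop(BundleContext context) throws Exception {
        if (registration != null) {
          registration.unregister();
        }
        if (dataSource != null) {
          dataSource.close();
        }
      }
    }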
    

    Another example can be found in the official Karaf examples on GitHub.