Is there any way to run multiple instances of a Spark Java server in the same JVM? I am using it in a "plugin" piece of software, and depending on external circumstances multiple instances of my plugin might be started, which then causes:
java.lang.IllegalStateException: This must be done before route mapping has begun
at spark.SparkBase.throwBeforeRouteMappingException(SparkBase.java:256)
at spark.SparkBase.port(SparkBase.java:101)
at com.foo.bar.a(SourceFile:59)
Looking at the code, it seems to be heavily built around static fields, so I am considering a classloader trick, or working with SparkServerFactory while somehow eliminating SparkBase.
From Spark 2.5 you can use ignite():
http://sparkjava.com/news.html#spark25released
Example:
import static spark.Service.ignite;

import spark.Service;

public class MultipleServers {

    public static void main(String[] args) {
        igniteFirstSpark();
        igniteSecondSpark();
    }

    static void igniteFirstSpark() {
        Service http = ignite()
                .port(8080)
                .threadPool(20);
        http.get("/configuredHello", (q, a) -> "Hello from port 8080!");
    }

    static void igniteSecondSpark() {
        Service http = ignite(); // uses the default port, 4567
        http.get("/basicHello", (q, a) -> "Hello from port 4567!");
    }
}
Personally, I initialize them like this:
import spark.Service;

public static void main(String[] args) {
    Service service1 = Service.ignite().port(8080).threadPool(20);
    Service service2 = Service.ignite().port(8081).threadPool(10);
}
I recommend reading up on how to use those services outside your main method, which I think would be a great fit here.
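As a sketch of that idea (the class and route names here are hypothetical, not part of the Spark API), each plugin instance can own its own Service and shut it down when the plugin is unloaded:

```java
import spark.Service;

// Hypothetical wrapper: each plugin instance owns one embedded Spark server.
public class PluginServer {

    private final Service http;

    public PluginServer(int port) {
        // Each ignite() call creates an independent server instance.
        http = Service.ignite().port(port).threadPool(10);
        http.get("/hello", (req, res) -> "Hello from port " + port + "!");
    }

    // Stops only this instance; other servers in the same JVM keep running.
    public void stop() {
        http.stop();
    }
}
```

With this shape, `new PluginServer(8080)` and `new PluginServer(8081)` can coexist in one JVM, and each plugin tears down its own server independently.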