java, spring-boot, apache-spark, java-8, spark-ui

How to fix SparkUI Executors java.io.FileNotFoundException


I've deployed a Spring Boot server with Apache Spark and everything runs stably. But the SparkUI executors endpoint at http://X.X.X.X:4040/executors/ throws a java.io.FileNotFoundException: it cannot find /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar. I checked, and the inner jar is present in the fat JAR. The problem only happens on Linux; on Windows it works normally.
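A minimal sketch of how one might confirm that the spark-core jar really is an entry inside the fat JAR (the path /opt/x/x.jar is the one from the question and is only an example):

    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Minimal sketch: list the nested spark-core entry inside the Spring Boot fat JAR.
    // The path below is just the example path from the question; adjust it to your deployment.
    public class NestedJarCheck {
        public static void main(String[] args) throws Exception {
            try (JarFile fatJar = new JarFile("/opt/x/x.jar")) {
                fatJar.stream()
                      .map(JarEntry::getName)
                      .filter(name -> name.contains("spark-core"))
                      .forEach(System.out::println); // e.g. BOOT-INF/lib/spark-core_2.11-2.2.0.jar
            }
        }
    }

The entry is listed, yet the executors page still fails, because Jersey's scanner tries to open the jar-within-jar path as a plain file: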

2019-04-23 07:01:24,038 WARN [org.spark_project.jetty.servlet.ServletHandler] [SparkUI-36] - 
org.spark_project.jetty.servlet.ServletHolder$1: org.glassfish.jersey.server.internal.scanning.ResourceFinderException: java.io.FileNotFoundException: /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar (No such file or directory)
        at org.spark_project.jetty.servlet.ServletHolder.makeUnavailable(ServletHolder.java:594)
        at org.spark_project.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:658)
        at org.spark_project.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:496)
        at org.spark_project.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:788)
        at org.spark_project.jetty.servlet.ServletHolder.prepare(ServletHolder.java:773)
        at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:578)
        at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
        at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
        at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
        at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:461)
        at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
        at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
        at org.spark_project.jetty.server.Server.handle(Server.java:524)
        at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:319)
        at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)
        at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
        at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
        at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
        at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.glassfish.jersey.server.internal.scanning.ResourceFinderException: java.io.FileNotFoundException: /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar (No such file or directory)
        at org.glassfish.jersey.server.internal.scanning.JarZipSchemeResourceFinderFactory.create(JarZipSchemeResourceFinderFactory.java:90)

Solution

  • The cause of the problem is a limitation in Jersey: it cannot scan resources inside nested JAR files. You need to configure the Spring Boot Maven plugin to unpack the JAR containing the spark-core resources; the launcher then extracts it to a temporary directory at startup, so Jersey can read it as a regular file (a small sanity check for this is sketched at the end of this answer). In my case (I'm using spark-core_2.12), the solution was to add the following section to the pom.xml file:

    <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
        <configuration>
            <requiresUnpack>
                <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-core_2.12</artifactId>
                </dependency>
            </requiresUnpack>
        </configuration>
    </plugin>
    

    Some sources: here and here
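As a quick sanity check after rebuilding with requiresUnpack (a hedged sketch, not part of the original fix): with the marked JAR unpacked by the Spring Boot launcher at startup, asking the JVM where a spark-core class was loaded from should report a plain file path instead of a nested jar:...!/BOOT-INF/lib/... URL. Something like the following, called from inside the running application:

    import java.net.URL;
    import org.apache.spark.SparkConf;

    // Hedged sketch: print where spark-core is actually loaded from. Call this from inside
    // the deployed Spring Boot application (e.g. at startup), not as a standalone main,
    // because the result depends on the Spring Boot launcher's class loader.
    public final class SparkCoreLocation {
        private SparkCoreLocation() {
        }

        public static void log() {
            URL location = SparkConf.class.getProtectionDomain().getCodeSource().getLocation();
            // Before the fix this tends to be a nested URL such as
            //   jar:file:/opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar!/
            // After the fix it should point at an unpacked file under java.io.tmpdir.
            System.out.println("spark-core loaded from: " + location);
        }
    }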