Tags: java, apache-spark, ivy

Spark Driver NoClassDefFoundError after upgrading Ivy to 2.5.2: how to fix without downgrading?


I recently upgraded the Apache Ivy package in my Spark container from version 2.5.1 to 2.5.2 and redeployed it inside an OpenShift cluster. The Spark master and workers start and run without issues. However, when I submit a job, the driver encounters a NoClassDefFoundError, which appears in the stderr file as shown below:

25/05/14 16:19:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/05/14 16:19:24 INFO SecurityManager: Changing view acls to: 1000700000
25/05/14 16:19:24 INFO SecurityManager: Changing modify acls to: 1000700000
25/05/14 16:19:24 INFO SecurityManager: Changing view acls groups to: 
25/05/14 16:19:24 INFO SecurityManager: Changing modify acls groups to: 
25/05/14 16:19:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: 1000700000; groups with view permissions: EMPTY; users with modify permissions: 1000700000; groups with modify permissions: EMPTY
25/05/14 16:19:24 INFO Utils: Successfully started service 'Driver' on port 39873.
25/05/14 16:19:24 INFO DriverWrapper: Driver address: 10.254.20.59:39873
25/05/14 16:19:24 INFO WorkerWatcher: Connecting to worker spark://Worker@10.254.20.59:38753
Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.ivy.plugins.resolver.DependencyResolver
    at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:182)
    at org.apache.spark.deploy.worker.DriverWrapper$.setupDependencies(DriverWrapper.scala:83)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.ivy.plugins.resolver.DependencyResolver
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
    at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
    ... 4 more

I checked the /opt/spark/jars folder inside the Spark container and confirmed that ivy-2.5.2.jar is present, so it’s not missing. I also found this Jira ticket (https://issues.apache.org/jira/browse/SPARK-44968), which describes a Spark issue that arose when Ivy was upgraded to 2.5.2. That particular problem was resolved by downgrading to 2.5.1. While my issue is not identical, downgrading also resolves it in my case.
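
To double-check, the following can be run inside the container to confirm the class really is inside that jar and that no leftover Ivy jars are shadowing it (assuming unzip is available in the image; jar tf ivy-2.5.2.jar works just as well):

    # Confirm the missing class is actually inside the Ivy jar on disk
    unzip -l /opt/spark/jars/ivy-2.5.2.jar | grep 'org/apache/ivy/plugins/resolver/DependencyResolver.class'

    # Check for leftover or duplicate Ivy jars that could shadow it
    ls -l /opt/spark/jars/ | grep -i ivy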

I'm wondering if there's a way to fix this issue without downgrading, or if there’s an incompatibility between Ivy 2.5.2 and Spark.

I’m fairly new to Spark and Java, so I may be missing something obvious. Any guidance would be greatly appreciated!

Thanks


Solution

  • Check how the Ivy dependency actually reaches the driver's classpath (the verification commands in the question are a good starting point). If that doesn't turn anything up, explicitly adding an older Ivy jar that still contains the missing class may work as a temporary workaround, as sketched below.
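
A minimal sketch of that temporary workaround: copy a known-good Ivy jar into the container yourself, then prepend it to the driver classpath with spark.driver.extraClassPath. The jar location, master URL, main class, and application jar below are placeholders for your own setup:

    # Pin an older Ivy jar ahead of /opt/spark/jars on the driver classpath.
    # /opt/spark/extra-jars/ivy-2.5.1.jar is a placeholder path; copy the jar
    # there first. com.example.MyApp and my-app.jar stand in for your job.
    spark-submit \
      --master spark://<master-host>:7077 \
      --deploy-mode cluster \
      --conf spark.driver.extraClassPath=/opt/spark/extra-jars/ivy-2.5.1.jar \
      --class com.example.MyApp \
      my-app.jar

Entries in spark.driver.extraClassPath are prepended to the driver's classpath, so the pinned jar wins over whatever sits in /opt/spark/jars. Treat this as a stopgap and remove it once a proper fix is in place.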