I am working with the DeepLearning4j library. I run everything on an HPC cluster, where I build a jar file and submit it with spark-submit. With version beta7 I didn't have any problems, but when I recently tried to upgrade to 1.0.0-M2, I got this error:
at org.nd4j.linalg.cpu.nativecpu.ops.NativeOpExecutioner.<init>(NativeOpExecutioner.java:79)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.nd4j.linalg.factory.Nd4j.initWithBackend(Nd4j.java:5129)
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5044)
at org.nd4j.linalg.factory.Nd4j.<clinit>(Nd4j.java:269)
at org.datavec.image.loader.NativeImageLoader.transformImage(NativeImageLoader.java:631)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:554)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:280)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:255)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:249)
at org.datavec.image.recordreader.BaseImageRecordReader.next(BaseImageRecordReader.java:247)
at org.datavec.image.recordreader.BaseImageRecordReader.nextRecord(BaseImageRecordReader.java:511)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.initializeUnderlying(RecordReaderDataSetIterator.java:194)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:341)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:421)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:53)
at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.entryPoint(NetworkRetrainingMain.java:55)
at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.main(NetworkRetrainingMain.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: ND4J is probably missing dependencies. For more information, please refer to: https://deeplearning4j.konduit.ai/nd4j/backend
at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:116)
at org.nd4j.nativeblas.NativeOpsHolder.<clinit>(NativeOpsHolder.java:37)
... 34 more
Caused by: java.lang.UnsatisfiedLinkError: no jnind4jcpu in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1800)
at org.bytedeco.javacpp.Loader.load(Loader.java:1402)
at org.bytedeco.javacpp.Loader.load(Loader.java:1214)
at org.bytedeco.javacpp.Loader.load(Loader.java:1190)
at org.nd4j.linalg.cpu.nativecpu.bindings.Nd4jCpu.<clinit>(Nd4jCpu.java:14)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:62)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:56)
at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:88)
... 35 more
Caused by: java.lang.UnsatisfiedLinkError: /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/linalg/cpu/nativecpu/bindings/linux-x86_64/libjnind4jcpu.so: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/linalg/cpu/nativecpu/bindings/linux-x86_64/libnd4jcpu.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1747)
... 44 more
Here is my pom.xml file:
<artifactId>deepLearningSimpleOne</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<dl4j-master.version>1.0.0-M2</dl4j-master.version>
<!-- Change the nd4j.backend property to nd4j-cuda-X-platform to use CUDA GPUs -->
<!-- <nd4j.backend>nd4j-cuda-10.2-platform</nd4j.backend> -->
<nd4j.backend>nd4j-native</nd4j.backend>
<java.version>1.8</java.version>
<shadedClassifier>bin</shadedClassifier>
<scala.binary.version>2.12</scala.binary.version>
<maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
<maven.minimum.version>3.3.1</maven.minimum.version>
<exec-maven-plugin.version>1.4.0</exec-maven-plugin.version>
<maven-shade-plugin.version>2.4.3</maven-shade-plugin.version>
<jcommon.version>1.0.23</jcommon.version>
<jfreechart.version>1.0.13</jfreechart.version>
<logback.version>1.1.7</logback.version>
<jcommander.version>1.27</jcommander.version>
<azure.hadoop.version>2.7.4</azure.hadoop.version>
<azure.storage.version>2.0.0</azure.storage.version>
<spark.version>2.4.8</spark.version>
<aws.sdk.version>1.11.109</aws.sdk.version>
<jackson.version>2.5.1</jackson.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>${exec-maven-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
<configuration>
<executable>java</executable>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>${maven-shade-plugin.version}</version>
<configuration>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>${shadedClassifier}</shadedClassifierName>
<createDependencyReducedPom>true</createDependencyReducedPom>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>org/datanucleus/**</exclude>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<!-- Added to enable jar creation using mvn command-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.3.0</version>
<configuration>
<archive>
<manifest>
<mainClass>fully.qualified.MainClass</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<!-- bind to the packaging phase -->
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>resources</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.nd4j</groupId>
<artifactId>${nd4j.backend}</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.datavec</groupId>
<artifactId>datavec-spark_${scala.binary.version}</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>dl4j-spark_${scala.binary.version}</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>dl4j-spark-parameterserver_${scala.binary.version}</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>${jcommander.version}</version>
</dependency>
<!-- Used for patent classification example -->
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>deeplearning4j-nlp</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>deeplearning4j-zoo</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.deeplearning4j</groupId>
<artifactId>deeplearning4j-core</artifactId>
<version>${dl4j-master.version}</version>
</dependency>
<dependency>
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.10.2</version>
</dependency>
</dependencies>
</project>
Can anyone help me, please? Thank you!
Caused by: java.lang.UnsatisfiedLinkError: /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/linalg/cpu/nativecpu/bindings/linux-x86_64/libjnind4jcpu.so: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/linalg/cpu/nativecpu/bindings/linux-x86_64/libnd4jcpu.so)
Your stack trace tells you exactly what the problem is.
This part in particular:
/lib64/libm.so.6: version `GLIBC_2.27' not found
That means the operating system this is running on is too old (or rather, its glibc version is): the native ND4J binaries bundled with 1.0.0-M2 were built against glibc 2.27, which your cluster's /lib64/libm.so.6 does not provide.
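You can confirm this on the cluster nodes before changing anything. A quick sketch (both commands are standard on glibc-based Linux systems; the 2.27 threshold is taken from the error message above):

```shell
# Print the glibc version the node ships. If it reports something older
# than 2.27, the default ND4J 1.0.0-M2 binaries will fail to load there.
ldd --version | head -n 1

# Equivalent check via getconf on glibc systems:
getconf GNU_LIBC_VERSION
```

Run this on a compute node (not just the login node), since spark-submit executes the native code wherever the executors land.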
There is, however, a binary that is still compatible with older releases. You can get it by adding the following dependency (adapted to match the pom you've posted):
<dependency>
<groupId>org.nd4j</groupId>
<artifactId>${nd4j.backend}</artifactId>
<version>${dl4j-master.version}</version>
<classifier>linux-x86_64-compat</classifier>
</dependency>
This classifier has been available since version 1.0.0-M1.1: https://deeplearning4j.konduit.ai/release-notes/1.0.0-m1.1
Breaking default compatibility with older OS versions was necessary to enable speedups that are only available on newer versions of glibc.