I'm trying to run a Java class that reads a GML file using TinkerPop's GMLReader, but when I run it with Spark it throws an exception.
I wrote some simple test code:
import java.io.IOException;
import com.tinkerpop.blueprints.impls.tg.TinkerGraph;
import com.tinkerpop.blueprints.util.io.gml.GMLReader;

public static void main(String[] args) throws IOException {
    TinkerGraph graph = new TinkerGraph();
    String in = "/home/salma/Desktop/celegansneural.gml";
    GMLReader.inputGraph(graph, in);
    System.out.println(graph);
}
The command I'm using to run the class:
root@salma-SATELLITE-C855-1EQ:/usr/local/spark# ./bin/spark-submit --class graph_example.WordCount --master local[2] ~/workspace/graph_example/target/graph_example-0.0.1-SNAPSHOT.jar
Error:
Exception in thread "main" java.lang.NoClassDefFoundError:
com/tinkerpop/blueprints/impls/tg/TinkerGraph
at graph_example.WordCount.main(WordCount.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.tinkerpop.blueprints.impls.tg.TinkerGraph
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
You have to provide the dependency that contains the TinkerGraph implementation. If I'm not mistaken, that class ships in the blueprints-core jar, so you run spark-submit as usual but add --jars /some/location/blueprints-core-2.6.0.jar.
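Combined with the command from your question, the full invocation would look roughly like this. The jar path below is only an illustration (a typical local Maven repository location); substitute wherever blueprints-core-2.6.0.jar actually lives on your machine:

```shell
# Hypothetical full command: same as before, plus --jars so the
# Blueprints classes are available on the driver classpath.
./bin/spark-submit \
  --class graph_example.WordCount \
  --master local[2] \
  --jars ~/.m2/repository/com/tinkerpop/blueprints/blueprints-core/2.6.0/blueprints-core-2.6.0.jar \
  ~/workspace/graph_example/target/graph_example-0.0.1-SNAPSHOT.jar
```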
It is explained in the official documentation:
When using spark-submit, the application jar along with any jars included with the --jars option will be automatically transferred to the cluster.
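As an alternative to passing --jars on every run, you could bundle the dependency into the application jar itself ("fat jar"). A sketch of what that might look like in the project's pom.xml using the maven-shade-plugin (the plugin version is illustrative; pick whatever is current for your setup):

```xml
<!-- Hypothetical pom.xml fragment: maven-shade-plugin repackages
     blueprints-core (and any other dependencies) into the
     application jar during "mvn package", so no --jars is needed. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

Then submit the resulting shaded jar exactly as before, without the --jars flag.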