Tags: spark-structured-streaming, apache-bahir

Why does query throw ClassCastException "SerializedOffset cannot be cast to org.apache.spark.sql.execution.streaming.LongOffset" with MQTT Source?


I am getting the following exception when I run my Spark Structured Streaming code:

18/12/05 15:00:38 ERROR StreamExecution: Query [id = 48ec92a0-811a-4d57-a65d-c0b9c754e093, runId = 5e2adff4-855e-46c6-8592-05e3557544c6] terminated with error
java.lang.ClassCastException: org.apache.spark.sql.execution.streaming.SerializedOffset cannot be cast to org.apache.spark.sql.execution.streaming.LongOffset
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource.getBatch(MQTTStreamSource.scala:152)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatch$2$$anonfun$apply$7.apply(StreamExecution.scala:614)

This exception is thrown every time I restart the query; it only runs cleanly if I delete the checkpoint before starting it.

The Spark Structured Streaming code is below; basically, I am just reading data from an MQTT topic and writing it to an Elasticsearch index.

// assuming `spark` is the active SparkSession
import java.sql.Timestamp
import spark.implicits._  // needed for the .as[(String, Timestamp)] encoder

spark
  .readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", "Employee")
  .option("username", "username")
  .option("password", "password")
  .option("clientId", "employee11")
  .load("tcp://localhost:8000")
  .as[(String, Timestamp)]   // MQTT text source yields (value, timestamp)
  .writeStream
  .outputMode("append")
  .format("es")              // elasticsearch-spark sink
  .option("es.resource", "spark/employee")
  .option("es.nodes", "localhost")
  .option("es.port", "9200")
  .start()
  .awaitTermination()
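
For completeness, the checkpoint referred to above is not set in this snippet; it would typically be configured via the checkpointLocation sink option or the spark.sql.streaming.checkpointLocation setting, for example (the MapR-FS path below is only a placeholder):

// Hypothetical path; the offsets the engine restores from this location on
// restart are what come back to the source as SerializedOffset.
spark.conf.set(
  "spark.sql.streaming.checkpointLocation",
  "maprfs:///user/spark/checkpoints/employee")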

The following dependencies are used; I am on the MapR distribution.

  "org.apache.spark" %% "spark-core" % "2.2.1-mapr-1803",
  "org.apache.spark" %% "spark-sql" % "2.2.1-mapr-1803",
  "org.apache.spark" %% "spark-streaming" % "2.2.1-mapr-1803",
  "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.2.1",
  "org.apache.bahir" %% "spark-streaming-mqtt" % "2.2.1",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "6.3.2"

The spark-submit command:

/opt/mapr/spark/spark-2.2.1/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class <MAIN_CLASS> \
  --jars spark-sql-streaming-mqtt_2.11-2.2.1.jar,org.eclipse.paho.client.mqttv3-1.1.0.jar,elasticsearch-spark-20_2.11-6.3.2.jar,mail-1.4.7.jar \
  myjar_2.11-0.1.jar

Any help on this will be appreciated.


Solution

  • It seems to be a bug in Apache Bahir: MQTTTextStreamSource.getBatch (MQTTStreamSource.scala:152) casts the offsets it receives directly to LongOffset, but when a query is restarted from a checkpoint the recovered offsets arrive as SerializedOffset, so the cast throws the ClassCastException. That is also why the query only starts cleanly after the checkpoint is deleted.
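
For illustration, here is a minimal sketch in Scala (not Bahir's actual patch) of how a source's getBatch could tolerate offsets recovered from a checkpoint by converting them instead of casting; it relies only on Spark's internal LongOffset and SerializedOffset classes, and the helper name is made up.

import org.apache.spark.sql.execution.streaming.{LongOffset, Offset, SerializedOffset}

// Convert whatever offset type the engine hands back (LongOffset while the
// query is running, SerializedOffset after a restart from the checkpoint's
// offset log) into a LongOffset instead of casting it.
def toLongOffset(offset: Offset): LongOffset = offset match {
  case lo: LongOffset       => lo
  case so: SerializedOffset => LongOffset(so.json.toLong)  // LongOffset serializes as a plain number
  case other                => sys.error(s"Unexpected offset type: $other")
}

Until a build of spark-sql-streaming-mqtt with such a conversion is available, the practical workaround is the one already observed above: delete the checkpoint directory before restarting, at the cost of losing recovery from the previous offsets.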