apache-spark, spark-shell

How to kill a spark shell via Spark's REST API?


I'm running Spark version 2.0.1 and want to kill a spark shell via the REST API (I cannot use any other method, such as the yarn commands). I managed to get the application ID (via the spark-master:8080/json/ endpoint), but I could not find any API that lets me kill an application based on its ID (I'm familiar with the http://spark-master:6066/v1/submissions/kill/ endpoint, but a spark shell does not have a driver ID).
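
For reference, the application ID comes from a call like the one below (a sketch, assuming the default master web UI port of 8080; on my setup the running applications are listed under an activeapps field in the response):

    # Dump the standalone master's status JSON; running applications
    # (and their IDs) appear under the "activeapps" field
    curl -s http://spark-master:8080/json/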

So is there a way to kill an application based on its ID, or should I just keep chasing pavements?


Solution

  • After some digging (which included installing tcpdump on the Spark master server to listen to the REST requests coming from the Spark UI), I managed to find the correct request:

    curl -X POST http://{spark-master-server}:{spark-server-port}/app/kill/ -d 'id=spark-shell-app-id&terminate=true'
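
For convenience, the lookup and the kill request can be chained so the application ID never has to be copied by hand. This is only a sketch: it assumes the kill endpoint is served by the same master web UI on port 8080, that jq is installed, and that the shell shows up under activeapps with the default name "Spark shell":

    # Look up the spark shell's application ID on the master web UI
    # (field and application names as observed on my setup)
    APP_ID=$(curl -s http://spark-master:8080/json/ \
      | jq -r '.activeapps[] | select(.name == "Spark shell") | .id')

    # Post the kill request found above
    curl -X POST "http://spark-master:8080/app/kill/" -d "id=${APP_ID}&terminate=true"

Note that this is the same request the master's web UI sends when an application is killed from the page, so it is not part of the documented REST API and may change between Spark versions.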