The code below successfully creates a Spark context when I submit it with spark-submit, and it runs fine. But when I kill the application under Running Applications in the Apache Spark web UI, the application state shows KILLED, yet it keeps printing "Test application" on the screen even after it was killed.

Screenshots show: the application running in the Apache Spark web UI; the application killed using the "kill" button in the web UI; the message still printing on screen after the application was killed.
from pyspark import SparkConf
from pyspark import SparkContext

if __name__ == "__main__":
    conf = SparkConf().setAppName("TEST")
    conf.set("spark.scheduler.mode", "FAIR")
    sc = SparkContext(conf=conf)

    # The driver keeps looping even after the application is killed from the web UI
    while True:
        print("Test application")
I found a way to solve my issue with the code below. Thanks for all your responses.
from pyspark import SparkConf
from pyspark import SparkContext

if __name__ == "__main__":
    conf = SparkConf().setAppName("TEST")
    conf.set("spark.scheduler.mode", "FAIR")
    sc = SparkContext(conf=conf)

    while True:
        # Exit the loop once the underlying SparkContext has been stopped,
        # which is what happens when the application is killed from the web UI
        if sc._jsc.sc().isStopped():
            break
        print("Test application")