python apache-spark pyspark rdd dstream

Count the number of elements in each PySpark DStream


I'm looking for a way to count the number of elements (that is, the number of records in each batch's RDD) received in the DStream I've created in PySpark. If you know a way that could help me, I would be pleased. Thanks.


Solution

  • I used the code below: I map each element to a 1 and then sum those ones, just like a simple word count, but instead of counting words I count every element received.

    I did that with the code below, but if you have any other solution, feel free to add it; thanks.

    from pyspark.streaming import StreamingContext
    from pyspark import SparkContext
    
    # Create a local StreamingContext with two worker threads and a batch interval of 10 seconds
    sc = SparkContext('local[2]', 'Networkcount')
    ssc = StreamingContext(sc, 10)
    
    # Create a DStream that will connect to hostname:port, here localhost:7776
    data_received = ssc.socketTextStream("127.0.0.1", 7776)
    
    # Map every element to a 1, then sum the ones to get the per-batch element count
    lines = data_received.map(lambda data: 1)
    count = lines.reduce(lambda x, y: x + y)
    count.pprint()
    
    # Start the streaming computation and wait for it to terminate
    ssc.start()
    ssc.awaitTermination()
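
    As an alternative, DStream also exposes a built-in count() transformation that emits, for each batch, a single element holding that batch's record count, so the manual map/reduce isn't strictly needed. A minimal sketch, reusing the same socket source and batch interval as above:

    from pyspark.streaming import StreamingContext
    from pyspark import SparkContext
    
    sc = SparkContext('local[2]', 'Networkcount')
    ssc = StreamingContext(sc, 10)
    
    # Same socket source as in the snippet above
    data_received = ssc.socketTextStream("127.0.0.1", 7776)
    
    # count() returns a new DStream whose single element per batch
    # is the number of records in that batch's RDD
    data_received.count().pprint()
    
    ssc.start()
    ssc.awaitTermination()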