
Simulate packet loss in UDP in Python


I am supposed to simulate a packet loss rate of 10^-2 (that is, 0.01) in a Stop-and-Wait protocol, which means that 1 out of every 100 transmitted packets gets lost. Suppose I'm sending 1000 packets: how do I drop exactly 1 random packet out of every 100 packets sent throughout the transmission?


Solution

  • Having a rate of 0.01 doesn't mean that exactly 1 out of 100 packets gets dropped. It means that each packet has a 1% chance of getting lost. Under the assumption that losses are independent of each other, the actual number of lost packets will follow a Binomial(N, p) distribution.

    For each packet you send, draw a Uniform(0,1) random number; if it is less than the loss probability p (0.01 in your case), treat that packet as lost, otherwise it goes through. This approach scales if you increase or decrease N, the total number of packets. The expected number of losses will be N * p, but if you repeat the experiment multiple times there will be variability. A sketch of this per-packet check is shown below.
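
    Here is a minimal sketch of that per-packet Bernoulli check. The `send_with_loss` helper and its `send` callback are illustrative names, not part of any library; in your program `send` would wrap your actual UDP call, e.g. `sock.sendto(pkt, addr)`.

    ```python
    import random

    LOSS_RATE = 0.01  # p: probability that any single packet is lost

    def should_drop(p=LOSS_RATE):
        # One Bernoulli trial per packet: random() is a Uniform(0,1) draw.
        return random.random() < p

    def send_with_loss(packets, send):
        # `send` is a stand-in for your real transmit call,
        # e.g. lambda pkt: sock.sendto(pkt, addr) for a UDP socket.
        lost = 0
        for pkt in packets:
            if should_drop():
                lost += 1  # simulate the loss by skipping transmission
                continue
            send(pkt)
        return lost

    # Over 1000 packets the number of drops follows Binomial(1000, 0.01):
    # about 10 on average, but individual runs will vary.
    if __name__ == "__main__":
        lost = send_with_loss([b"data"] * 1000, send=lambda pkt: None)
        print(f"lost {lost} of 1000 packets")
    ```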