Tags: interrupt, simpy, batching

Simulating a system of batching jobs with interruptible set-up/switch-on times using SimPy


I am new to SimPy and have a problem combining batching jobs with an interruptible set-up time. Could you please help me?

I would like to create a system with servers that need time to set up before being ready to serve. A server starts setting up whenever there are enough customers, M (2, 3, ...), waiting in the queue. If the number of customers in the system reaches the maximum K (50), an arriving customer balks.

When a batch (a group of M customers) leaves the system, we check whether another M customers (a full batch) are waiting to be served. If so, we keep the server ON; otherwise, we turn the server off immediately.

I found code for almost the same problem in the python-simpy Google group: a Covid-test simulation that uses Store resources, and an answer by Michael R. Gibbs on interrupting set-up time with Container resources: https://groups.google.com/g/python-simpy/c/iFYaDlL4fq0

Interrupt an earlier timeout event in Simpy

I tried to combine the two pieces of code, but it didn't work.

Example, with M = 2, K = 50:

  1. Customer 1 arrives and waits.

  2. Customer 2 arrives; now there are 2 customers, enough for a batch, so a server is requested.

  3. Server 1 starts SETUP, which takes t1 secs.

  4. Customer 3 arrives and waits.

  5. Customer 4 arrives; again 2 customers, so another server is requested.

  6. Server 2 starts SETUP, which takes t1 secs.

  7. Server 1 is ON.

  8. Customers 1 and 2 occupy server 1.

  9. Customers 1 and 2 complete service and leave the system.

  10. Customers 3 and 4 occupy server 1 (because when server 1 finishes, server 2 is still in the setup process).

  11. Server 2 (still in SETUP mode) is turned off.

  12. ... Customer 100 arrives, sees the system already has 50 customers, and balks.


Solution

  • I broke customer arrivals into two parts: a first queue where customers wait until there are enough of them to make a batch. When there are enough customers, I make the batch, popping the batched customers from the batching queue and putting the batch in a processing queue. I count the customers in both queues to decide whether an arriving customer aborts entry.

    When a batch is put in the processing queue, I also start up a server. This means that the number of batches in the processing queue equals the number of servers starting up, and that when a server finishes starting up, there will always be a batch to process. Since a server never waits for a batch, a simple list is good enough for the queue.

    When a server starts up, it grabs a batch and removes itself from the list of starting servers. After the server finishes processing a batch, it checks whether there is another batch in the processing queue. If so, it grabs that batch and keeps processing, but also kills the server that was starting up to handle it. If there are no batches in the processing queue, it shuts down.

    Here is the code. In the log you should see the queues max out and customers abort, but also servers start to shut down towards the end.

    """
    Simulation of servers processing batches
    
    Customers enter a queue where they wait for
    enough customers to make a batch
    
    If the there are too many customers in the queues
    the arriving customer will abort
    
    When a batch is made, it is put into a second 
    processing queue where the batch waits to be processed.
    
    When a batch is put into the processing queue, it
    starts a server.  The server has a start up delay 
    then loops by seizing a batch, process the batch, release
    the batch, checking if another batch is in the 
    processing queue.  If there is another batch, stop a server
    that is starting up and process the batch, else end loop
    and shutdown server
    
    Programmer: Michael R. Gibbs
    
    """
    
    import simpy
    import random
    
    max_q_size = 50
    batch_size = 2
    server_start_time = 55
    processing_time = lambda : random.triangular(5,20,10)
    arrival_gap = lambda : random.triangular(1,1,1)
    
    # there is no waiting so normal lists are good enough
    batching_q = list()
    processing_q = list()
    server_q = list() # servers that are still starting up
    
    class Server():
        """
        Server that process batches
    
        Has two states: starting up, and batch processing
        """
        def __init__(self, id, env, processing_q, server_q):
    
            self.id = id
            self.env = env
            self.processing_q = processing_q
            self.server_q = server_q
    
            self.start_process = self.env.process(self.start_up())
    
        def start_up(self):
            """
            starts up the server, then starts processing batches

            start up can be interrupted, stopping the server
            """

            # start up
            try:
                print(f'{self.env.now} server {self.id} starting up')
                yield self.env.timeout(server_start_time)

                print(f'{self.env.now} server {self.id} started')

                # finished starting up, so leave the list of starting servers
                self.server_q.remove(self)

                self.env.process(self.process())

            except simpy.Interrupt:
                print(f'{self.env.now} server {self.id} has been interrupted')
    
        def process(self):
            """
            process batches
            keeps going as long as there are batches in the queue

            If it starts a second batch, it also interrupts a starting-up server
            """

            while True:
                print(f'{self.env.now} server {self.id} starting batch process')
                batch = self.processing_q.pop(0)
                yield self.env.timeout(processing_time())

                print(f'{self.env.now} server {self.id} finished batch process')

                if len(self.processing_q) > 0:
                    # more batches to do,
                    # steal a batch from a server that is still starting up

                    s = self.server_q.pop() # lifo
                    s.stop()

                else:
                    print(f'{self.env.now} server {self.id} no more batches, shutting down')
                    break

        def stop(self):
            """
            Interrupts server start up, stopping the server
            """
            try:
                self.start_process.interrupt()
            except RuntimeError:
                # the start-up process already finished; nothing to stop
                pass
    
    def gen_arrivals(env, batching_q, processing_q, server_q):
        """
        Generate arriving customers

        If the queues are too big the customer will abort

        If there are enough customers, create a batch and start a server
        """
    
        id = 1
        while True:
            yield env.timeout(arrival_gap())
    
            q_size = len(batching_q) + (batch_size * len(processing_q))
            if q_size >= max_q_size:
                print(f'{env.now} customer arrived and aborted, q len: {q_size}')
            
            else:
                print(f'{env.now} customer has arrived, q len: {q_size}')
    
                customer = object()
                batching_q.append(customer)
    
                # check if a batch can be created
                while len(batching_q) >= batch_size:
                    batch = list()
                    while len(batch) < batch_size:
                        batch.append(batching_q.pop(0))
    
                    # put batch in processing q
                    processing_q.append(batch)
    
                    # start server
                    server = Server(id, env, processing_q, server_q)
                    id += 1
                    server_q.append(server)
    
    
    # boot up sim
    env = simpy.Environment()
    env.process(gen_arrivals(env, batching_q, processing_q, server_q))
    
    env.run(100)