python, logging, multiprocessing, concurrent.futures

Python multiprocessing: Does print/logging to stdout block main process?


When using Python multiprocessing, for example with a process pool executor as in the sample below, can logging to the same console from the main process and the worker processes cause any blocking in the main process and/or the worker processes?

By logging I mean using Python's standard logging module, without a QueueHandler or anything similar.

import multiprocessing
from concurrent import futures

self.worker_pool = futures.ProcessPoolExecutor(
    max_workers=2, mp_context=multiprocessing.get_context("forkserver"))
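
For context, a fuller, runnable sketch of the setup being asked about; the worker function, log format, and task count here are made up for illustration. Workers log straight to stdout with the standard logging module and no QueueHandler (forkserver is Unix-only, as in the snippet above):

import concurrent.futures
import logging
import multiprocessing
import sys

def init_logging():
    # Called in the parent and, via the pool's initializer, once per worker:
    # attach a plain StreamHandler that writes directly to stdout.
    logging.basicConfig(level=logging.INFO, stream=sys.stdout,
                        format="%(processName)s %(message)s")

def work(i):
    logging.getLogger(__name__).info("hello from task %d", i)
    return i

if __name__ == "__main__":
    init_logging()
    ctx = multiprocessing.get_context("forkserver")
    with concurrent.futures.ProcessPoolExecutor(
            max_workers=2, mp_context=ctx, initializer=init_logging) as pool:
        list(pool.map(work, range(4)))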

Solution

  • The target of logging is stdout, which is an I/O device like any other, so writing to it can certainly block the current thread if no counter-measures are taken (e.g., using async routines). It is also a shared resource, which means that, internally, there is a synchronization mechanism (e.g., a mutex) to avoid interleaved writes.

    You can verify this behavior by swapping the target file from sys.stdout to, for example, a FIFO that has no reader: opening a FIFO for writing blocks until a reader opens the other end, so these scripts hang indefinitely.

    a.py

    # Opening the FIFO for writing blocks until a reader opens the other end.
    fd = open('myfifo', 'w')
    print('test', file=fd)
    

    b.py

    from concurrent.futures import ProcessPoolExecutor

    def run(i):
        # Each worker blocks on open() because the FIFO has no reader.
        fd = open('myfifo', 'w')
        print("test", file=fd)
        print(f"written from {i}")

    n_workers = 2

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=n_workers) as exc:
            # One task per worker; the with-block waits for them to finish.
            list(exc.map(run, range(n_workers)))
    

    Test scenario:

    $ mkfifo myfifo
    $ python a.py # or b.py
    

    You will observe that neither terminates: both block on the FIFO because it has no reader. Standard out is, in principle, no different. Whether that is a problem for your application is for you to decide, but often it does not matter, since stdout is buffered and consumed promptly.
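
    If the possibility of blocking (or interleaved output) does matter, the usual counter-measure, which the question alludes to, is to have workers enqueue records with a QueueHandler and let a single QueueListener in the main process do the actual writing. A minimal sketch, assuming Python 3.7+ (worker function, log format, and task count are illustrative):

    import concurrent.futures
    import logging
    import logging.handlers
    import multiprocessing
    import sys

    def init_worker(queue):
        # Workers only put records on the queue; they never write to stdout themselves.
        root = logging.getLogger()
        root.handlers[:] = [logging.handlers.QueueHandler(queue)]
        root.setLevel(logging.INFO)

    def work(i):
        logging.getLogger(__name__).info("hello from task %d", i)

    if __name__ == "__main__":
        ctx = multiprocessing.get_context("forkserver")
        queue = ctx.Queue()
        # A single listener thread in the main process drains the queue
        # and writes everything to stdout.
        listener = logging.handlers.QueueListener(
            queue, logging.StreamHandler(sys.stdout))
        listener.start()
        with concurrent.futures.ProcessPoolExecutor(
                max_workers=2, mp_context=ctx,
                initializer=init_worker, initargs=(queue,)) as pool:
            list(pool.map(work, range(4)))
        listener.stop()

    With this arrangement a worker only blocks if the queue itself backs up, and the main process remains the single writer to stdout.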