Tags: python, logging, multiprocessing, multiprocess

Python logging to cmd with multiprocessing


I am having issues with info log messages from a spawned process not coming through. The code below shows the warning message coming from the worker, but not the info message. I am running on Windows 10 with Python 3.10.5.

from multiprocessing import Process, Queue
import logging

class worker():
    def __init__(self, log_queue):
        self.logger = logging.getLogger(__name__)
        self.logger.setLevel(logging.INFO)

        self.logger.info('worker info')
        self.logger.warning('worker warning')
    

class boss():
    def __init__(self):
        self.logger = logging.getLogger()
        self.logger.info('boss here')


if __name__ == '__main__':    
    queue = Queue()

    logging.basicConfig(level=logging.DEBUG)

    worker_p = Process(target=worker, args=(queue,))
    worker_p.start()
    boss_instance = boss()
    worker_p.join()

The code above gives:

INFO:root:boss here
worker warning

Is there a simple way to get the messages from the worker to show in the command window/main log?

I tried using a logging queue and handlers, but it seems I am just missing something to get the spawned process to emit its info logs.


Solution

  • The simplest way is probably to initialize a separate logger for each process. I think what is missing from your code is that you never add any handlers to the logger inside the worker process. Because Windows starts child processes with spawn, the logging.basicConfig(...) call inside the __main__ guard never runs in the worker, so its records fall back to logging's last-resort handler, which only emits WARNING and above. That is why the bare 'worker warning' line shows up while the info message is dropped. A quick way to see this from inside the child is sketched just below.
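
    A quick, hypothetical check (the show_logging_state name is mine, not part of your code) that prints the logging state from inside a spawned child, assuming the Windows spawn start method:

    from multiprocessing import Process
    import logging


    def show_logging_state():
        # Runs in the spawned child: basicConfig() from the parent's
        # __main__ guard never executed here, so no handlers are attached.
        print('root handlers in child:', logging.getLogger().handlers)
        # With no handlers found, records fall through to logging.lastResort,
        # which only emits WARNING and above.
        print('lastResort level:', logging.getLevelName(logging.lastResort.level))


    if __name__ == '__main__':
        logging.basicConfig(level=logging.DEBUG)
        p = Process(target=show_logging_state)
        p.start()
        p.join()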

    An easy solution is to configure a separate logger, with its own handler, in each process:

    from multiprocessing import Process, Queue
    import logging
    
    
    class worker():
        def __init__(self, log_queue):
            # Configure the logger inside the process: with spawn on Windows,
            # the child does not inherit any handlers from the main process.
            self.logger = logging.getLogger(self.__class__.__name__)
            self.logger.addHandler(logging.StreamHandler())
            self.logger.setLevel(logging.INFO)
    
            self.logger.info('worker info')
            self.logger.warning('worker warning')
    
    
    class boss():
        def __init__(self):
            self.logger = logging.getLogger(self.__class__.__name__)
            self.logger.addHandler(logging.StreamHandler())
            self.logger.setLevel(logging.INFO)
            self.logger.info('boss here')
    
    
    if __name__ == '__main__':
        queue = Queue()
        worker_p = Process(target=worker, args=(queue,))
        worker_p.start()
        boss_instance = boss()
        worker_p.join()
    

    This should be fine for just logging to standard output, but if you want to add file logging (or any other shared destination), then you need to implement a logging queue: https://docs.python.org/3/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
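
    As a rough sketch of that queue-based approach, adapted to the worker class from the question (QueueHandler and QueueListener are from the standard logging.handlers module; the rest of the wiring is an assumption about how you would hook it into your code):

    from multiprocessing import Process, Queue
    import logging
    import logging.handlers


    class worker():
        def __init__(self, log_queue):
            # Inside the child the only handler is a QueueHandler: records are
            # shipped over the queue instead of being printed here.
            self.logger = logging.getLogger(self.__class__.__name__)
            self.logger.addHandler(logging.handlers.QueueHandler(log_queue))
            self.logger.setLevel(logging.INFO)

            self.logger.info('worker info')
            self.logger.warning('worker warning')


    if __name__ == '__main__':
        queue = Queue()

        logging.basicConfig(level=logging.DEBUG)

        # The listener runs in the main process and forwards everything from the
        # queue to the root logger's handlers (the stream handler basicConfig set up).
        listener = logging.handlers.QueueListener(queue, *logging.getLogger().handlers)
        listener.start()

        worker_p = Process(target=worker, args=(queue,))
        worker_p.start()
        worker_p.join()

        listener.stop()

    With this, the worker's records are formatted and written by the main process, so they should show up in the same command window alongside the main log output.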