Right now I have a central module in a framework that spawns multiple processes using the Python 2.6 `multiprocessing` module. Because it uses `multiprocessing`, there is a module-level multiprocessing-aware logger, `LOG = multiprocessing.get_logger()`. Per the docs, this logger does not have process-shared locks, so it is possible to garble things up in `sys.stderr` (or whatever file handle) by having multiple processes writing to it simultaneously.
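For concreteness, here is a minimal sketch of that setup; the worker body and process count are made up, not from my actual framework:

```python
import logging
import multiprocessing

# Module-level, multiprocessing-aware logger, as described above.
LOG = multiprocessing.get_logger()

def worker(task_id):
    # With no process-shared lock, records from several processes
    # can interleave on the shared stderr stream.
    LOG.info('worker %d starting', task_id)

if __name__ == '__main__':
    # Attach a stderr handler so the records are actually emitted.
    multiprocessing.log_to_stderr(logging.INFO)
    procs = [multiprocessing.Process(target=worker, args=(i,))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```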
The issue I have now is that the other modules in the framework are not multiprocessing-aware. The way I see it, I need to make all dependencies on this central module use multiprocessing-aware logging. That's annoying within the framework, let alone for all clients of the framework. Are there alternatives I'm not thinking of?
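To make the problem concrete, a typical dependency looks something like this (module name hypothetical); nothing routes its records through the multiprocessing-aware logger, and I'd have to change every module like it:

```python
# some_dependency.py -- hypothetical module elsewhere in the framework.
import logging

# Ordinary stdlib logger; it knows nothing about multiprocessing,
# so its output is not coordinated between processes at all.
log = logging.getLogger(__name__)

def do_work():
    log.info('doing work')  # can interleave with other processes' writes
```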
The only way I can think of to deal with this non-intrusively is to spawn each worker process so that its log goes to a separate file descriptor (e.g., a pipe), and have the controller process coalesce the entries on the fly: periodically `select` from the pipes' file descriptors, perform a merge-sort on the available log entries, and flush to the centralized log. Repeat.
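Here is a rough sketch of what I mean, assuming a Unix fork-based start (so raw file descriptors can be passed to children) and made-up worker/file names. Note the per-round sort only approximates a full merge-sort: entries arriving in a later `select` round are sorted within their own batch only.

```python
import os
import select
import time
import multiprocessing

def worker(write_fd, worker_id):
    # Hypothetical worker: each entry is one line with a leading
    # timestamp so the controller can merge entries chronologically.
    for i in range(3):
        entry = '%f worker-%d event %d\n' % (time.time(), worker_id, i)
        os.write(write_fd, entry.encode('ascii'))
        time.sleep(0.01)
    os.close(write_fd)

if __name__ == '__main__':
    readers, procs = [], []
    for wid in range(3):
        r, w = os.pipe()
        p = multiprocessing.Process(target=worker, args=(w, wid))
        p.start()
        os.close(w)              # parent keeps only the read ends
        readers.append(r)
        procs.append(p)

    central = open('central.log', 'w')
    partial = dict((fd, '') for fd in readers)  # per-pipe partial lines
    open_fds = list(readers)
    while open_fds:
        ready, _, _ = select.select(open_fds, [], [])
        entries = []
        for fd in ready:
            data = os.read(fd, 4096)
            if not data:             # EOF: worker closed its write end
                open_fds.remove(fd)
                os.close(fd)
                if partial[fd]:
                    entries.append(partial[fd] + '\n')
            else:
                text = partial[fd] + data.decode('ascii')
                lines = text.split('\n')
                partial[fd] = lines.pop()  # keep any incomplete tail
                entries.extend(l + '\n' for l in lines if l)
        # Sort this batch by leading timestamp, then flush to the
        # centralized log.
        entries.sort(key=lambda line: float(line.split(None, 1)[0]))
        central.writelines(entries)
        central.flush()
    central.close()
    for p in procs:
        p.join()
```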