I want to create a separate Python process with preloaded functions that can be accessed/called later with different arguments multiple times (to overcome the GIL). This is similar to multiprocessing, but creating a new process via multiprocessing is too slow, because (as I understand it) the process is unloaded after its execution finishes. I'm not sure what exactly I need to use.
How can I start a process multiple times with different arguments without having to create it again?
I tried using multiprocessing, but I can't start the same process several times (AssertionError: cannot start a process twice). I also checked celery, but I'm not sure if it's the right tool for this.
Example:
from multiprocessing import Process

def add(a, b):
    print(a + b)

if __name__ == "__main__":
    p = Process(target=add, args=(5, 5))
    p.start()
    p.join()
    p.start()  # AssertionError: cannot start a process twice
You can create a separate worker process that loads the necessary functions and runs a loop listening for incoming requests, then communicate with it from the main process using a suitable inter-process communication mechanism such as pipes, sockets, or shared memory. Here is a minimal version using multiprocessing queues (one for tasks, one for results):
import multiprocessing as mp

def add(a, b):
    return a + b

def worker(task_q, result_q):
    # Stay alive and handle one request per loop iteration
    while True:
        task, args = task_q.get()
        if task == 'add':
            result_q.put(add(*args))
        else:
            break

if __name__ == '__main__':
    task_q = mp.Queue()
    result_q = mp.Queue()
    p = mp.Process(target=worker, args=(task_q, result_q))
    p.start()
    task_q.put(('add', [5, 5]))  # Compute add(5, 5)
    result = result_q.get()      # Get the result (10)

    task_q.put(('add', [3, 3]))  # Compute add(3, 3)
    result = result_q.get()      # Get the result (6)

    task_q.put(('stop', []))     # Stop the worker process
    p.join()
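
If you don't want to hand-roll the queue protocol, the standard library already offers long-lived worker processes that are started once and reused for every call. A minimal sketch with concurrent.futures.ProcessPoolExecutor (the add function and pool size here are just illustrative):

from concurrent.futures import ProcessPoolExecutor

def add(a, b):
    return a + b

if __name__ == '__main__':
    # The worker processes are created once and reused for each submit()
    with ProcessPoolExecutor(max_workers=1) as pool:
        print(pool.submit(add, 5, 5).result())  # 10
        print(pool.submit(add, 3, 3).result())  # 6

Each submit() sends the arguments to an already-running worker, so you pay the process startup cost only once instead of on every call.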