When using code like this:

    from concurrent.futures import ThreadPoolExecutor

    def execute_run(list_out):
        ...  # do something

    pool = ThreadPoolExecutor(6)
    for i in list1:
        for j in list2:
            pool.submit(execute_run, list_out)
    pool.shutdown(wait=True)  # ThreadPoolExecutor has no join(); shutdown(wait=True) waits for all tasks

assuming the threads modify list_out, do they do so in a synchronized (thread-safe) manner?
If your goal is to calculate something using multiprocessing, it's better not to share state. I suggest using a simple map from multiprocessing if possible:
    from multiprocessing import Pool

    # build the full list of (i, j) argument pairs up front
    input_list = []
    for i in list1:
        for j in list2:
            input_list.append((i, j))

    p = Pool()
    result_list = p.map(do_something, input_list)
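For reference, here is a minimal self-contained sketch of the same idea; list1, list2, and the body of do_something are placeholder values for illustration only:

    from multiprocessing import Pool

    def do_something(pair):          # each call receives one (i, j) tuple
        i, j = pair
        return i * j

    if __name__ == "__main__":       # guard needed when multiprocessing spawns workers
        list1 = [1, 2, 3]
        list2 = [10, 20]
        input_list = [(i, j) for i in list1 for j in list2]
        with Pool() as p:
            result_list = p.map(do_something, input_list)
        print(result_list)           # [10, 20, 20, 40, 30, 60]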
map works like a for-loop:
    def naive_map(input_list, do_something):
        result = []
        for i in input_list:
            result.append(do_something(i))
        return result
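For example, with throwaway values:

    >>> naive_map([1, 2, 3], lambda x: x * 2)
    [2, 4, 6]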
So, if you want to use a function that accepts several arguments, you can wrap it in a lambda that unpacks the tuple. Note that tuple parameters in a lambda, e.g. lambda (x, y): ..., are Python 2 only syntax; in Python 3, unpack inside the body instead:

    >>> def your_function(v1, v2):
    ...     return v1 + v2
    >>> f = lambda pair: your_function(*pair)
    >>> list(map(f, [(1, 2), (3, 4), (5, 6)]))
    [3, 7, 11]
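If you are on Python 3.3+, multiprocessing.Pool also provides starmap, which unpacks each tuple into positional arguments for you, so no wrapper lambda is needed:

    from multiprocessing import Pool

    def your_function(v1, v2):
        return v1 + v2

    if __name__ == "__main__":
        with Pool() as p:
            print(p.starmap(your_function, [(1, 2), (3, 4), (5, 6)]))  # [3, 7, 11]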