python, matlab, parallel-processing, parfor

Pass Python object as argument to function in "parfeval"


I am trying to pass a Python object as an argument to a function that I am evaluating in the background with parfeval. The Python object is an instance of a Python class, which I detail below. However, to reproduce the error, I will use a Python dictionary as an example. Note that simply using struct(pydict) would not work for my case, because I would lose all the attributes and methods of the Python class.

Assume the Python dictionary is

o = py.dict(pyargs('soup',3.57,'bread',2.29,'bacon',3.91,'salad',5.00));

and the function is

function t = testFunc(x)
t = x{'soup'};
end

If I evaluate the function, I get the correct answer:

>> testFunc(o)
ans =
3.5700

However, if I use parfeval, I get the following error:

>> f = parfeval(@testFunc,1,o);
>> fetchOutputs(f)
Error using parallel.Future/fetchOutputs
One or more futures resulted in an error.
Caused by:
Error using testFunc (line 2)
Invalid or deleted object.

Is there a workaround to this error that doesn't require recoding my whole Python class? Here is a preview of the object I want to pass as an argument to parfeval:

clt = 
    Python Client with properties:
    enforce_enums: 1
    api_key: [1×45 py.str]
    request_number: [1×1 py.int]
    logger: [1×1 py.logging.Logger]
    session: [1×1 py.authlib.integrations.httpx_client.oauth2_client.OAuth2Client]
    token_metadata: [1×1 py.tda.auth.TokenMetadata]
    <tda.client.synchronous.Client object at 0x000001ECA08EAE50>

I didn't find any restriction in the documentation saying that parfeval input arguments cannot be arbitrary values: https://www.mathworks.com/help/matlab/ref/parfeval.html

"X1,...,Xm — Input arguments comma-separated list of variables or expressions... Input arguments, specified as a comma-separated list of variables or expressions"


Solution

  • One of the limitations of the MATLAB->Python support is that Python objects cannot be serialized. parfeval (and other parallel constructs) require serialization to transfer data from one MATLAB process to another.

    You might be able to work around this by having each worker build the data structure directly and storing/accessing it via parallel.pool.Constant, like this (sketches for applying the same idea to your Python Client follow the example):

    % The function handle runs on each worker, so the py.dict is built in the
    % worker process itself rather than being serialized and sent to it.
    oC = parallel.pool.Constant(@() py.dict(pyargs('soup',3.57,'bread',2.29,'bacon',3.91,'salad',5.00)));
    fetchOutputs(parfeval(@(c) c.Value{'salad'}, 1, oC))
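
    The same pattern should carry over to your Python Client if it can be rebuilt from plain MATLAB data (an API key, a token file path, and so on), so you should not have to recode the class. A minimal sketch, where makeTdaClient, apiKey, tokenPath, and someRequest are hypothetical placeholders for whatever you currently use to construct and call clt:

    % Sketch only: makeTdaClient stands in for your existing construction code,
    % e.g. a wrapper around the py.tda.* call that builds clt. The char/string
    % inputs serialize fine; the Client itself is created on each worker.
    cltC = parallel.pool.Constant(@() makeTdaClient(apiKey, tokenPath));

    % Inside the parfeval'd function, reach the per-worker Client via .Value.
    f = parfeval(@(c) someRequest(c.Value), 1, cltC);
    result = fetchOutputs(f);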
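
    For a one-off call, a simpler variant of the same idea is to build the Python object inside the function that parfeval runs, so that only plain MATLAB values cross the process boundary. A self-contained sketch using the dictionary example (the trade-off is that the object is rebuilt on every call, which parallel.pool.Constant avoids):

    f = parfeval(@buildAndQuery, 1, 'soup');
    fetchOutputs(f)   % returns 3.5700

    function t = buildAndQuery(key)
        % The dictionary is created on the worker itself, so nothing
        % Python-related is ever serialized between MATLAB processes.
        o = py.dict(pyargs('soup',3.57,'bread',2.29,'bacon',3.91,'salad',5.00));
        t = o{key};
    end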