Tags: c++, c, client, opc-ua, open62541

open62541 client fails when calling method with custom datatype input argument


I'm using open62541 to connect to an OPC UA server, and I'm trying to call methods that a certain object on that server provides. These methods take custom types as input arguments; for example, the following method takes a structure of three booleans:

    <opc:Method SymbolicName="SetStatusMethodType" ModellingRule="Mandatory">
      <opc:InputArguments>
        <opc:Argument Name="Status" DataType="VisionStatusDataType" ValueRank="Scalar"/>
      </opc:InputArguments>
      <opc:OutputArguments />
    </opc:Method>

Here, VisionStatusDataType is the following structure:

  <opc:DataType SymbolicName="VisionStatusDataType" BaseType="ua:Structure">
    <opc:ClassName>VisionStatus</opc:ClassName>
    <opc:Fields>
      <opc:Field Name="Camera" DataType="ua:Boolean" ValueRank="Scalar"/>
      <opc:Field Name="StrobeController" DataType="ua:Boolean" ValueRank="Scalar"/>
      <opc:Field Name="Server" DataType="ua:Boolean" ValueRank="Scalar"/>
    </opc:Fields>
  </opc:DataType>

Now, when calling the method, I encode the data into a UA_ExtensionObject and wrap it in a UA_Variant to pass to UA_Client_call. The encoding looks like this:

void encode(const QVariantList& vecqVar, size_t& nIdx, const DataType& dt, std::back_insert_iterator<std::vector<UAptr<UA_ByteString>>> itOut)
{
    if (dt.isSimple())
    {
        auto&& qVar = vecqVar.at(nIdx++);
        auto&& uaVar = convertToUaVar(qVar, dt.uaType());
        auto pOutBuf = create<UA_ByteString>();
        auto nStatus = UA_encodeBinary(uaVar.data, dt.uaType(), pOutBuf.get());
        statusCheck(nStatus);
        itOut = std::move(pOutBuf);
    }
    else
    {
        for (auto&& dtMember : dt.members())
            encode(vecqVar, nIdx, dtMember, itOut);
    }
}

UA_Variant ToUAVariant(const QVariant& qVar, const DataType& dt)
{
    if (dt.isSimple())
        return convertToUaVar(qVar, dt.uaType());
    else
    {
        std::vector<UAptr<UA_ByteString>> vecByteStr;
        auto&& qVarList = qVar.toList();
        size_t nIdx = 0UL;
        encode(qVarList, nIdx, dt, std::back_inserter(vecByteStr));

        auto pExtObj = UA_ExtensionObject_new();
        pExtObj->encoding = UA_EXTENSIONOBJECT_ENCODED_BYTESTRING;
        auto nSizeAll = std::accumulate(vecByteStr.cbegin(), vecByteStr.cend(), 0ULL, [](size_t nSize, const UAptr<UA_ByteString>& pByteStr) {
            return nSize + pByteStr->length;
        });
        auto&& uaEncoded = pExtObj->content.encoded;
        uaEncoded.typeId = dt.uaType()->typeId;
        uaEncoded.body.length = nSizeAll;
        auto pData = uaEncoded.body.data = new UA_Byte[nSizeAll];
        nIdx = 0UL;
        for (auto&& pByteStr : vecByteStr)
        {
            memcpy_s(pData + nIdx, nSizeAll - nIdx, pByteStr->data, pByteStr->length);
            nIdx += pByteStr->length;
        }

        UA_Variant uaVar;
        UA_Variant_init(&uaVar);
        UA_Variant_setScalar(&uaVar, pExtObj, &UA_TYPES[UA_TYPES_EXTENSIONOBJECT]);
        return uaVar;
    }
}

The DataType class is a wrapper for the UA_DataType structure; the original open62541 type can be accessed via DataType::uaType().

Now, once I have the variant (containing the extension object), the method call looks like this:

    auto uavarInput = ToUAVariant(qvarArg, dtInput);
    UA_Variant* pvarOut;
    size_t nOutSize = 0UL;
    auto nStatus = UA_Client_call(m_pClient, objNode.nodeId(), m_uaNodeId, 1UL, &uavarInput, &nOutSize, &pvarOut);

The returned status is 2158690304 (0x80AB0000), i.e. BadInvalidArgument according to UA_StatusCode_name.

Is there really something wrong with the method argument? Are we supposed to send ExtensionObjects, or what data type should the variant contain? Is it possible that the server itself (created using the .NET OPC/UA stack) is not configured correctly?

N.B., the types here are custom types; that is, the encoding is done manually (see above) by storing the byte representations of all members next to each other in a UA_ByteString - just the opposite of what I do when reading variables or output arguments, which works just fine.


Solution

  • The problem is the typeId of the encoded object. For the server to understand the received data, it needs the NodeId of the type's binary encoding node, not the NodeId of the type itself. That encoding node can be found by following the type's HasEncoding reference (its target is named "Default Binary"):

            auto pRequest = create<UA_BrowseRequest>();
            auto pDescr = pRequest->nodesToBrowse = UA_BrowseDescription_new();
            pRequest->nodesToBrowseSize = 1UL;
            pDescr->nodeId = m_uaNodeId;
            pDescr->resultMask = UA_BROWSERESULTMASK_ALL;
            pDescr->browseDirection = UA_BROWSEDIRECTION_BOTH;
            pDescr->referenceTypeId = UA_NODEID_NUMERIC(0, UA_NS0ID_HASENCODING);
    
            auto response = UA_Client_Service_browse(m_pClient, *pRequest);
            for (auto k = 0UL; k < response.resultsSize; ++k)
            {
                auto browseRes = response.results[k];
                for (auto n = 0UL; n < browseRes.referencesSize; ++n)
                {
                    auto browseRef = browseRes.references[n];
                    if (ToQString(browseRef.browseName.name).contains("Binary"))
                    {
                        // Deep-copy the NodeId: the response owns this memory,
                        // which is released when the response is cleared below.
                        UA_NodeId_copy(&browseRef.nodeId.nodeId, &m_nodeBinaryEnc);
                        break;
                    }
                }
            }
            UA_BrowseResponse_clear(&response);
    

    Once you have that NodeId, you assign it to the typeId of the extension object's content.encoded member:

            auto pExtObj = UA_ExtensionObject_new();
            pExtObj->encoding = UA_EXTENSIONOBJECT_ENCODED_BYTESTRING;
            auto nSizeAll = std::accumulate(vecByteStr.cbegin(), vecByteStr.cend(), 0ULL, [](size_t nSize, const UAptr<UA_ByteString>& pByteStr) {
                return nSize + pByteStr->length;
            });
            auto&& uaEncoded = pExtObj->content.encoded;
            uaEncoded.typeId = dt.encoding();
            uaEncoded.body.length = nSizeAll;
            auto pData = uaEncoded.body.data = new UA_Byte[nSizeAll];
            nIdx = 0UL;
            for (auto&& pByteStr : vecByteStr)
            {
                memcpy_s(pData + nIdx, nSizeAll - nIdx, pByteStr->data, pByteStr->length);
                nIdx += pByteStr->length;
            }