Tags: c++, excel, vba, variant

Is it expected that Application.Run() in Excel reduces the number of dimensions of a returned xloper12 array? And is there a way to prevent this?


I wrote an XLL library to provide some functions written with the Excel C API.

When I return an array (as an xloper12 of type xltypeMulti), I notice a strange behaviour. If the function returns an n x m array (where both n and m are greater than 1), it is correctly understood by Excel both in the sheet and in VBA when called with Application.Run().

On the other hand, if the returned array is 1 x m, it is understood correctly in the sheet, but Application.Run() reduces the number of dimensions from 2 to 1, so in VBA the array appears as a one-dimensional array with m elements.

I can handle this in VBA, but I wonder if there is a different way to return the VARIANT that is understood by both the sheet and VBA and preserves the number of dimensions of the variable.

As an example, I wrote two functions: one that returns a 1x2 array and one that returns a 2x2 array. The registration (following the Microsoft API reference manual) is:

    Excel12f(xlfRegister, 0, 10,
        (LPXLOPER12)&p_xllName,                       // pxModuleText: name of the XLL
        (LPXLOPER12)TempStr12(L"returns1x2table"),    // pxProcedure: exported function
        (LPXLOPER12)TempStr12(L"Q"),                  // pxTypeText: returns an XLOPER12
        (LPXLOPER12)TempStr12(L"returns1x2table"),    // pxFunctionText: name shown in Excel
        (LPXLOPER12)TempStr12(L""),                   // pxArgumentText: no arguments
        (LPXLOPER12)TempInt12(1),                     // pxMacroType: 1 = worksheet function
        (LPXLOPER12)TempStr12(L"Test"),               // pxCategory
        (LPXLOPER12)TempStr12(L""),                   // pxShortcutText
        (LPXLOPER12)TempStr12(L""),                   // pxHelpTopic
        (LPXLOPER12)TempStr12(L"Returns a variant 1x2")); // pxFunctionHelp

The second function is defined in the same way and called "returns2x2table".

The implementation simply allocates an xloper12 (which I free afterwards in the xlAutoFree12 callback, since xlbitDLLFree is set), sets its type to xltypeMulti and initializes the number of dimensions.

 extern "C" __declspec(dllexport) LPXLOPER12 WINAPI returns1x2table()
    {
        xloper12* result=(xloper12*)malloc(sizeof(xloper12));
        result->xltype = xltypeMulti | xlbitDLLFree;
        result->val.array.rows = 1;
        result->val.array.columns = 2;
        result->val.array.lparray = (xloper12*)malloc(sizeof(xloper12) * 2);
        result->val.array.lparray[0].xltype = xltypeNum;
        result->val.array.lparray[0].val.num = 3.1415;

        result->val.array.lparray[1].xltype = xltypeNum;
        result->val.array.lparray[1].val.num = 2.7182;
        return result;
    }

    extern "C" __declspec(dllexport) LPXLOPER12 WINAPI returns2x2table()
    {
        xloper12* result=(xloper12*)malloc(sizeof(xloper12));
        result->xltype = xltypeMulti | xlbitDLLFree;
        result->val.array.rows = 2;
        result->val.array.columns = 2;
        result->val.array.lparray = (xloper12*)malloc(sizeof(xloper12) * 4);
        result->val.array.lparray[0].xltype = xltypeNum;
        result->val.array.lparray[0].val.num = 3.1415;

        result->val.array.lparray[1].xltype = xltypeNum;
        result->val.array.lparray[1].val.num = 2.7182;
    
        result->val.array.lparray[2].xltype = xltypeNum;
        result->val.array.lparray[2].val.num = 1.4142;

        result->val.array.lparray[3].xltype = xltypeNum;
        result->val.array.lparray[3].val.num = 1.7321;
        return result;
    }

In the spreadsheet the two functions return the expected shape. The first one, returns1x2table(), returns:

| 3.1415   | 2.7182   |

and the second one, returns2x2table(), returns:

| 3.1415   | 2.7182   |
| 1.4142   | 1.7321   |

So it seems that the functions are defined correctly.

But when I call in VBA:

    Sub a()
        Dim v1 As Variant
        Dim v2 As Variant

        v1 = Application.Run("returns1x2table")
        v2 = Application.Run("returns2x2table")
    End Sub

I can see in the Watches window that v1 is a "Variant/Variant(1 to 2)", hence the two dimensions were reduced to one and the values must be obtained with v1(1) and v1(2) instead of v1(1,1) and v1(1,2).

v2 instead is as expected: "Variant/Variant(1 to 2, 1 to 2)" and the elements are v2(1,1), v2(1,2), v2(2,1) and v2(2,2).
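
For completeness, this is roughly what I mean by handling it in VBA: probe for the second dimension, which raises an error on a one-dimensional array (a rough sketch, run after the assignments above):

    Dim is2D As Boolean
    On Error Resume Next
    is2D = (UBound(v1, 2) >= LBound(v1, 2)) 'UBound(v1, 2) raises error 9 on a 1-D array
    On Error GoTo 0
    Debug.Print is2D 'prints False: Application.Run collapsed the array to 1-D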

Is this expected? Is it inevitable that Application.Run() tries to convert the object to the minimum number of dimensions needed to store all the values?

Or is there a different way to declare the function so that this transformation doesn't happen?

Thank you in advance!


Solution

Consider the following VBA code:

    Option Explicit
    
    Function MyUDF(arr() As Variant) As Long
        MyUDF = GetArrayDimsCount(arr)
    End Function
    
    Function OneRow2DArray() As Variant()
        Dim res() As Variant: ReDim res(1 To 1, 1 To 2)
        OneRow2DArray = res
    End Function
    
    Public Function GetArrayDimsCount(ByRef arr As Variant) As Long
        Const MAX_DIMENSION As Long = 60 'VB limit
        Dim dimension As Long
        Dim tempBound As Long
        '
        On Error GoTo FinalDimension
        For dimension = 1 To MAX_DIMENSION
            tempBound = LBound(arr, dimension)
        Next dimension
    FinalDimension:
        GetArrayDimsCount = dimension - 1
    End Function
    
    Sub TestDimensions()
        Debug.Print GetArrayDimsCount(OneRow2DArray)
        Debug.Print GetArrayDimsCount(Application.Run("OneRow2DArray"))
        Debug.Print GetArrayDimsCount(Evaluate("OneRow2DArray()"))
        ActiveCell.Formula = "=MyUDF(OneRow2DArray())"
        Debug.Print ActiveCell.Value2
    End Sub
    

Running TestDimensions returns:

    2
    2
    1
    1
    

So, it seems that Application.Run is not the one causing this behaviour, but rather the fact that an Evaluate call must be going on behind the scenes when a User Defined Function (UDF) receives an array as argument.

This actually goes further. If we were to change OneRow2DArray to:

    Function OneRow2DArray() As Variant
        Dim res() As Variant: ReDim res(1 To 1, 1 To 2)
        Dim res2(0 To 0) As Variant: res2(0) = res
        Dim res3(0 To 0, 0 To 0, 0 To 0) As Variant: res3(0, 0, 0) = res2
        OneRow2DArray = res3
    End Function
    

which is basically an array inside an array inside another array, then Evaluate("OneRow2DArray()") still returns the lowest array as a 1D array with 2 values and simply discards the top 2 arrays. Same for the UDF.

So, in certain scenarios, not only is the number of dimensions reduced, but any array that contains a single other array is also removed. This is because Evaluate cannot return a nested array. For example, Evaluate("NestedArray()") returns Error 2015 for:

    Function NestedArray() As Variant
        Dim arr(1 To 1, 1 To 2) As Variant
        Dim wrapper(0 To 1) As Variant: wrapper(0) = arr
        NestedArray = wrapper
    End Function
    

Same for a UDF - Excel does not have the concept of nested arrays, only VBA does.

I don't think there is a way to avoid this behaviour. I can only think of the following two solutions:

1. convert the 1D array back to 2D using VBA code by copying each element into a 2D array (see the sketch below)
2. convert the 1D array back to 2D using VBA code that manipulates the underlying SAFEARRAY structure. This would only be useful if the array is really large and it would be too slow to copy element by element
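
For reference, a minimal sketch of option 1, assuming the caller first checks the dimension count (e.g. with GetArrayDimsCount above) and only re-wraps results that come back one-dimensional (the helper name is mine):

    'Re-wrap a collapsed 1-D Variant array as a 1 x m two-dimensional array
    Function To2DRow(ByRef v As Variant) As Variant
        Dim lb As Long: lb = LBound(v)
        Dim res() As Variant
        ReDim res(1 To 1, 1 To UBound(v) - lb + 1)
        Dim j As Long
        For j = lb To UBound(v)
            res(1, j - lb + 1) = v(j) 'copy element by element
        Next j
        To2DRow = res
    End Function

Usage would then be something like v1 = To2DRow(Application.Run("returns1x2table")).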