I am using Google's C++ testing framework (NuGet package Microsoft.googletest.v140.windesktop.msvcstl.static.rt-dyn, version 1.8.1.7) and I am parametrizing tests. I run into a problem when the array of parameters gets too big. In a simple case with two parameters, 2 and 96, like the following:
INSTANTIATE_TEST_CASE_P(
    PricingTests,
    Bulk1,
    ::testing::Values(2, 96));
there's no problem because we have
template <typename T1, typename T2>
internal::ValueArray2<T1, T2> Values(T1 v1, T2 v2) {
  return internal::ValueArray2<T1, T2>(v1, v2);
}
and
template <typename T1, typename T2>
class ValueArray2 {
 public:
  ValueArray2(T1 v1, T2 v2) : v1_(v1), v2_(v2) {}

  template <typename T>
  operator ParamGenerator<T>() const {
    const T array[] = {static_cast<T>(v1_), static_cast<T>(v2_)};
    return ValuesIn(array);
  }

  ValueArray2(const ValueArray2& other) : v1_(other.v1_), v2_(other.v2_) {}

 private:
  // No implementation - assignment is unsupported.
  void operator=(const ValueArray2& other);

  const T1 v1_;
  const T2 v2_;
};
in the framework, which together handle an array of exactly two parameters. Google was nice enough to implement everything up to ValueArray50. But I have an array of more than 6000 parameters, each representing a different test case that I have to cover. How can I still use ::testing::Values in the case of arrays with such huge sizes?
This was changed in v1.10.0, almost 5 years ago: "New variadic implementation for gtest-param-test" made ::testing::Values variadic, so the 50-value limit is gone. You need to update GoogleTest.
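That said, with 6000+ cases you probably do not want to spell them out in Values() at all. ::testing::ValuesIn accepts a container (or an iterator range, or a C array), so the number of parameters is unbounded, and it exists in 1.8.x as well. Here is a minimal sketch of that approach, assuming your Bulk1 fixture takes an int parameter and using a hypothetical LoadPricingCases() helper that produces the 6000+ values at runtime:

#include <vector>

#include "gtest/gtest.h"

// Assumed fixture: each int identifies one pricing case.
// Link against gtest_main or provide your own main().
class Bulk1 : public ::testing::TestWithParam<int> {};

TEST_P(Bulk1, PricesCase) {
  const int testCase = GetParam();
  // ... run the pricing logic for this case and check the result ...
  EXPECT_GE(testCase, 0);
}

// Hypothetical helper that builds the full parameter list at runtime
// (read it from a file, generate it, etc.).
std::vector<int> LoadPricingCases() {
  std::vector<int> cases;
  for (int i = 0; i < 6000; ++i) cases.push_back(i);
  return cases;
}

// ValuesIn copies the values out of the container, so a temporary is fine.
// With GoogleTest >= 1.10, prefer INSTANTIATE_TEST_SUITE_P; the old macro
// is deprecated there.
INSTANTIATE_TEST_CASE_P(PricingTests,
                        Bulk1,
                        ::testing::ValuesIn(LoadPricingCases()));

If you do update to a newer GoogleTest, the same ValuesIn call works unchanged; the variadic Values() mainly removes the 50-argument ceiling for hand-written lists.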