Consider the following code:
#include <stdio.h>
#include <stdint.h>   /* SIZE_MAX */
#include <stddef.h>   /* size_t */

int main(void) {
    size_t cnt = SIZE_MAX;
    /* long[cnt] is a variable length array type, so sizeof
       evaluates its operand at run time. */
    size_t sz = sizeof(long[cnt]);
    printf("%zu\n", sz);
}
6.5.3.4/p2 says:

    If the type of the operand is a variable length array type, the operand is evaluated; otherwise, the operand is not evaluated and the result is an integer constant.
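To make that rule concrete, here is a minimal sketch of mine (not part of the original example): a side effect in the operand occurs only when the operand has VLA type.

#include <stdio.h>
#include <stddef.h>

int main(void) {
    int i = 0;
    size_t a = sizeof(int[i + 1]);     /* VLA type: i + 1 IS evaluated */
    size_t b = sizeof(i++);            /* non-VLA operand: i++ is NOT evaluated */
    printf("%zu %zu i=%d\n", a, b, i); /* i is still 0 here */
}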
The question is whether evaluating such a too-large sizeof is well defined. Since size_t is an unsigned type, the Standard guarantees that unsigned arithmetic wraps around rather than overflowing (unlike signed overflow, which is undefined behavior).
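For instance, the wraparound itself is clearly well defined in isolation:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    size_t x = SIZE_MAX;
    x += 1;             /* well defined: unsigned arithmetic wraps to 0 */
    printf("%zu\n", x); /* prints 0 */
}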
The main issue I'm confused about is that

size_t sz = sizeof(long[SIZE_MAX]); // error: size of unnamed array is too large

does not even compile (Godbolt live example).
sizeof (long[SIZE_MAX]) won't compile because attempting to form the type long[SIZE_MAX] violates a requirement of the standard. From §6.2.5 p28 of the C23 draft standard:

    A complete type shall have a size that is less than or equal to SIZE_MAX.
The requirement in question is not listed under a "Constraints" heading, so compilers are not required to issue a diagnostic for violating it. In this case both GCC and Clang choose to fail with an error, but more generally sizeof (long[SIZE_MAX]) has undefined behavior, since violating a "shall" requirement that appears outside of an explicit constraints clause is undefined behavior (§4 p2). Still, I'd like to think that a reasonable implementation would refuse to compile an attempt to declare an array it cannot support, with an error like the one above.
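If you need to compute such a size portably, you can check the multiplication up front rather than rely on a diagnostic. A defensive sketch (checked_long_array_size is a hypothetical helper of mine, not anything the standard prescribes):

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Returns false if count * sizeof(long) would exceed SIZE_MAX,
   i.e. if the array type long[count] could not exist. */
static bool checked_long_array_size(size_t count, size_t *out) {
    if (count > SIZE_MAX / sizeof(long))
        return false;                 /* multiplication would wrap */
    *out = count * sizeof(long);
    return true;
}

int main(void) {
    size_t sz;
    if (checked_long_array_size(SIZE_MAX, &sz))
        printf("size: %zu\n", sz);
    else
        puts("array type too large to represent");
}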
It appears that this language did not appear in previous standards, but the Standards Committee determined "...that all interpret the current standard that huge objects make the behavior implicitly undefined." The Committee therefore views this wording not as introducing new undefined behavior, but as a clarification that makes the existing situation explicit.