I was hoping somebody could explain why
#include <stdio.h>
#include <stdbool.h>
int main(void) {
    printf("size of bool %zu\n", sizeof(bool));
    printf("size of int %zu\n", sizeof(int));
}
outputs:
size of bool 1
size of int 4
I've looked at http://pubs.opengroup.org/onlinepubs/009695399/basedefs/stdbool.h.html which seems to indicate that bool is essentially a macro for _Bool, and that true and false are just macros for integer constants. If they expand to integers, why is bool not the same size as an int?
I'm asking because it took us far too long to debug a program for which we did not allocate enough memory.
The _Bool type in C99 (exposed as bool by a macro in stdbool.h) doesn't have a standard-defined size, but according to section 6.2.5 of the C99 standard:
2 An object declared as type _Bool is large enough to store the values 0 and 1.
In C, the smallest addressable object (aside from bit-fields) is the char, which is at least 8 bits wide, and sizeof(char) is always 1.
_Bool and bool therefore have a sizeof of at least 1, and in most implementations I've seen, sizeof(bool) and sizeof(_Bool) are both exactly 1.
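To see the "large enough to store the values 0 and 1" wording in action, here is a minimal sketch (the 1 you'll likely see for sizeof(bool) is the typical value, not a guarantee; the standard only guarantees it is at least 1):

#include <stdio.h>
#include <stdbool.h>

int main(void) {
    bool b = 42;  /* conversion to _Bool yields 0 or 1, so b becomes 1 */
    printf("b = %d\n", b);                         /* prints 1 */
    printf("sizeof(bool) = %zu\n", sizeof(bool));  /* typically 1; guaranteed >= 1 */
    return 0;
}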
If you take a look at GCC's stdbool.h, you'll get this:
#define bool _Bool
#if __STDC_VERSION__ < 199901L && __GNUC__ < 3
typedef int _Bool;
#endif
#define false 0
#define true 1
So if you compile with an older version of GCC under an older C standard, int is used as the _Bool type.
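If you want to check which branch that #if takes on your setup, a quick diagnostic sketch (my own, not part of the header) is to print the macros it tests:

#include <stdio.h>
#include <stdbool.h>

int main(void) {
#ifdef __STDC_VERSION__
    printf("__STDC_VERSION__ = %ld\n", (long)__STDC_VERSION__);
#else
    printf("__STDC_VERSION__ is not defined\n");
#endif
#ifdef __GNUC__
    printf("__GNUC__ = %d\n", __GNUC__);
#endif
    /* cast to unsigned long so the printf also works in pre-C99 modes without %zu */
    printf("sizeof(bool) = %lu\n", (unsigned long)sizeof(bool));
    return 0;
}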
As an interesting aside, check this out:
#include <stdio.h>
#include <stdbool.h>
int main(void) {
    printf("%zu\n", sizeof(_Bool));
    printf("%zu\n", sizeof(true));
    printf("%zu\n", sizeof(false));
    return 0;
}
Output:
λ > ./a.out
1
4
4
GCC 4.2.4, Clang 3.0, and GCC 4.7.0 all produce the same output. As trinithis points out, sizeof(true) and sizeof(false) yield larger sizes because true and false expand to the integer constants 1 and 0, which have type int, so each expression is simply sizeof(int).
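To confirm where the 4 comes from, a small follow-up sketch: cast the macro back to bool and the size drops to sizeof(bool). (Note that C23 changes true and false to have type bool, so a compiler in C23 mode will print 1 for sizeof(true) as well.)

#include <stdio.h>
#include <stdbool.h>

int main(void) {
    printf("%zu\n", sizeof(true));        /* sizeof(int): true expands to the int constant 1 in C99/C11 */
    printf("%zu\n", sizeof((bool)true));  /* sizeof(bool): typically 1 */
    return 0;
}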