I was trying to calculate the length of a string using sizeof(), but I noticed something unexpected:
```c
#include <stdio.h>

int main() {
    printf("%lu\n", sizeof("Hello")); // Output: 6
    return 0;
}
```
I expected the output to be 5 (since "Hello" has 5 characters), but the result is 6.

Why does sizeof("Hello") return 6 instead of 5? How is sizeof different from strlen() in this case?
1. String Literals Include a Null Terminator
When you write "Hello", the compiler automatically appends a null character ('\0') at the end to mark the end of the string. So "Hello" in memory is actually:

['H', 'e', 'l', 'l', 'o', '\0']

Thus, sizeof("Hello") counts all 6 bytes, including the '\0'.
2. Difference Between sizeof and strlen
- sizeof("Hello") returns 6, evaluated at compile time, because it includes the null terminator.
- strlen("Hello") returns 5, computed at run time, because it counts characters only up to (not including) the '\0'.
Also, %lu is the wrong conversion specifier when the argument is the return value of strlen or the result of the sizeof operator, since the type is size_t, not unsigned long. %zu is the correct conversion specifier for size_t.
Example:
```c
#include <stdio.h>
#include <string.h>

int main() {
    printf("sizeof: %zu\n", sizeof("Hello")); // Output: 6
    printf("strlen: %zu\n", strlen("Hello")); // Output: 5
    return 0;
}
```
How to Avoid This Confusion?
- Use strlen() when you need the actual string length.
- Use sizeof() only when you need the memory size of an object, such as the capacity of a character array.