I have two strings to compare, and I thought using `strncmp` would be better than `strcmp` because I know the length of one of the strings.
```c
char *a = "hel";
char *b = "he"; // in my real code this is scanned, so it is user dependent
for (size_t i = 0; i < 5; i++) {
    printf("strncmp: %d\n", strncmp(a, b, i));
}
```
I expected the output to be

```
0
0
0
1 // which is the output of printf("strcmp: %d\n", strcmp(a, b));
1
```

since the strings only start to differ in the 4th iteration (`i = 3`), but instead I got
```
0
0
0
108 // guessing this is due to 'l' == 108 in ASCII
108
```
and I don't understand why, as the man page says:

> The `strcmp()` function compares the two strings s1 and s2. It returns an integer less than, equal to, or greater than zero if s1 is found, respectively, to be less than, to match, or be greater than s2. The `strncmp()` function is similar, except it only compares the first (at most) n bytes of s1 and s2.

which means it should stop after reaching a `'\0'` and thus return just `1` (like `strcmp` does), wouldn't it?
From the quote you've posted:

> ... It returns an integer less than, equal to, or greater than zero ...

Both `1` and `108` are integers greater than 0. There is no guarantee that the function returns exactly `1` or `-1`; only the sign of the result is specified. Your implementation evidently returns the difference of the first mismatching bytes, `'l' - '\0' == 108`, which is a perfectly valid "integer greater than zero".