When converting an int like so:
char a[256];
sprintf(a, "%d", 132);
what's the best way to determine how large a should be? I assume manually setting it is fine (as I've seen it used everywhere), but how large should it be? What's the largest int value possible on a 32 bit system, and is there some tricky way of determining that on the fly?
The maximum possible number of bits in an int is CHAR_BIT * sizeof(int), and a decimal digit is "worth" at least 3 bits, so a loose upper bound on the space required for an arbitrary int is (CHAR_BIT * sizeof(int)) / 3 + 3. That +3 is one for the fact that we rounded down when dividing, one for the sign, and one for the nul terminator.
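As a sketch of that bound in code (the macro name INT_DECIMAL_BOUND is just my own label for it, not anything standard):

#include <limits.h>
#include <stdio.h>
#include <string.h>

/* bits/3 digits (rounded down), +1 for the rounding, +1 for the sign, +1 for the nul */
#define INT_DECIMAL_BOUND ((CHAR_BIT * sizeof(int)) / 3 + 3)

int main(void)
{
    char a[INT_DECIMAL_BOUND];
    sprintf(a, "%d", INT_MIN);  /* worst case: the most digits plus a sign */
    printf("\"%s\" uses %zu of %zu bytes\n", a, strlen(a) + 1, INT_DECIMAL_BOUND);
    return 0;
}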
If by "on a 32 bit system" you mean that you know int is 32 bits, then you need 12 bytes: 10 for the digits, one for the sign, and one for the nul terminator.
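For instance, assuming a 32 bit int, the worst case is INT_MIN:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    char a[12];                 /* 10 digits + sign + nul, assuming a 32 bit int */
    sprintf(a, "%d", INT_MIN);  /* "-2147483648": 11 characters plus the nul */
    printf("%s\n", a);
    return 0;
}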
In your specific case, where the int to be converted is 132, you need 4 bytes. Badum, tish.
Where fixed-size buffers can be used with a reasonable bound, they are the simpler option. I not-so-humbly submit that the bound above is reasonable (13 bytes instead of 12 for a 32 bit int, and 24 bytes instead of 21 for a 64 bit int). But for difficult cases, in C99 you can just call snprintf to get the size, then malloc that much. That's overkill for a case as simple as this one.
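For completeness, a sketch of that C99 approach (int_to_string is just a name I made up): calling snprintf with a NULL buffer and zero size returns the number of characters the conversion would need, so you can allocate exactly that plus one for the nul.

#include <stdio.h>
#include <stdlib.h>

/* Returns a freshly malloc'd decimal string for n, or NULL on failure. */
char *int_to_string(int n)
{
    int len = snprintf(NULL, 0, "%d", n);  /* size needed, not counting the nul */
    if (len < 0)
        return NULL;
    char *s = malloc((size_t)len + 1);
    if (s != NULL)
        snprintf(s, (size_t)len + 1, "%d", n);
    return s;
}

int main(void)
{
    char *s = int_to_string(132);
    if (s != NULL) {
        printf("%s\n", s);
        free(s);
    }
    return 0;
}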