c, int128

Squaring 2**32 into a 128-bit type fails


I would expect the following C code to produce 8, 16 for the first printf (it does), and 1, 0 for the second (however, it produces 0, 0). Why is that? (This is on macOS on an Intel Core i7.)

#include <stdio.h>

typedef unsigned long long ll;
typedef unsigned __int128 bigint;

int main(void) {
  ll q;
  bigint s;
  q = 4294967296ULL; // 2**32
  s = q*q;
  printf("%d, %d\n", sizeof(ll), sizeof(unsigned __int128));
  printf("%lld, %lld\n", s / q, s % q);
}

Edit: After modifying the code to what is below, the results are the same. (Note: in the original code, I was not trying to print any 128-bit values.)

int main(void) {
  ll q;
  bigint s;
  q = 4294967296ULL; // 2**32
  s = q*q;
  printf("%zu, %zu\n", sizeof(ll), sizeof(unsigned __int128));
  printf("%llu, %llu\n", (ll)(s / q), (ll)(s % q)); 
}

Solution

  • In s = q*q;, q is a 64-bit unsigned long long, and the multiplication is performed with 64-bit arithmetic. Since q has the value 2**32, the mathematical product, 2**64, does not fit in a 64-bit unsigned long long. It wraps modulo 2**64, so the computed result is 0.

    Thus s is assigned the value 0, and this is what is printed (aside from the problems with the printf conversion specifiers noted in the comments).

    You can request 128-bit arithmetic by converting at least one of the operands to the desired type:

    s = (bigint) q * q;