Tags: c, optimization, average, numerics

Take the average of two signed numbers in C


Let us say we have two signed integers a and b in C. How do we find the most accurate mean value of the two?

I would prefer a solution that does not rely on any machine-, compiler-, or toolchain-specific behavior.

The best I have come up with is:

    (a / 2) + (b / 2) + !!(a % 2) * !!(b % 2)

Is there a solution that is more accurate? Faster? Simpler?

What if we know a priori that one is larger than the other?

Thanks.

D


Editor's Note: The OP expects answers that are not subject to integer overflow when the input values are close to the limits of the C int type (INT_MIN / INT_MAX). This was not stated in the original question, but it matters when giving an answer.
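
For illustration, here is a minimal sketch of the hazard this note refers to (the function name naive_avg is ours, not the OP's):

    #include <limits.h>

    int naive_avg(int a, int b)
    {
        /* If a and b are both close to INT_MAX (or INT_MIN), a + b overflows,
           and signed integer overflow is undefined behavior in C. */
        return (a + b) / 2;   /* e.g. naive_avg(INT_MAX, INT_MAX - 1) is UB */
    }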


Solution

  • Edit: version fixed by @chux - Reinstate Monica:

    if ((a < 0) == (b < 0)) {  // a, b have the same sign: a + b could overflow
      return a/2 + b/2 + (a%2 + b%2)/2;
    } else {                   // a, b have opposite signs: a + b cannot overflow
      return (a+b)/2;
    }
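
    For reference, here is a self-contained sketch wrapping the same logic in a function, with a few spot checks (the function name avg and the test values are illustrative, not part of the original answer):

        #include <limits.h>
        #include <stdio.h>

        /* Average of two ints, truncated toward zero, with no intermediate overflow. */
        static int avg(int a, int b)
        {
            if ((a < 0) == (b < 0))              /* same sign: a + b could overflow */
                return a/2 + b/2 + (a%2 + b%2)/2;
            else                                 /* opposite signs: a + b cannot overflow */
                return (a + b) / 2;
        }

        int main(void)
        {
            printf("%d\n", avg(INT_MAX, INT_MAX));      /* INT_MAX */
            printf("%d\n", avg(INT_MIN, INT_MIN + 1));  /* INT_MIN + 1 */
            printf("%d\n", avg(1, -2));                 /* 0, same as (a+b)/2 */
            return 0;
        }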
    

    Original answer (I'd have deleted it if it hadn't been accepted).

    a/2 + b/2 + (a%2 + b%2)/2
    

    It seems the simplest one fitting the bill of making no assumptions about implementation characteristics (it does depend on C99, which specifies the result of / as "truncated toward 0", whereas that was implementation-defined in C90).

    It has the advantage of requiring no test (and thus no costly jumps), and all divisions and remainders are by 2, so the compiler can implement them with bit-twiddling techniques.
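
    As a side note on why the fixed version above adds a branch: when a and b have opposite signs, this branch-free expression can differ by one from the truncated-toward-zero value of (a+b)/2. A small illustration (the values are chosen arbitrarily):

        #include <stdio.h>

        int main(void)
        {
            int a = 1, b = -2;
            /* Branch-free form: 0 + (-1) + (1 + 0)/2 == -1 */
            printf("%d\n", a/2 + b/2 + (a%2 + b%2)/2);
            /* Truncated-toward-zero reference: (1 + -2)/2 == -1/2 == 0 */
            printf("%d\n", (a + b) / 2);
            return 0;
        }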