Consider integer division:
a = bq + r
where a, b, q, r are respectively the dividend, divisor, quotient, and remainder. In particular, when b = 0, there is no unique q that satisfies the equation for a given a, so it makes sense for the quotient q to be undefined in that case.
However, there is indeed a unique r in that case, namely r = a (since a = 0·q + a holds for any q). Under the premise that the quotient and the remainder are always defined together, it would follow that r is undefined whenever q is undefined, but in programming we often want to use the remainder operation `%` independently of division `/`. I actually came across a situation where I wanted `if b == 0 then a else a % b end`.
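For concreteness, here is a minimal sketch of that behavior as a helper function (Python is assumed here, and the name `safe_mod` is hypothetical):

```python
def safe_mod(a: int, b: int) -> int:
    """Like a % b, but returns the dividend a unchanged when b == 0."""
    return a if b == 0 else a % b


# Usage:
print(safe_mod(7, 3))  # 1
print(safe_mod(7, 0))  # 7, instead of raising ZeroDivisionError
```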
Is there, or was there, an operator in any programming language that behaves like `%` but returns the dividend instead of raising a zero-division error when the divisor is 0?
Is there any reason that most (or all) programming languages raise a zero-division error for `% 0`?
Mathematically, the remainder satisfies 0 <= r <= b - 1, where b is the divisor. Therefore, when b = 0, the range becomes 0 <= r <= -1, which no integer satisfies, so r is undefined.