I've noticed a strange behaviour of the log functions in C++ and numpy when handling complex infinite numbers. Specifically, log(inf + inf * 1j) equals (inf + 0.785398j), when I expect it to be (inf + nan * 1j).
When taking the log of a complex number, the real part of the result is the log of the absolute value of the input and the imaginary part is the phase of the input. Returning 0.785398 as the imaginary part of log(inf + inf * 1j) means it assumes the infs in the real and the imaginary parts have the same magnitude.
This assumption does not seem to be consistent with other calculations, for example inf - inf == nan and inf / inf == nan, which assume that two infs do not necessarily have the same value. Why is the assumption for log(inf + inf * 1j) different?
Reproducing C++ code:
#include <complex>
#include <limits>
#include <iostream>
int main() {
    double inf = std::numeric_limits<double>::infinity();
    std::complex<double> b(inf, inf);      // inf + inf*i
    std::complex<double> c = std::log(b);
    std::cout << c << "\n";                // prints (inf,0.785398)
}
Reproducing Python code (numpy):
import numpy as np
a = complex(float('inf'), float('inf'))
print(np.log(a))  # prints (inf+0.7853981633974483j)
EDIT: Thank you to everyone who got involved in the discussion about the historical and the mathematical reasons. All of you turned this naive question into a really interesting discussion. The provided answers are all of high quality and I wish I could accept more than one answer. However, I've decided to accept @simon's answer as it explains the mathematical reason in more detail and provides a link to the document explaining the logic (although I can't fully understand it).
See Edit 2 at the bottom of the answer for a mathematical motivation (or rather, at least, the reference to one).
The value of 0.785398 (actually pi/4) is consistent with at least some other functions: as you said, the imaginary part of the logarithm of a complex number is identical to the phase angle of the number. This can be reformulated as a question of its own: what is the phase angle of inf + j * inf?
We can calculate the phase angle of a complex number z by atan2(Im(z), Re(z)). With the given number, this boils down to calculating atan2(inf, inf), which is also 0.785398 (or pi/4), both for Numpy and C/C++. So now a similar question could be asked: why is atan2(inf, inf) == 0.785398?
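As a quick check (just an illustrative sketch in Python; it assumes nothing beyond numpy being available), the complex log, the phase angle, and atan2 all report the same pi/4 for the all-infinite input:
import math
import numpy as np

inf = float('inf')

# The imaginary part of the complex log, the phase angle, and atan2
# all agree on pi/4 for the (inf, inf) case.
print(np.log(complex(inf, inf)))    # (inf+0.7853981633974483j)
print(np.angle(complex(inf, inf)))  # 0.7853981633974483
print(math.atan2(inf, inf))         # 0.7853981633974483
print(np.arctan2(inf, inf))         # 0.7853981633974483
print(math.pi / 4)                  # 0.7853981633974483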
I do not have an answer to the latter (except for "the C/C++ specifications say so", as others already answered); I only have a guess: as atan2(y, x) == atan(y / x) for x > 0, probably someone made the decision in this context not to interpret inf / inf as "undefined" but instead as "a very large number divided by the same very large number". The result of this ratio would be 1, and atan(1) == pi/4 by the mathematical definition of atan.
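To make that guess concrete: for every finite t > 0, the point (t, t) lies on the line y == x, so its phase angle is atan(1) == pi/4 no matter how large t gets; the convention atan2(inf, inf) == pi/4 just continues this limit. A minimal sketch (plain Python, purely illustrative):
import math

# For every finite t > 0, atan2(t, t) == atan(1) == pi/4, regardless of t's size.
for t in (1.0, 1e10, 1e300):
    print(math.atan2(t, t))            # 0.7853981633974483 each time

# The infinite case follows the same convention.
print(math.atan2(math.inf, math.inf))  # 0.7853981633974483
print(math.atan(1.0))                  # 0.7853981633974483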
Probably this is not a satisfying answer, but at least I could hopefully show that the log definition in the given edge case is not completely inconsistent with similar edge cases of related function definitions.
Edit: As I said, consistent with some other functions: it is also consistent with np.angle(complex(np.inf, np.inf)) == 0.785398, for example.
Edit 2: Looking at the source code of an actual atan2 implementation brought up the following code comment:
note that the non obvious cases are y and x both infinite or both zero. for more information, see Branch Cuts for Complex Elementary Functions, or Much Ado About Nothing's Sign Bit, by W. Kahan
I dug up the referenced document; you can find a copy here. In Chapter 8 of this reference, called "Complex zeros and infinities", William Kahan (who is both a mathematician and a computer scientist and, according to Wikipedia, the "Father of Floating Point") covers the zero and infinity edge cases of complex numbers and arrives at pi/4 for feeding inf + j * inf into the arg function (arg being the function that calculates the phase angle of a complex number, just like np.angle above). You will find this result on page 17 in the linked PDF. I am not mathematician enough to be able to summarize Kahan's rationale (which is to say: I don't really understand it), but maybe someone else can.