I've installed a fresh Linux system. Its clock was wrong: two days and a few hours slow (a failing CMOS battery, probably). There are also problems with internet connectivity, so I have to edit some files. Being a minimal install, it doesn't come with an editor, though it does come with a compiler, so I'm manually transferring an editor's source code over and then compiling it.
Which took an unnaturally long time.
After a bit of digging through the output: apparently, with the clock offset, nano's configure script warns that its configure.ac is somehow 200,000-odd seconds in the future. (Once I corrected the clock, the whole thing completed within a few minutes.)
Is it because the files' modification times from the other system are newer?
If so, then why does it loop and fork forever, spawning more and more copies of make and configure, instead of printing an error message and quitting? This is about nano, a project made by the same group that made the autotools themselves. Are the autotools really that hard to use, that even their makers run into these kinds of strange bugs?
Not a bug, but one of the bizarre outcomes you get when the system clock is off.
You transferred source files with future timestamps onto a system whose clock was set about two and a half days behind (hence the report that configure.ac is 200,000 (and some odd) seconds into the future).
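You can see the skew directly by comparing the system clock with a transferred file's modification time (the file name here is just the one from your error message):

    # Compare the local clock with a transferred file's mtime.
    date
    ls -l --full-time configure.ac
    # If ls reports a time later than what date prints, make will treat
    # the file as having been modified "in the future".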
As a result, make saw those files as always newer than anything built from them and entered a rebuild loop: it kept rerunning autoconf and automake, because the files those tools regenerate get local timestamps that still compare as older than the transferred ones. The generated Makefile also contains rules to rebuild its own inputs (Makefile.in, configure, config.status, and so on), and GNU make re-executes itself whenever a makefile it has read is remade, so each pass spawns more processes; since the timestamps never catch up, the cycle repeats.
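Here is a minimal sketch of the effect, using made-up file names rather than nano's real build files: a prerequisite with a future mtime keeps its target permanently out of date, and GNU make warns about the clock skew.

    # Create a trivial makefile: 'out' is rebuilt whenever 'in' is newer.
    printf 'out: in\n\tcp in out\n' > Makefile
    touch in out
    # Give 'in' a timestamp two days in the future, as if it came from a
    # machine whose clock is ahead of this one.
    touch -d '2 days' in
    make   # warns that 'in' has a modification time in the future and
           # that clock skew was detected, then rebuilds 'out'
    make   # rebuilds again: the fresh 'out' has a *current* mtime, which
           # still compares as older than 'in'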
This isn't a bug in nano or the autotools, but rather a consequence of how make relies on file modification times to decide what needs rebuilding.
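If you run into this again before the clock can be corrected, one common workaround (a sketch, not anything nano-specific) is to reset the source tree's timestamps to the local clock so make's comparisons make sense again:

    # Touch every file in the (hypothetical) source directory so its mtime
    # matches the local clock, then configure and build from scratch.
    find nano-src/ -exec touch {} +
    cd nano-src/ && ./configure && make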