Consider the following code:
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

struct timespec ts;
uint64_t start_time;
uint64_t stop_time;

if (clock_gettime(CLOCK_REALTIME, &ts) != 0) {
    abort();
}
/* convert to nanoseconds since the epoch */
start_time = ts.tv_sec * UINT64_C(1000000000) + ts.tv_nsec;

/* some computation... */

if (clock_gettime(CLOCK_REALTIME, &ts) != 0) {
    abort();
}
stop_time = ts.tv_sec * UINT64_C(1000000000) + ts.tv_nsec;

/* elapsed time, rounded to the nearest second */
printf("%" PRIu64 "\n", (stop_time - start_time + 500000000) / 1000000000);
In the vast majority of cases, the code works as I expected, i.e., it prints the number of seconds the computation took.
Very rarely, however, an anomaly occurs.
The program reports a number of seconds like 18446743875, 18446743877, 18446743962, etc.
I figured out that these numbers roughly match 2^64 nanoseconds (~584 years).
So I got the suspicion that ts.tv_nsec is sometimes equal to −1.
So my question is: what's wrong with my code? Where and why does the addition of 2^64 nanoseconds happen?
I don't see anything wrong with your code. I suspect your OS is occasionally delivering an anomalous value for CLOCK_REALTIME, although I'm surprised, and I can't quite imagine what it might be. One thing worth noting: since start_time and stop_time are unsigned, if stop_time ever comes out smaller than start_time (i.e., if the clock appears to step backwards between the two calls), the subtraction wraps around modulo 2^64, which yields numbers of exactly the magnitude you're seeing.
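To see the mechanism (though not the cause), here is a minimal, self-contained sketch; the 199-second backwards step is a hypothetical value chosen so that the output reproduces the first number you reported:

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* hypothetical timestamps: the clock appears to step back
       199 seconds between the two readings */
    uint64_t start_time = UINT64_C(1000000000000000000);
    uint64_t stop_time = start_time - UINT64_C(199000000000);

    /* unsigned subtraction wraps modulo 2^64 */
    printf("%" PRIu64 "\n", (stop_time - start_time + 500000000) / 1000000000);
    /* prints 18446743875 */
    return 0;
}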
I suggest rewriting your code like this:
struct timespec start_ts, stop_ts;
uint64_t start_time;
uint64_t stop_time;

if (clock_gettime(CLOCK_REALTIME, &start_ts) != 0) {
    abort();
}
start_time = start_ts.tv_sec * UINT64_C(1000000000) + start_ts.tv_nsec;

/* some computation... */

if (clock_gettime(CLOCK_REALTIME, &stop_ts) != 0) {
    abort();
}
stop_time = stop_ts.tv_sec * UINT64_C(1000000000) + stop_ts.tv_nsec;

uint64_t elapsed = (stop_time - start_time + 500000000) / 1000000000;
printf("%" PRIu64 "\n", elapsed);

/* elapsed is in seconds, so compare against one year in seconds,
   not nanoseconds */
if (elapsed > 365 * UINT64_C(86400)) {
    printf("ANOMALY:\n");
    /* tv_sec is time_t and tv_nsec is long, so don't print them
       with %lu */
    printf("start_ts = %lld %ld\n", (long long)start_ts.tv_sec, start_ts.tv_nsec);
    printf("stop_ts  = %lld %ld\n", (long long)stop_ts.tv_sec, stop_ts.tv_nsec);
}
Then, if/when it happens again, you'll have more information to go on.
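As an aside, if all you need is the elapsed time of the computation rather than wall-clock timestamps, CLOCK_MONOTONIC is generally the better choice, because it isn't subject to the steps and adjustments that CLOCK_REALTIME can undergo. A minimal sketch, assuming your platform provides it (POSIX makes it optional):

struct timespec mono_ts;
uint64_t mono_start, mono_stop;

if (clock_gettime(CLOCK_MONOTONIC, &mono_ts) != 0) {
    abort();
}
mono_start = mono_ts.tv_sec * UINT64_C(1000000000) + mono_ts.tv_nsec;

/* some computation... */

if (clock_gettime(CLOCK_MONOTONIC, &mono_ts) != 0) {
    abort();
}
mono_stop = mono_ts.tv_sec * UINT64_C(1000000000) + mono_ts.tv_nsec;

/* CLOCK_MONOTONIC never steps backwards, so the difference is safe */
printf("%" PRIu64 "\n", (mono_stop - mono_start + 500000000) / 1000000000);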