I would like to convert a time string to seconds since the epoch. Notably, I would not like it to matter where the machine doing the conversion is located; the time zone information in the string should be enough.
I have this test program, pt.cc:
// Feature test macros must be defined before any header is included
// for strptime() to be declared.
#ifndef _XOPEN_SOURCE
#define _XOPEN_SOURCE
#endif
#include <assert.h>
#include <errno.h>
#include <iostream>
#include <stdio.h>
#include <string>
#include <string.h>
#include <time.h>
using namespace std; // To be brief, don't do this in real life.
int main(int argc, char* argv[]) {
  (void)argc; (void)argv;  // Skip compile warning.
  // I expect both of these to transform to 1440671500.
  cout << "1440671500 expected" << endl;
  const char utc_example[] = "2015-08-27T11:31:40+0100";
  struct tm tm;
  memset(&tm, 0, sizeof(struct tm));
  char* end = strptime(utc_example, "%Y-%m-%dT%H:%M:%S%z", &tm);
  assert(end);
  assert(*end == '\0');
  time_t seconds_since_epoch = mktime(&tm);
  cout << "utc example: " << seconds_since_epoch << " or maybe "
       << seconds_since_epoch - tm.tm_gmtoff + (tm.tm_isdst ? 3600 : 0) << endl;
  const char tz_example[] = "2015-08-27T10:31:40Z";
  memset(&tm, 0, sizeof(struct tm));
  end = strptime(tz_example, "%Y-%m-%dT%H:%M:%S%nZ", &tm);
  assert(end);
  assert(*end == '\0');
  seconds_since_epoch = mktime(&tm);
  cout << " tz example: " << seconds_since_epoch << " or maybe "
       << seconds_since_epoch - tm.tm_gmtoff + (tm.tm_isdst ? 3600 : 0) << endl;
  return 0;
}
This is the output:
jeff@birdsong:tmp $ clang++ -ggdb3 -Wall -Wextra -std=c++14 pt.cc -o pt
jeff@birdsong:tmp $ ./pt
1440671500 expected
utc example: 1440671500 or maybe 1440667900
tz example: 1440667900 or maybe 1440664300
jeff@birdsong:tmp $ TZ=America/New_York ./pt
1440671500 expected
utc example: 1440693100 or maybe 1440711100
tz example: 1440689500 or maybe 1440707500
jeff@birdsong:tmp $ TZ=Europe/London ./pt
1440671500 expected
utc example: 1440675100 or maybe 1440675100
tz example: 1440671500 or maybe 1440671500
jeff@birdsong:tmp $
Note how the return value of mktime() changes depending on the ambient time zone. The man page for mktime() says it interprets the broken-down time as local time, so I tried subtracting the GMT offset and compensating for DST in case those fields were being ignored (the "or maybe" values).
Any tips on how to do this correctly? (In case it matters, I only need this to work on Linux.)
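For reference, since Linux-only is acceptable: glibc provides the nonstandard timegm(3), which is like mktime(3) except that it interprets the broken-down time as UTC regardless of the ambient zone; subtracting the tm_gmtoff that strptime() filled in from %z then applies the string's own offset. A minimal sketch of that idea (parse_epoch is a hypothetical helper, and the glibc extensions timegm() and tm_gmtoff are assumed):
#define _GNU_SOURCE  // Declares strptime(), timegm(), and tm_gmtoff in glibc.
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
// Parses a timestamp with a numeric offset and returns seconds since
// the epoch, independent of the TZ environment variable.
static time_t parse_epoch(const char* s) {
  struct tm tm;
  memset(&tm, 0, sizeof tm);
  char* end = strptime(s, "%Y-%m-%dT%H:%M:%S%z", &tm);
  assert(end && *end == '\0');
  // timegm() treats the fields as UTC; removing the offset parsed from
  // the string finishes the job, so the ambient zone never enters.
  return timegm(&tm) - tm.tm_gmtoff;
}
int main(void) {
  printf("%ld\n", (long)parse_epoch("2015-08-27T11:31:40+0100"));  // 1440671500
  return 0;
}
(Whether glibc's %z accepts a bare trailing "Z" varies by version, so the sketch only demonstrates the numeric-offset form.)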
Here's an answer that does what you want using Google's cctz library (https://github.com/google/cctz):
#include <chrono>
#include <iostream>
#include <string>
#include "src/cctz.h"
using namespace std;
int main(int argc, char* argv[]) {
  const char kFmt[] = "%Y-%m-%dT%H:%M:%S%Ez";
  // I expect both of these to transform to 1440671500.
  const char utc_example[] = "2015-08-27T11:31:40+0100";
  const char tz_example[] = "2015-08-27T10:31:40Z";
  cout << "1440671500 expected" << endl;
  // Required by cctz::Parse(). Only used if the formatted
  // time does not include offset info.
  const auto utc = cctz::UTCTimeZone();
  std::chrono::system_clock::time_point tp;
  if (!Parse(kFmt, utc_example, utc, &tp)) return -1;
  cout << "utc example: " << std::chrono::system_clock::to_time_t(tp) << "\n";
  if (!Parse(kFmt, tz_example, utc, &tp)) return -1;
  cout << " tz example: " << std::chrono::system_clock::to_time_t(tp) << "\n";
  return 0;
}
The output is:
1440671500 expected
utc example: 1440671500
tz example: 1440671500
Note that other answers that involve adding/subtracting offsets from, say, a time_t are using a technique called "epoch shifting", and it doesn't actually work. I explain why at 12:30 in this talk from CppCon: https://youtu.be/2rnIHsqABfM?t=12m30s
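For completeness: C++20 standardized this kind of parsing in <chrono>, where parsing into a sys_time applies the %z offset for you, so no third-party library is needed. A sketch, assuming a standard library that implements C++20 chrono parsing (recent MSVC, or GCC 14's libstdc++):
#include <chrono>
#include <iostream>
#include <sstream>
#include <string>
int main() {
  std::chrono::sys_seconds tp;
  std::istringstream in{"2015-08-27T11:31:40+0100"};
  const std::string fmt = "%Y-%m-%dT%H:%M:%S%z";
  // Parsing into a sys_time subtracts the parsed offset to yield UTC,
  // so the machine's time zone plays no part.
  in >> std::chrono::parse(fmt, tp);
  if (!in) return -1;
  std::cout << tp.time_since_epoch().count() << "\n";  // 1440671500
  return 0;
}
Howard Hinnant's date library (https://github.com/HowardHinnant/date) offers essentially the same interface for pre-C++20 compilers.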