I want to calculate a delta time and use it to consume the input duration in a while loop, as in the code block below:
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <iostream>

void _Test_duration_print(float duration)
{
    using chrono_clock = std::chrono::steady_clock;
    //using chrono_clock = std::chrono::high_resolution_clock;

    float timer = 0.0f;
    auto startTime = chrono_clock::now();
    auto now = chrono_clock::now();
    auto lastUpdate = chrono_clock::now();
    auto dt_rcp = 1.0 / 1000000.0;
    uint32_t frame_count = 0;

    while (true)
    {
        now = chrono_clock::now();

        // Delta time since the previous iteration.
        auto delta_duration = now - lastUpdate;
        auto deltaTime = std::chrono::duration_cast<std::chrono::microseconds>(delta_duration).count();
        auto deltaTime_sec = deltaTime * dt_rcp;
        auto deltaTime_sec_2 = delta_duration.count();

        // Total time since the function started.
        auto sinceStartDuration = now - startTime;
        auto timeSinceStart = std::chrono::duration_cast<std::chrono::microseconds>(sinceStartDuration).count();
        auto timeSinceStart_sec = timeSinceStart * dt_rcp;

        timer += deltaTime_sec;

        printf("Timer:[%f]|TimeSinceStart:[%lld]|TimeSinceStart(Second):[%f]DeltaTime=[%lld]|d_deltaTime:[%f]|d_deltaTime_2:[%lld]|\n",
            timer, timeSinceStart, timeSinceStart_sec, deltaTime, deltaTime_sec, deltaTime_sec_2);

        lastUpdate = now;
        frame_count++;

        if (timer < duration) continue;

        printf(">>>>End with timer:[%f]", timer);
        break;
    }
    std::cout << "End Frame Count = " << frame_count << std::endl;
}
if input "duration" was 1.0f, the print out will stop ROUGHLY in 1 second (feels about it)
but, what is unexpected, its that when the printf statement in the loop body removed,
the loop will take much much longer to finish if the same 1.0f duration was feed into the method (feels like 3+ sec on my pc).
question : why the above method would produce wrong duration result with chrono::duration delta time? the "timer" param actually took much longer to reach "duration" when the code took less time to run ?
From what I've seen on Google about how to calculate delta time, chrono duration is what most people use for frame-time related stuff.
Without the print, your loop likely takes very little time per iteration. Because you cast the delta time to whole microseconds, the cumulative rounding error makes your timer take longer to reach the target duration. For example, if an iteration takes 1.8 microseconds, deltaTime
is truncated to 1 microsecond and the loop takes nearly twice as long to finish. If the average iteration time is under a microsecond, the effect is even worse: for most iterations the timer won't advance at all, and it only advances occasionally when an iteration happens to be slower.
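To make that concrete, here is a small standalone sketch (not part of your function) showing how std::chrono::duration_cast truncates toward zero, while a floating-point duration keeps the fractional part:

#include <chrono>
#include <cstdio>
#include <ratio>

int main()
{
    using namespace std::chrono;

    // Pretend one loop iteration took 1.8 microseconds (1800 ns).
    const auto delta = nanoseconds(1800);

    // duration_cast truncates toward zero: 1.8 us becomes 1 us,
    // so almost half of the elapsed time is silently dropped.
    const auto truncated = duration_cast<microseconds>(delta).count();

    // A floating-point duration keeps the fractional microseconds.
    const double exact = duration<double, std::micro>(delta).count();

    printf("truncated: %lld us, exact: %.1f us\n",
           static_cast<long long>(truncated), exact);
    return 0;
}

This prints "truncated: 1 us, exact: 1.8 us", which is the per-iteration error that accumulates in your timer.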
You should base your timer on timeSinceStart instead of accumulating the truncated deltas.
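A minimal sketch of that change, keeping your structure but deriving the timer from the total elapsed time as a floating-point std::chrono::duration instead of summing truncated microsecond deltas (the name _Test_duration_print_fixed is just for illustration):

#include <chrono>
#include <cstdint>
#include <cstdio>
#include <iostream>

void _Test_duration_print_fixed(float duration)
{
    using chrono_clock = std::chrono::steady_clock;
    using seconds_f = std::chrono::duration<float>; // float seconds, no truncation

    const auto startTime = chrono_clock::now();
    uint32_t frame_count = 0;

    while (true)
    {
        const auto now = chrono_clock::now();

        // Derive the timer from the total elapsed time rather than summing
        // per-iteration deltas, so per-iteration rounding cannot accumulate.
        const float timer = seconds_f(now - startTime).count();

        frame_count++;
        if (timer < duration) continue;

        printf(">>>>End with timer:[%f]\n", timer);
        break;
    }
    std::cout << "End Frame Count = " << frame_count << std::endl;
}

If you still need a per-frame delta (e.g. for game updates), keep the subtraction, but store it as a duration (or a duration<float>) and only convert it to a plain number at the point of use, so nothing is truncated along the way.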