Long story short:
I am trying to measure the time between the start and end of an event (a device executing a command), because I want to replay it later as a "Return Home" command.
But the measured time for the same event varies wildly, by up to a factor of 2.
Here is my testing code:
#include <cstdio>
#include <ctime>
#include <iostream>
using namespace std;

int main()
{
    clock_t t = clock();                                // start of the measured region
    for (int i = 0; i < 50; i++) { cout << i << " "; }  // the work being timed
    t = clock() - t;                                    // elapsed clock ticks
    // clock_t has no portable printf format, so cast explicitly
    printf("\nIt took me %ld clicks (%f seconds).\n", (long) t, ((double) t) / CLOCKS_PER_SEC);
    return 0;
}
Does anyone know a better way to get more consistent results?
Context:
Sending commands to a device over Bluetooth, e.g. rotating both motors at speed X while the VK_UP key is pressed, and stopping the motors when VK_UP is released. The idea is to record each command together with the time between key press and key release, and later use those records to build a Return Home function (a rough sketch of the recording scheme follows).
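For illustration, a minimal sketch of such a recording scheme; the CommandRecord struct, the onKeyPressed/onKeyReleased handlers, and the commented-out device calls are hypothetical placeholders, not part of the original post:

#include <chrono>
#include <vector>

// Hypothetical record of one executed command and how long it ran.
struct CommandRecord {
    int command;        // e.g. "rotate both motors at speed X"
    double duration_s;  // time between key press and key release
};

std::vector<CommandRecord> history;
std::chrono::steady_clock::time_point pressTime;

void onKeyPressed(int command)
{
    pressTime = std::chrono::steady_clock::now();
    // sendCommandToDevice(command);  // hypothetical Bluetooth send
}

void onKeyReleased(int command)
{
    double held = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - pressTime).count();
    history.push_back({command, held});  // replay in reverse for "Return Home"
    // stopMotors();                     // hypothetical
}

int main()
{
    onKeyPressed(1);   // simulate a press/release pair
    onKeyReleased(1);
    return 0;
}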
Well, first, consider measuring time using the C++ facilities shown in this question:
Measuring execution time of a function in C++
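For instance, a minimal sketch using std::chrono::steady_clock, a monotonic clock intended for interval measurement:

#include <chrono>
#include <iostream>

int main()
{
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < 50; i++) { std::cout << i << " "; }
    auto end = std::chrono::steady_clock::now();

    // Convert the elapsed duration to microseconds
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(end - start).count();
    std::cout << "\nElapsed: " << us << " us\n";
    return 0;
}

Unlike clock(), steady_clock measures wall-clock intervals and never jumps backwards, which makes it a better fit for timing key press/release spans.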
Then, since the execution time of a piece of code is affected by many circumstances, you should treat it as a random variable of sorts: measure it multiple times and average, or time the sum of many repetitions and divide by the repetition count (see the sketch below).
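A minimal sketch of the averaging approach; the work() function here is a hypothetical stand-in for whatever you are timing:

#include <chrono>
#include <iostream>

// Hypothetical workload standing in for the timed command.
static void work()
{
    volatile long sink = 0;
    for (long i = 0; i < 100000; i++) sink += i;
}

int main()
{
    const int reps = 1000;

    auto start = std::chrono::steady_clock::now();
    for (int r = 0; r < reps; r++) work();  // time the sum of many runs
    auto end = std::chrono::steady_clock::now();

    auto total_us = std::chrono::duration_cast<std::chrono::microseconds>(end - start).count();
    std::cout << "Average per run: " << total_us / (double) reps << " us\n";
    return 0;
}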
Finally, your computer is not spending all of its time executing your program, so you may want to use something like the time command, or a profiler, to figure out how much processor time your program actually got.
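You can also see this effect from inside the program itself by comparing CPU time (std::clock, what your clock()-based code measures) against wall-clock time (std::chrono::steady_clock); on a loaded machine the two can differ noticeably. A sketch, with a hypothetical busy loop as the workload:

#include <chrono>
#include <ctime>
#include <iostream>

int main()
{
    std::clock_t c0 = std::clock();              // CPU time consumed by this process
    auto w0 = std::chrono::steady_clock::now();  // wall-clock time

    volatile long sink = 0;
    for (long i = 0; i < 50000000; i++) sink += i;  // hypothetical workload

    double cpu_s  = (double)(std::clock() - c0) / CLOCKS_PER_SEC;
    double wall_s = std::chrono::duration<double>(
                        std::chrono::steady_clock::now() - w0).count();

    std::cout << "CPU time:  " << cpu_s  << " s\n"
              << "Wall time: " << wall_s << " s\n";
    return 0;
}

If wall time is much larger than CPU time, other processes (or I/O such as console output) were eating into your measurement, which would explain the x2 variance you observed.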