Are these methods a reliable way to measure a script's execution time:
$time = ($_SERVER['REQUEST_TIME_FLOAT'] - $_SERVER['REQUEST_TIME']);
or
$time = (microtime(true) - $_SERVER['REQUEST_TIME_FLOAT']);
Which one should be used, and what is the difference between them? They return very different measurements.
$time = ($_SERVER['REQUEST_TIME_FLOAT'] - $_SERVER['REQUEST_TIME']);
This will never give you the execution time of your PHP script, because both values record the start of the request. The only difference is that $_SERVER['REQUEST_TIME_FLOAT'] is more precise, storing the request start time with microsecond precision, while $_SERVER['REQUEST_TIME'] only stores whole seconds.
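To make this concrete, here is a small sketch with hypothetical values showing that the first expression only ever recovers the sub-second fraction of the request's start time, never the script's runtime:

```php
<?php
// Hypothetical values for illustration: both record the same moment,
// the instant the request began.
$requestTime      = 1700000000;       // like $_SERVER['REQUEST_TIME'] (whole seconds)
$requestTimeFloat = 1700000000.1234;  // like $_SERVER['REQUEST_TIME_FLOAT'] (microseconds)

// The difference is just the fractional part of the start time —
// always less than 1 second, no matter how long the script runs.
$diff = $requestTimeFloat - $requestTime;
printf("%.4f\n", $diff);
```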
$time = (microtime(true) - $_SERVER['REQUEST_TIME_FLOAT']);
This is the expression to use, placed at the end of your PHP script: microtime(true) returns the current Unix timestamp with microsecond precision, so subtracting the request's start time yields the elapsed execution time.
Also keep in mind that $_SERVER['REQUEST_TIME_FLOAT'] has only been available since PHP 5.4.0.
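Putting it together, a minimal sketch of measuring a script's runtime (assuming PHP 5.4.0 or later, so $_SERVER['REQUEST_TIME_FLOAT'] is populated; it is also set when running from the CLI):

```php
<?php
// ... your script's actual work would run here ...

// At the very end of the script: elapsed time since PHP received the request.
$elapsed = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];
printf("Script took %.4f seconds\n", $elapsed);
```

If you only need to time a specific section rather than the whole request, call microtime(true) yourself at the start of that section and subtract it at the end instead.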