Tags: c, debugging, gcc, execution-time

Does the gcc -g debugging flag affect program execution?


I've just been testing a program I'm working on, and I see that it executes 3μs faster (a statistically significant change) when I compile it with -g. This makes no sense to me - I thought that the -g flag wasn't supposed to affect program execution, and that even if it did, it would make the program run slower, not faster.

Can anyone tell me why this is happening? And whether it changes the program's execution flow? I am not compiling with -O because I need it to execute exactly as written, but if -g can somehow make it run faster without changing the instruction order, I should obviously be using that.

So I need to know exactly what changes the -g flag makes to the program.

Edit: The more tests I run, the bigger the t-value gets (= the more statistically significant the difference becomes). This is definitely not measurement error - something is going on.


Solution

  • As others have said, debugging symbols will not change the control flow of your code unless there is an (unlikely) bug in the compiler. (A quick way to check this for yourself is sketched at the end of this answer.)

    It changes execution, though, because the executable becomes bigger and the executed code is spread more widely across more pages. You can expect more cache misses and more paging I/O. In a multi-tasking environment (and even a Linux/busybox system is such a thing) this can result in slightly different scheduling behavior.

    On the other hand, measuring time differences as tiny as the ones you describe is an art in its own right. You are probably in a Heisenberg setting, where your measurements influence execution times. Your measurements may show a statistically significant deviation, but I would be extremely careful about interpreting them as saying that such-and-such option produces faster code. If you want to dig further, repeat the measurement many times under controlled conditions, as in the second sketch below.
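    If you want to convince yourself that -g leaves the generated instructions untouched, one quick check is to build the same translation unit with and without the flag and compare the disassembly. This is a minimal sketch, assuming your source file is called main.c; the file names are placeholders.

        # Build twice; the only difference is -g:
        gcc -O0    -c main.c -o plain.o
        gcc -O0 -g -c main.c -o debug.o

        # Disassemble only the code sections and compare. Apart from the
        # header line that names each file, the output should be identical.
        objdump -d plain.o > plain.asm
        objdump -d debug.o > debug.asm
        diff plain.asm debug.asm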
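    For differences in the microsecond range, a single run tells you very little; repeated runs with a summary of mean and spread are a sounder basis. A minimal sketch, assuming perf is available and using placeholder binary names:

        # Build full executables with and without -g:
        gcc -O0    main.c -o prog_plain
        gcc -O0 -g main.c -o prog_debug

        # Run each binary 100 times; perf prints the mean wall-clock time
        # and its relative standard deviation.
        perf stat -r 100 ./prog_plain
        perf stat -r 100 ./prog_debug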