if-statement, optimization, coding-style

Does if (myBool == true) differ computationally from if (myBool)?


I read a coding style suggestion about comparing bools that said to write

if (myBoolValue == true) { ... }
if (myBoolValue == false) { ... }

instead of writing

if (myBoolValue) { ... }
if (!myBoolValue) { ... }

since it increases the readability of the code even though the statements are equivalent. I am well aware that it is not common coding practice, but I agree that it may increase readability in some cases.

My question is whether there is any difference between the two with regard to optimization of code execution, or whether a (well-implemented) compiler translates them to the same thing.


Solution

  • The two forms are not equivalent in all languages.

    For instance, they may produce different results for some "non-boolean" values of "myBoolValue" in both JavaScript and C.

    // JavaScript
    [] == true         // false: [] coerces to "" and then to 0, while true coerces to 1
    [] ? true : false  // true: any object is truthy

    // C (pre-C99 style, with true/false as plain int macros)
    #define true 1
    #define false 0
    int x = -1;
    x == true          // 0 (false): -1 != 1
    x ? true : false   // 1 (true): any non-zero value is truthy
    

    To see what a specific compiler does for a specific programming language (there is what it is allowed to do, and then what it actually does), check the generated assembly/machine/byte code; a compilable C sketch for doing that follows below.

    (Anyway, I prefer and use the latter form exclusively.)
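
    As a minimal sketch of that check (the file name, function names, and compiler flags are my own illustrative choices, not from the question): run the program to see the int behaviour, or compile with cc -O2 -S demo.c and diff the two functions in demo.s.

    /* demo.c -- illustrative only; assumes a C99-or-later compiler. */
    #include <stdio.h>
    #include <stdbool.h>

    /* With a genuine bool, a decent optimizing compiler typically emits
       identical code for both forms -- compare them in the assembly. */
    int with_comparison(bool b)    { return (b == true) ? 1 : 0; }
    int without_comparison(bool b) { return b ? 1 : 0; }

    int main(void) {
        int x = -1;
        printf("%d\n", x == true);        /* 0: -1 != 1 */
        printf("%d\n", x ? true : false); /* 1: any non-zero int is truthy */
        return 0;
    }

    With optimization enabled, the two bool functions usually compile to the same instructions, while the int lines in main show where the explicit == true comparison can change behaviour.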