demorgans-law

When should I apply De Morgan's laws in programming?


I think any programmer who has taken an intro to programming class has had to memorize De Morgan's laws.

In case you don't know what they are, here is the gist:

!(A && B) = !A || !B
!(A || B) = !A && !B
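
They're easy to verify by brute force, since there are only four combinations of truth values. A quick PHP sketch of my own, just to show the two sides really do agree:

<?php
// Check both laws for every combination of boolean inputs.
foreach ([false, true] as $a) {
    foreach ([false, true] as $b) {
        assert( !($a && $b) === (!$a || !$b) ); // !(A && B) = !A || !B
        assert( !($a || $b) === (!$a && !$b) ); // !(A || B) = !A && !B
    }
}
echo "Both laws hold for all four input combinations.\n";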

I was under the assumption that if I had to memorize them, I would find them applicable in a programming situation, but I haven't had to use them at all.

Is there any reason why I should use them in programming? Do they make the program faster, or do they make conditions easier to read?


Solution

  • Keeping code readable is a big reason, and I'm sure whoever works on your code after you will agree it's an important one. And you'll save a CPU cycle or two if you only invert (!) one value instead of two.
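
    For example (variable names here are just for illustration):

    if( !($logged_in && $has_permission) ) { echo 'Access denied'; }

    // the same condition after applying De Morgan's laws:
    if( !$logged_in || !$has_permission ) { echo 'Access denied'; }

    The second form reads aloud naturally ("not logged in, or no permission"), while the first only inverts once; De Morgan's laws let you flip between the two and pick whichever suits.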

    Short-circuiting is also worth understanding here. When most languages see !A || !B, they stop evaluating as soon as !A is true: the OR is already decided, so !B is never even looked at.

    If A and B are function calls instead of variables, and you want both to execute, you have a problem:

    if( !save_record() || !save_another_record() ) { echo 'Something bad happened'; }

    If save_record() fails, save_another_record() is never called. It's tempting to think De Morgan's laws fix this by replacing the OR with an AND:

    if( !( save_record() && save_another_record() ) ) { echo 'Something bad happened'; }

    They don't: && short-circuits too, and in exactly the same case (save_record() returning false), so both forms call exactly the same functions. That equivalence is the real takeaway: De Morgan's laws preserve short-circuit behavior, so you can apply them to clean up a condition without changing which side effects run.
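
    If you genuinely need both saves to run regardless of each other, no rearrangement of the condition will do it; call the functions first and test afterwards. A minimal sketch, reusing the function names from above:

    $saved       = save_record();
    $saved_other = save_another_record(); // runs even if the first save failed
    if( !($saved && $saved_other) ) { echo 'Something bad happened'; }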