Tags: optimization, gpu, performance, cpu-speed, moores-law

Will optimizing code become unnecessary?


If Moore's Law holds true and CPUs/GPUs keep getting faster, will software (and, by association, you software developers) still push the boundaries hard enough that you need to optimize your code? Or will a naive solution (say, a recursive factorial) be good enough?
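For concreteness, "naive" here might mean something like the plain recursive factorial below, contrasted with an iterative version. This is only an illustrative sketch; the function names are mine, not from any particular codebase:

    # Naive: straightforward recursion. Correct, but pays one function call
    # (and one stack frame) per multiplication.
    def factorial_naive(n: int) -> int:
        return 1 if n <= 1 else n * factorial_naive(n - 1)

    # A modestly optimized version: an iterative loop with constant stack
    # depth and no per-step call overhead.
    def factorial_iter(n: int) -> int:
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result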


Solution

  • Poor code can always overwhelm CPU speed.

    For an excellent example, go to this Coding Horror column and scroll down to the section describing the book Programming Pearls. Reproduced there is a graph showing how, for one problem, a TRS-80 with a 4.77 MHz 8-bit processor running a linear-time algorithm can beat a 32-bit Alpha chip running a cubic-time algorithm once the input grows large enough. (A back-of-the-envelope version of that crossover appears after this list.)

    [Graph: TRS-80 vs. Alpha (source: typepad.com)]

    The current trend in speedups is to add more cores, because making individual cores go faster is hard. So aggregate throughput goes up, but serial (single-threaded) tasks don't always benefit; Amdahl's law, sketched after this list, puts a hard ceiling on what extra cores can buy.

    The saying "there is no problem that brute force and ignorance cannot overcome" is not always true.
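    To make the graph's point concrete, here is a minimal back-of-the-envelope sketch in Python. The 4.77 MHz figure is from the text and the linear-vs-cubic split matches Bentley's comparison, but the Alpha clock speed and the one-operation-per-cycle cost are simplifying assumptions, not his measured constants:

        # Crossover point: a 4.77 MHz machine running an O(n) algorithm
        # vs. a notional 500 MHz machine running an O(n^3) algorithm.
        # Assumes one abstract operation per clock cycle on each machine.
        SLOW_HZ = 4.77e6   # TRS-80-class clock
        FAST_HZ = 500e6    # Alpha-class clock (assumed, for illustration)

        def t_linear(n: int) -> float:
            """Seconds for the O(n) algorithm on the slow machine."""
            return n / SLOW_HZ

        def t_cubic(n: int) -> float:
            """Seconds for the O(n^3) algorithm on the fast machine."""
            return n ** 3 / FAST_HZ

        # Find the first input size where the slow machine wins.
        n = 1
        while t_linear(n) >= t_cubic(n):
            n += 1
        print(f"slow machine wins for every n >= {n}")  # prints 11 here

    With realistic constant factors (interpreted BASIC against compiled C), the reported crossover is in the thousands rather than at 11, but the shape is the same: past the crossover, no amount of clock speed saves the worse algorithm.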
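    On the cores point: Amdahl's law gives the ceiling. If a fraction p of a task can be parallelized, n cores yield a speedup of 1 / ((1 - p) + p / n), which can never exceed 1 / (1 - p) no matter how many cores you add. A quick sketch, with p = 0.9 as a made-up illustrative value:

        # Amdahl's law: overall speedup on n cores when only a fraction p
        # of the work parallelizes; the serial remainder still runs at 1x.
        def amdahl_speedup(p: float, n: int) -> float:
            return 1.0 / ((1.0 - p) + p / n)

        # A task that is 90% parallelizable (illustrative value).
        for cores in (1, 2, 4, 8, 64, 1024):
            print(f"{cores:5d} cores -> {amdahl_speedup(0.9, cores):5.2f}x")
        # Even 1024 cores top out below 1 / (1 - 0.9) = 10x.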