I seem to notice two schools of thought emerging regarding optimization.
On the face of it, they seem to be fairly opposed viewpoints. The thing is, I see merit in both, and I can think of times when each way of thinking has helped me write better and faster software.
Is there any way to reconcile these two ideas? Is there a middle ground? Is there a time when one idea is the best tool for the job? Or am I presenting a false dichotomy, and can both views co-exist peacefully?
What I usually do is apply those optimizations that cost me nothing (or almost nothing). I also stay on the lookout for algorithms that don't scale well and are called very often. Other than that, I do not optimize until the software runs and I get a chance to fire up the profiler. Only then will I invest some serious time in optimization.
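To make that workflow concrete, here is a minimal sketch using Python's built-in cProfile module. The functions `find_duplicates` and `build_report` are hypothetical stand-ins for a real hot path; the point is simply that the profiler, not intuition, tells you where serious optimization effort pays off.

```python
# A minimal sketch of the "profile first, then optimize" workflow.
# find_duplicates and build_report are hypothetical placeholders.
import cProfile
import pstats
import random

def find_duplicates(items):
    # Deliberately naive O(n^2) scan -- exactly the kind of algorithm
    # that "doesn't scale well and is called very often".
    dupes = []
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            if a == b and a not in dupes:
                dupes.append(a)
    return dupes

def build_report(n=2000):
    data = [random.randrange(n // 2) for _ in range(n)]
    return find_duplicates(data)

if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    build_report()
    profiler.disable()
    # Sort by cumulative time so the worst offender floats to the top;
    # only now do we know which function deserves serious optimization.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

Once the profiler confirms that something like `find_duplicates` dominates the runtime, a targeted fix (here, a single set-based pass instead of nested loops) is worth the effort; spending that time before measuring would have been a guess.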