I've been struggling with an application I'm writing, and I think I'm beginning to see that my problem is premature optimization. The perfectionist side of me wants to make everything optimal and perfect the first time through, but I'm finding this is complicating the design quite a bit. Instead of writing small, testable functions that each do one simple thing well, I'm leaning towards cramming as much functionality as possible into each one in order to be more efficient.
For example, I'm avoiding multiple trips to the database for the same piece of information, at the cost of my code becoming more complex. One part of me wants to just stop worrying about redundant database calls: it would make the code easier to get right, and the amount of data being fetched is small anyway. The other part of me feels dirty doing that. :-)
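To make the tradeoff concrete, here's a toy sketch of what I mean; the schema, data, and function names are all made up, using an in-memory SQLite database:

```python
import sqlite3

# Toy stand-in for the real database; the schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Ada', 'ada@example.com')")

# Simple version: two small, testable functions, but two trips to the database.
def get_name(user_id):
    return conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()[0]

def get_email(user_id):
    return conn.execute("SELECT email FROM users WHERE id = ?", (user_id,)).fetchone()[0]

# "Efficient" version: one trip, but now every caller has to accept and
# pass around the whole row, which is what's complicating my design.
def get_user(user_id):
    return conn.execute("SELECT name, email FROM users WHERE id = ?", (user_id,)).fetchone()
```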
I'm leaning towards just going to the database multiple times, which I think is the right move here. It's more important that I finish the project, and I feel like I'm getting hung up on optimizations like this one. My question is: is this the right strategy for avoiding premature optimization?
This is the right strategy in general. Get the code working first, thoroughly covered by automated tests.
You can then run those automated tests while the program is under the control of a profiler, to find out where the program actually spends its time and memory. That will show you where to optimize.
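In Python, for instance, the standard-library cProfile module can wrap the test run directly. Here's a minimal sketch, where the toy unittest case stands in for your real suite:

```python
import cProfile
import pstats
import unittest

# Toy test case standing in for your real suite.
class ToyTests(unittest.TestCase):
    def test_sum_of_squares(self):
        self.assertEqual(sum(i * i for i in range(1000)), 332833500)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ToyTests)

profiler = cProfile.Profile()
profiler.enable()
unittest.TextTestRunner().run(suite)   # run the tests under the profiler
profiler.disable()

# Print the ten most expensive functions by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```

Sorting by cumulative time points straight at the expensive call paths, which is usually a much shorter list than your instincts predict.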
And profiling will show you how to optimize working code, not code that may or may not work once it's all put together.
You don't want code that fails optimally.
The quote I was trying to remember is from Mich Ravera:
"If it doesn't work, it doesn't matter how fast it doesn't work."