The lure of local optimizations

Just because you've found the best burger in town doesn't mean you've found the best burger in the world.
Once upon a time, I was working on some software that did some heavy PostgreSQL queries. I spent several months optimizing these queries to reduce the execution time from 10-20 minutes to a few seconds.
I employed data partitioning.
I employed new indexes.
I employed materialized views.
I read books and blog posts, attended conferences, and asked Stack Overflow questions, all to learn how to optimize the data and queries.
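As a rough illustration of what those three techniques look like in PostgreSQL (the schema, table, and column names here are hypothetical, not from the project described):

```sql
-- Hypothetical events table, partitioned by month so queries and
-- maintenance only touch the partitions they need.
CREATE TABLE events (
    id         bigint GENERATED ALWAYS AS IDENTITY,
    account_id bigint NOT NULL,
    created_at timestamptz NOT NULL,
    payload    jsonb
) PARTITION BY RANGE (created_at);

CREATE TABLE events_2024_01 PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- An index targeted at the most common filter pattern.
CREATE INDEX events_account_created_idx
    ON events (account_id, created_at);

-- A materialized view that precomputes an expensive aggregate,
-- turning a slow GROUP BY into a fast lookup at read time.
CREATE MATERIALIZED VIEW daily_event_counts AS
SELECT account_id,
       date_trunc('day', created_at) AS day,
       count(*) AS n
FROM events
GROUP BY account_id, date_trunc('day', created_at);

-- Refreshed periodically, e.g. from a scheduled job:
REFRESH MATERIALIZED VIEW daily_event_counts;
```

Each of these trades something (write overhead, storage, staleness) for read performance, which is exactly the kind of careful local tuning the rest of this post is about.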
It could have been tempting to think that I was approaching the maximum performance for these queries in our application. But it would have been a trap.
I see many people fall into the same trap. They spend months, or even years, on local optimizations. I've seen it with technical problems, such as my database optimization problem. I've seen it with organizational problems, such as how to optimally organize teams. I've seen it with how to produce high-quality software quickly and reliably. We often become so focused on what is ultimately a local optimization that we don't even notice that others around us may be way ahead of us on the same problem.
Just because you've found the best burger in town doesn't mean you've found the best burger in the world.
In my case, I knew I wasn't even close to maximum performance. I knew this despite having absolutely no idea how to improve performance any further. How did I know? Because Google was able to run much more sophisticated queries, on far larger data sets, in a fraction of the time.
It always gets worse before it gets better
Whenever you introduce a new quality control measure, things often seem to get worse before they get better.