There is something compelling about knowing that you've figured something out; not just for the sake of solving it, but understanding it to the point of not doubting yourself.
Complexity notation is one of those things: it tells you how long something will take in proportion to how much of it there is.
This is useful because it lets you gauge the most work (the worst-case scenario) a task will ever demand of you.
Complexity can be of any order, but here is what you're generally up against for a problem where n is your input size.
O(1): Order 1: This is the holy grail. Regardless of how much work you have, it will always take you the same amount of time to complete.
O(n): Order n: This is the most common case, and it's linear: the amount of time it takes you to do your work is proportional to the amount of work on your plate.
O(n²): Order n squared: This is where shit may well and truly have hit the fan; the time it takes you to do your work grows faster than the rate at which it lands on your plate. This might be manageable for a small amount of work, but it is going to hurt if you can't knock it down quickly.
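To make those growth rates concrete, here's a small Python sketch. The functions and their step counts are my own illustration (not any real workload): each one counts how many "steps" its class takes as the pile of work grows.

```python
def constant_time(items):
    # O(1): grab the first item; one step no matter how big the pile is
    return 1 if items else 0

def linear_time(items):
    # O(n): look at every item exactly once
    return len(items)

def quadratic_time(items):
    # O(n^2): compare every item against every other item
    steps = 0
    for a in items:
        for b in items:
            steps += 1
    return steps

for n in (10, 100, 1000):
    pile = list(range(n))
    print(n, constant_time(pile), linear_time(pile), quadratic_time(pile))
```

Going from n = 10 to n = 1000, the O(1) count stays at 1, the O(n) count grows a hundredfold, and the O(n²) count grows ten-thousandfold; that last column is the fan getting hit.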
Big O notation is usually used for describing algorithmic optimisations, but I think its applications extend far beyond any one discipline. In one way or another it helps my fragile little mind know which fights to pick and which to flee.