How do O and Ω relate to worst and best case?

Question Detail: 

Today in lecture we discussed a very simple algorithm for finding an element in a sorted array: binary search. We were asked to determine its asymptotic complexity for an array of $n$ elements.

My idea was that it is obviously $O(\log n)$, or $O(\log_2 n)$ to be more specific, because $\log_2 n$ is the number of operations in the worst case. But I can do better: for example, if I hit the searched element on the first try, then the lower bound is $\Omega(1)$.
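For concreteness, here is a minimal binary search sketch in Python (the function name and the probe counter are my own illustration, not something from the lecture); it returns the found index together with the number of probes made, so the best case (one probe) and the worst case (roughly $\log_2 n$ probes) are easy to observe:

    def binary_search(arr, target):
        """Return (index, probes); index is -1 if target is absent."""
        lo, hi = 0, len(arr) - 1
        probes = 0
        while lo <= hi:
            probes += 1
            mid = (lo + hi) // 2
            if arr[mid] == target:
                return mid, probes   # best case: the very first probe hits
            elif arr[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, probes            # worst case: roughly log2(n) probes

    # Searching for the middle element of list(range(1024)) takes one probe;
    # searching for an absent value takes about log2(1024) = 10 probes.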

The lecturer presented the solution as $\Theta(\log n)$, since we usually consider only worst-case inputs for algorithms.

But when considering only worst cases, what's the point of having $O$- and $\Omega$-notation, when all worst-case inputs of the given problem have the same complexity? ($\Theta$ would be all we need, right?)

What am I missing here?

Asked By : Smajl
Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/23068

Answered By : Raphael

Landau notation denotes asymptotic bounds on functions. See here for an explanation of the differences among $O$, $\Omega$ and $\Theta$.

Worst-, best-, average-, or you-name-it-case time each describe a distinct runtime function: one maps each $n$ to the highest runtime on any input of size $n$, another to the lowest, and so on.
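In symbols: writing $T(x)$ for the runtime on input $x$ (the notation here is mine, just for illustration), the worst- and best-case runtime functions are

$$T_{\mathrm{worst}}(n) = \max_{|x| = n} T(x), \qquad T_{\mathrm{best}}(n) = \min_{|x| = n} T(x).$$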

Per se, the two have nothing to do with each other; the definitions are independent. Now we can go ahead and formulate asymptotic bounds on runtime functions: upper ($O$), lower ($\Omega$) or both ($\Theta$). We can do any of these for the worst case, the best case, or any other case.

For instance, binary search has a best-case runtime of $\Theta(1)$ and a worst-case runtime of $\Theta(\log n)$.
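Spelling the worst-case bound out shows why $O$ and $\Omega$ are still both needed: each $\Theta$-statement is exactly the conjunction of an upper and a lower bound on the same runtime function,

$$T_{\mathrm{worst}}(n) \in \Theta(\log n) \iff T_{\mathrm{worst}}(n) \in O(\log n) \ \text{and}\ T_{\mathrm{worst}}(n) \in \Omega(\log n).$$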

