Selection Sort runtime in terms of Big O

Question Detail: 

I'm trying to understand why the sorting algorithm Selection Sort has asymptotic runtime in $O(n^2)$.

Looking at the math, the runtime is

$\qquad T(n) = (n-1) + (n-2) + \dots + 2 + 1$.

And this is stated to be equal to

$\qquad O(n^2)$.

However, I just don't understand the intuition. I have run several practical experiments for $n = 10$ up to $n = 5000$, and they all indicate that the number of steps for, e.g., $n = 5000$ can never be greater than T(5000) = 12,497,500 -- not T(5000) = 5000^2 = 25,000,000.

Now, I know that 5000 is not the same as infinity, but I just don't understand the intuition behind

$\qquad (n-1) + (n-2) + \dots + 2 + 1 = O(n^2)$.

Does someone have a great pedagogical explanation that my dim-witted mind can understand?

Asked By : Pætur Magnussen
Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/19094

Answered By : David Richerby

Remember what $O(n^2)$ means: it means that there is a constant $c$ such that, for large enough $n$, the number of steps is at most $cn^2$. The constant doesn't have to be $1$; in this case, it looks like the constant is about $\tfrac12$.
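
As a quick sanity check that $c = \tfrac12$ suffices: the next paragraph shows the sum is exactly $\tfrac12(n^2 - n)$, and for every $n \ge 1$,

$$\tfrac12(n^2 - n) \le \tfrac12 n^2,$$

so the number of steps is indeed at most $\tfrac12 n^2$.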

Why is $(n-1) + (n-2) + \dots + 2 + 1$ equal to $O(n^2)$? Because you're adding up $n-1$ numbers and the average of those numbers is half-way between the first one and the last one, i.e., $\tfrac12(1 + (n-1))$. So the total is

$$(n-1)\times \tfrac12(1 + (n-1)) = \tfrac12(n^2-n).$$

Plug in $n=5000$ and you get... *drumroll*... $12\,497\,500$.
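
If you want to see this concretely, here is a minimal Python sketch (my own illustration, not from the original answer) that counts the comparisons a plain selection sort performs and checks the count against $\tfrac12(n^2-n)$:

    import random

    def selection_sort_comparisons(values):
        # Selection-sort a copy of the input and count the comparisons made.
        a = list(values)
        comparisons = 0
        n = len(a)
        for i in range(n - 1):
            min_index = i
            for j in range(i + 1, n):
                comparisons += 1  # one comparison per inner-loop step
                if a[j] < a[min_index]:
                    min_index = j
            a[i], a[min_index] = a[min_index], a[i]
        return comparisons

    for n in (10, 100, 5000):
        data = [random.random() for _ in range(n)]
        print(n, selection_sort_comparisons(data), n * (n - 1) // 2)
        # The two counts agree: (n-1) + (n-2) + ... + 1 = n(n-1)/2;
        # for n = 5000 both are 12497500, roughly half of 5000^2.

The comparison count is the same regardless of the input order, because the inner loop always runs to the end; the swaps only add a linear number of extra steps, which doesn't change the $O(n^2)$ bound.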
