Saturday, September 14, 2013

Understanding recurrence for running time

I'm doing the exercises in Introduction to Algorithms by CLRS. This is not
graded homework or anything; I'm just trying to understand the problem.
The problem is as follows:
We can express insertion sort as a recursive procedure as follows. In
order to sort A[1..n], we recursively sort A[1..n-1] and then insert A[n]
into the sorted array A[1..n-1]. Write a recurrence for the running time
of this recursive version of insertion sort.
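Here is a minimal sketch of how I read the recursive procedure. This is my own Python, not from the book; the helper name insert_into_sorted_prefix is mine, and it's 0-indexed instead of the book's 1-indexed A[1..n]:

def recursive_insertion_sort(A, n=None):
    # Sort A[0..n-1] in place. 0-indexed, unlike the book's A[1..n].
    if n is None:
        n = len(A)
    if n <= 1:
        return                              # one (or zero) elements: already sorted, the O(1) case
    recursive_insertion_sort(A, n - 1)      # the T(n-1) term: recursively sort the first n-1 elements
    insert_into_sorted_prefix(A, n - 1)     # the O(n) term: insert A[n-1] into the sorted A[0..n-2]

def insert_into_sorted_prefix(A, i):
    # Insert A[i] into the already-sorted A[0..i-1], shifting larger elements one slot right.
    key = A[i]
    j = i - 1
    while j >= 0 and A[j] > key:
        A[j + 1] = A[j]                     # worst case: every one of the i elements gets shifted
        j -= 1
    A[j + 1] = key

For example, recursive_insertion_sort([5, 2, 4, 6, 1, 3]) leaves the list as [1, 2, 3, 4, 5, 6].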
The solution to the problem:
Since it takes O(n) time in the worst case to insert A[n] into the sorted
array A[1..n-1], we get the recurrence

T(n) = O(1)             if n = 1
T(n) = T(n-1) + O(n)    if n > 1

The solution to this recurrence is T(n) = O(n^2).
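If I write the O(n) term as c*n for some constant c and unroll the recurrence, I think the bound comes out like this:

T(n) = T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = ...
     = T(1) + c(2 + 3 + ... + n)
     = O(1) + c(n(n+1)/2 - 1)
     = O(n^2)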
So I get that if n = 1, the array is already sorted, so it takes O(1)
time. But the second part of the recurrence, T(n-1) + O(n), is where I get
stuck. The O(n) term I think I understand: it's the step where we insert
the element into the sorted subarray, which takes O(n) time in the worst
case, when the new element has to be compared against every element of
A[1..n-1].
What I'm having trouble with is the recursive part, T(n-1). Does T(n-1)
mean that each recursive call sorts one fewer element of the array? That
doesn't seem right.
Thanks! I'm just trying to wrap my brain around this.
