Complexity analysis: Represent the time complexity of the following recursive algorithm, T(n), as a recurrence equation
Question
- Complexity analysis: Represent the time complexity of the following recursive algorithm, T(n), as a recurrence equation:

    int mult_7(int n) {
        if (n == 1) return 7;
        if (n > 1) return (7 + mult_7(n - 1));
    }

- Complexity analysis: Analyze the time complexity of the Top-Down implementation of the MergeSort algorithm on the following Wikipedia page:
http://en.wikipedia.org/wiki/Merge_sort
You are required to represent the time complexity of TopDownMerge() as a polynomial function of the input size. Then represent the time complexity of TopDownSplitMerge(A[], iBegin, iEnd, B[]) as a recurrence equation. You don't need to solve this equation.
Explanation / Answer
Analysing recursive functions (or even evaluating them) is a nontrivial task. A good introduction (in my opinion) can be found in Donald Knuth's Concrete Mathematics.
However, let's analyse these examples now:
We define a function that measures the time the algorithm needs. Let's say that t(n) denotes the time needed by the call mult_7(n), i.e. a function of n.
Then we can conclude that t(1) = c, because if we call mult_7(1), we only have to check whether (n == 1) and then return 7, which can be done in constant time (hence the constant c).
Now we consider the other case: n > 1. Here we obtain t(n) = d + t(n-1). That's because we again have to check n == 1 (and n > 1), compute mult_7(n-1), which costs t(n-1), and add 7 to the result. The checks and the addition can be done in constant time (constant d); the recursive call needs t(n-1).
Now we can "expand" the term t(n):
So, how long does it take until we reach t(1)? Since we start at t(n) and subtract 1 in each step, it takes n-1 steps to reach t(n-(n-1)) = t(1). In other words, we collect the constant d exactly n-1 times, and t(1) evaluates to c.
So we obtain t(n) = (n-1) * d + c, which is in O(n).
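As a quick sanity check of that formula (not part of the original question), here is a hypothetical instrumented copy of mult_7 that counts the additions it performs; for every n >= 1 the counter should read n-1, matching the (n-1)*d term:

    #include <stdio.h>

    /* "steps" is a hypothetical counter added only for illustration: it counts
       the additions performed, which should equal n-1 for every n >= 1,
       matching t(n) = (n-1)*d + c. */
    static int steps = 0;

    int mult_7_counted(int n) {
        if (n == 1) return 7;               /* base case: constant work (c) */
        steps++;                            /* one addition per recursion level (d) */
        return 7 + mult_7_counted(n - 1);
    }

    int main(void) {
        for (int n = 1; n <= 5; n++) {
            steps = 0;
            int r = mult_7_counted(n);
            printf("n=%d  result=%d  additions=%d\n", n, r, steps);
        }
        return 0;
    }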
Now for MergeSort: at each level of the recursion you split the array into two sublists and recursively invoke the algorithm on each. In the best case you split exactly in half, so each recursive call works on half of the original problem. You need log_2(n) levels, and each level costs O(n) in total (the merges on one level together process all sublists, whose total size is still n), so in total you get O(n log n).
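Reading the requested quantities off that structure: TopDownMerge() makes a single pass over the range it merges, so its cost is a degree-1 polynomial, roughly c*n for a range of size n, and TopDownSplitMerge() therefore satisfies the recurrence T(n) = 2*T(n/2) + c*n (plus a constant for the size check and the split). The sketch below is an illustrative C translation of that structure, reusing the Wikipedia pseudocode's names; it is a simplified version (it copies the scratch array back instead of alternating buffers as the article does), not the article's exact code:

    #include <stdio.h>

    /* Merge the sorted runs A[iBegin..iMiddle) and A[iMiddle..iEnd) using B as
       scratch space. One pass over the range: cost is c*(iEnd - iBegin). */
    void TopDownMerge(int A[], int iBegin, int iMiddle, int iEnd, int B[]) {
        int i = iBegin, j = iMiddle;
        for (int k = iBegin; k < iEnd; k++) {
            if (i < iMiddle && (j >= iEnd || A[i] <= A[j]))
                B[k] = A[i++];
            else
                B[k] = A[j++];
        }
        for (int k = iBegin; k < iEnd; k++)     /* copy the merged run back into A */
            A[k] = B[k];
    }

    /* Sort A[iBegin..iEnd).  Cost: T(n) = 2*T(n/2) + c*n for n = iEnd - iBegin. */
    void TopDownSplitMerge(int A[], int iBegin, int iEnd, int B[]) {
        if (iEnd - iBegin < 2) return;              /* a run of size 1 is sorted  */
        int iMiddle = iBegin + (iEnd - iBegin) / 2; /* split in half...           */
        TopDownSplitMerge(A, iBegin, iMiddle, B);   /* ...costs T(n/2)            */
        TopDownSplitMerge(A, iMiddle, iEnd, B);     /* ...costs T(n/2)            */
        TopDownMerge(A, iBegin, iMiddle, iEnd, B);  /* ...plus c*n for the merge  */
    }

    int main(void) {
        int A[] = {5, 2, 9, 1, 7, 3};
        int B[6];
        TopDownSplitMerge(A, 0, 6, B);
        for (int k = 0; k < 6; k++) printf("%d ", A[k]);
        printf("\n");
        return 0;
    }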
However, with a simple preprocessing step that checks whether the list is already sorted, the best case can be reduced to O(n).
Since checking whether a list is sorted is itself O(n), this cannot be done in O(1). Note that the "best case" means the best case for general n, not for a specific input size.
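A minimal sketch of such a preprocessing check (an assumed helper added for illustration, not part of the Wikipedia implementation): it is a single O(n) scan, and if it returns true the sort can stop immediately, which is where the O(n) best case comes from.

    #include <stdbool.h>

    /* Hypothetical preprocessing step: one O(n) pass that reports whether
       A[0..n) is already in non-decreasing order. */
    bool already_sorted(const int A[], int n) {
        for (int i = 1; i < n; i++)
            if (A[i - 1] > A[i])
                return false;            /* found an inversion: not sorted */
        return true;
    }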
How about the time complexity of bottom-up merge sort in the worst case, best case, and average case?
The same approach (the simple preprocessing check) can give you an O(n) best case for bottom-up merge sort as well. Without that preprocessing, the worst case and the best case of bottom-up merge sort are both O(n log n), since in this approach the list is always divided into two lists of equal length (up to a difference of 1).
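A rough sketch of the bottom-up control structure (again an illustrative version, not the article's exact code): the width of the merged runs doubles on every pass, so there are about log_2(n) passes, and each pass does O(n) merge work over the whole array, which is why the bound is O(n log n) regardless of the input order.

    /* Illustrative bottom-up merge sort: runs of width 1, 2, 4, ... are merged
       pairwise. Roughly log2(n) passes, each doing O(n) work, hence O(n log n). */
    void BottomUpMergeSort(int A[], int B[], int n) {
        for (int width = 1; width < n; width *= 2) {         /* ~log2(n) passes  */
            for (int i = 0; i < n; i += 2 * width) {         /* each pass: O(n)  */
                int mid = (i + width < n) ? i + width : n;
                int end = (i + 2 * width < n) ? i + 2 * width : n;
                int l = i, r = mid;
                /* merge A[i..mid) and A[mid..end) into B[i..end) */
                for (int k = i; k < end; k++)
                    B[k] = (l < mid && (r >= end || A[l] <= A[r])) ? A[l++] : A[r++];
            }
            for (int k = 0; k < n; k++)                      /* copy back for the next pass */
                A[k] = B[k];
        }
    }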