Academic Integrity: tutoring, explanations, and feedback — we don’t complete graded work or submit on a student’s behalf.


ID: 3889581 • Letter: W

Question

With a sly wink, Dumbledore says his real goal was actually to calculate and return the largest value in the matrix B, that is, the largest subarray sum in A. Butting in, Professor Hagrid claims to know a fast divide and conquer algorithm for this problem that takes only O(n log n) time (compared to applying a linear search to the B matrix, which would take O(n^2) time).

Hagrid says his algorithm works like this:

• Divide the array A into left and right halves

• Recursively find the largest subarray sum for the left half

• Recursively find the largest subarray sum for the right half

• Find the largest subarray sum for a subarray that spans between the left and right halves

• Return the largest of these three answers

Following is pseudocode for his plan:

hagridSolve(A) {
    if (A.length() == 0) { return 0 }
    return hagHelp(A, 1, A.length())
}

hagHelp(A, s, t) {
    if (s > t) { return 0 }
    if (s == t) { return max(0, A[s]) }
    m = (s + t) / 2
    leftMax = sum = 0
    for (i = m; i >= s; i--) {
        sum += A[i]
        if (sum > leftMax) { leftMax = sum }
    }
    rightMax = sum = 0
    for (i = m + 1; i <= t; i++) {
        sum += A[i]
        if (sum > rightMax) { rightMax = sum }
    }
    spanMax = leftMax + rightMax
    halfMax = max(hagHelp(A, s, m), hagHelp(A, m+1, t))
    return max(spanMax, halfMax)
}
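As a sanity check on the plan, here is a runnable Python sketch of the pseudocode (names follow Hagrid's, but 0-indexed; this is an illustration, not part of the original question), compared against brute force:

```python
def hagrid_solve(A):
    """Largest subarray sum in A, allowing the empty subarray (so never negative)."""
    if len(A) == 0:
        return 0
    return hag_help(A, 0, len(A) - 1)

def hag_help(A, s, t):
    if s > t:
        return 0
    if s == t:
        return max(0, A[s])
    m = (s + t) // 2
    # Best (possibly empty) subarray sum ending at index m.
    left_max = total = 0
    for i in range(m, s - 1, -1):
        total += A[i]
        left_max = max(left_max, total)
    # Best (possibly empty) subarray sum starting at index m + 1.
    right_max = total = 0
    for i in range(m + 1, t + 1):
        total += A[i]
        right_max = max(right_max, total)
    span_max = left_max + right_max                       # spans the middle
    half_max = max(hag_help(A, s, m), hag_help(A, m + 1, t))
    return max(span_max, half_max)

assert hagrid_solve([-2, 1, -3, 4, -1, 2, 1, -5, 4]) == 6
assert hagrid_solve([-3, -1, -2]) == 0   # empty subarray beats all-negative
```

The spanning computation is linear in t − s, which is what makes the recurrence come out to T(n) = 2T(n/2) + O(n).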

(iii) Give the recurrence relation for its running time.

Explanation / Answer

Recurrence Relations

Many algorithms, particularly divide and conquer algorithms, have time complexities which are naturally modeled by recurrence relations.

A recurrence relation is an equation which is defined in terms of itself.

1. Many natural functions are easily expressed as recurrences:

a_n = a_{n-1} + 1, a_1 = 1  →  a_n = n  (polynomial)

a_n = 2a_{n-1}, a_1 = 1  →  a_n = 2^{n-1}  (exponential)

a_n = n·a_{n-1}, a_1 = 1  →  a_n = n!  (weird function)

2. It is often easy to find a recurrence as the solution of a counting problem. Solving the recurrence can be done for many special cases, as we will see, although it is somewhat of an art.

Recursion is Mathematical Induction!

In both, we have general and boundary conditions, with the general condition breaking the problem into smaller and smaller pieces.

The initial or boundary conditions terminate the recursion.

As we will see, induction provides a useful tool to solve recurrences: guess a solution and prove it by induction.

Consider T_n = 2·T_{n-1} + 1, T_0 = 0:

n   | 0 | 1 | 2 | 3 | 4  | 5  | 6  | 7
T_n | 0 | 1 | 3 | 7 | 15 | 31 | 63 | 127

Guess what the solution is?

Prove T_n = 2^n − 1 by induction:

1. Show that the basis is true: T_0 = 2^0 − 1 = 0.

2. Now assume it is true for T_{n-1}.

3. Using this assumption, show:

T_n = 2·T_{n-1} + 1 = 2(2^{n-1} − 1) + 1 = 2^n − 1
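The guess itself can be checked mechanically before proving it. A short Python sketch iterating T_n = 2·T_{n-1} + 1 and comparing against the closed form 2^n − 1:

```python
def T(n):
    """Iterate T(n) = 2*T(n-1) + 1 with T(0) = 0."""
    t = 0
    for _ in range(n):
        t = 2 * t + 1
    return t

# Matches the closed form 2^n - 1 for every entry of the table.
for n in range(8):
    assert T(n) == 2**n - 1

print([T(n) for n in range(8)])   # [0, 1, 3, 7, 15, 31, 63, 127]
```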

Solving Recurrences

No general procedure for solving recurrence relations is known, which is why it is an art. My approach is:

Realize that linear, finite history, constant coefficient recurrences can always be solved.

Check out any combinatorics or differential equations book for a procedure.

Consider a_n = 2a_{n-1} + 2a_{n-2} + 1, a_1 = 1, a_2 = 1.

It has history = 2, degree = 1, and coefficients of 2 and 1. Thus it can be solved mechanically! Proceed:

Find the characteristic equation, e.g. α^2 − 2α − 2 = 0.

Solve to get the roots, which appear in the exponents.

Take care of repeated roots and inhomogeneous parts.

Find the constants to finish the job:

a_n = −1/3 + (1 − √3)^n (1 + √3)/3 + (1 + √3)^n (−1 + √3)/3

Systems like Mathematica and Maple have packages for doing this.

Try backsubstituting until you know what is going on

Also known as the iteration method. Plug the recurrence back into itself until you see a pattern.

Example: T(n) = 3T(⌊n/4⌋) + n. Try backsubstituting:

T(n) = n + 3(⌊n/4⌋ + 3T(⌊n/16⌋))
     = n + 3⌊n/4⌋ + 9(⌊n/16⌋ + 3T(⌊n/64⌋))
     = n + 3⌊n/4⌋ + 9⌊n/16⌋ + 27T(⌊n/64⌋)

The (3/4)^i · n terms should now be obvious.

Although there are only log_4 n terms before we get to T(1), it doesn't hurt to sum them all, since this is a rapidly converging geometric series:

T(n) ≤ n · Σ_{i=0}^{∞} (3/4)^i + n^{log_4 3} · T(1)

T(n) = 4n + o(n) = O(n)
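The bound can be confirmed numerically. A sketch that evaluates T(n) = 3T(⌊n/4⌋) + n with T(1) = 1 and checks that T(n) never exceeds the 4n predicted by the geometric series:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3*T(n // 4) + n, with T(1) = 1 (and T(0) = 0)."""
    if n <= 1:
        return n
    return 3 * T(n // 4) + n

# The geometric series gives T(n) <= 4n, i.e. T(n) = O(n).
for n in [10, 100, 10_000, 1_000_000]:
    assert T(n) <= 4 * n
```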

Recursion Trees

Drawing a picture of the backsubstitution process gives you an idea of what is going on.

We must keep track of two things: (1) the size of the remaining argument to the recurrence, and (2) the additive stuff to be accumulated during this call.

Example: T(n) = 2T(n/2) + n^2

[Recursion tree: the root T(n) contributes n^2; its two children T(n/2) contribute (n/2)^2 each; the four grandchildren T(n/4) contribute (n/4)^2 each; and so on.]
The remaining arguments are on the left, the additive terms on the right.

Although this tree has height lg n, the total sum at each level decreases geometrically, so:

T(n) = Σ_{i=0}^{∞} n^2/2^i = n^2 Σ_{i=0}^{∞} 1/2^i = Θ(n^2)

The recursion tree framework made this much easier to see than with algebraic backsubstitution.
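The level-by-level bookkeeping from the tree can be sketched in a few lines of Python (for n a power of 2): level i has 2^i nodes of size n/2^i, each contributing (n/2^i)^2, so the level sum is n^2/2^i.

```python
# Per-level costs of the recursion tree for T(n) = 2T(n/2) + n^2.
n = 1024
levels = []
size, count = n, 1
while size >= 1:
    levels.append(count * size**2)   # 2^i nodes, each costing (n/2^i)^2
    size //= 2
    count *= 2

assert levels[0] == n**2             # root level: n^2
assert levels[1] == n**2 // 2        # next level: half as much
assert sum(levels) < 2 * n**2        # geometric series => Theta(n^2)
```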

See if you can use the Master theorem to provide an instant asymptotic solution

The Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence

T(n) = aT(n/b) + f(n)

where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows:

1. If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).

2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} lg n).

3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Examples of the Master Theorem

Which case of the Master Theorem applies?

T(n) = 4T(n/2) + n

Reading from the equation, a = 4, b = 2, and f(n) = n.

Is n = O(n^{log_2 4 − ε}) = O(n^{2 − ε})?

Yes, so case 1 applies and T(n) = Θ(n^2).

T(n) = 4T(n/2) + n^2

Reading from the equation, a = 4, b = 2, and f(n) = n^2.

Is n^2 = O(n^{log_2 4 − ε}) = O(n^{2 − ε})?

No if ε > 0, but it is true if ε = 0, so case 2 applies and T(n) = Θ(n^2 lg n).

T(n) = 4T(n/2) + n^3

Reading from the equation, a = 4, b = 2, and f(n) = n^3.

Is n^3 = Ω(n^{log_2 4 + ε}) = Ω(n^{2 + ε})?

Yes, for 0 < ε ≤ 1, so case 3 might apply.

Is a·f(n/b) = 4(n/2)^3 = n^3/2 ≤ c·n^3?

Yes, for c ≥ 1/2, so there exists a c < 1 to satisfy the regularity condition, so case 3 applies and T(n) = Θ(n^3).
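For the common case f(n) = n^k the three cases reduce to comparing k against log_b a, since a polynomial f(n) always satisfies the case-3 regularity condition (a·(n/b)^k = (a/b^k)·n^k with a/b^k < 1). A hypothetical helper (my own, not from the notes) reproducing the three examples:

```python
import math

def master(a, b, k):
    """Asymptotic solution of T(n) = a*T(n/b) + n^k via the Master Theorem."""
    crit = math.log(a) / math.log(b)     # critical exponent log_b a
    if abs(k - crit) < 1e-9:
        return f"Theta(n^{k} log n)"     # case 2: all levels contribute equally
    if k < crit:
        return f"Theta(n^{crit:g})"      # case 1: leaf cost dominates
    return f"Theta(n^{k})"               # case 3: root cost dominates

assert master(4, 2, 1) == "Theta(n^2)"
assert master(4, 2, 2) == "Theta(n^2 log n)"
assert master(4, 2, 3) == "Theta(n^3)"
```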

Why should the Master Theorem be true?

Consider T(n) = aT(n/b) + f(n).

Suppose f(n) is small enough

Say f(n) = 0, i.e. T(n) = aT(n/b).

Then we have a recursion tree where the only contribution is at the leaves.

There will be log_b n levels, with a^l nodes at level l.

T(n) = a^{log_b n} = n^{log_b a}  (Theorem 2.9 in CLR)

[Recursion tree with f(n) = 0: every internal node contributes 0, and each of the n^{log_b a} leaves contributes 1.]

So long as f(n) is small enough that it is dwarfed by this, we have case 1 of the Master Theorem!

Suppose f(n) is large enough

Now draw the recursion tree for T(n) = aT(n/b) + f(n):

[Recursion tree: the root contributes f(n); each of its a children contributes f(n/b); and so on down the tree.]

If f (n) is a big enough function, the one top call can be bigger than the sum of all the little calls.

Example: f(n) = n^3 > (n/3)^3 + (n/3)^3 + (n/3)^3. In fact this holds unless a ≥ 27!

In case 3 of the Master Theorem, the additive term dominates.

In case 2, both parts contribute equally, which is why the log pops up. It is (usually) what we want to have happen in a divide and conquer algorithm.

Famous Algorithms and their Recurrences

Matrix Multiplication

The standard matrix multiplication algorithm for two n × n matrices is O(n^3).

[2 3]   [2 3 4]   [13 18 23]
[3 4] × [3 4 5] = [18 25 32]
[4 5]             [23 32 41]
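The three nested loops behind that O(n^3) bound, as a Python sketch (written for general p×q times q×r, so it also covers the small example with those matrices):

```python
def mat_mul(X, Y):
    """Standard matrix multiplication: C[i][j] = sum_k X[i][k] * Y[k][j].
    For two n x n matrices the three nested loops give O(n^3)."""
    p, q, r = len(X), len(Y), len(Y[0])
    C = [[0] * r for _ in range(p)]
    for i in range(p):
        for j in range(r):
            for k in range(q):
                C[i][j] += X[i][k] * Y[k][j]
    return C

A = [[2, 3], [3, 4], [4, 5]]
B = [[2, 3, 4], [3, 4, 5]]
assert mat_mul(A, B) == [[13, 18, 23], [18, 25, 32], [23, 32, 41]]
```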

Strassen discovered a divide-and-conquer algorithm which takes T(n) = 7T(n/2) + O(n^2) time.

Since O(n^{lg 7}) dwarfs O(n^2), case 1 of the Master Theorem applies and T(n) = O(n^{2.81}).

This has been "improved" by more and more complicated recurrences until the current best of O(n^{2.38}).
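Where the a = 7 comes from can be seen in code. A sketch of Strassen's algorithm for n a power of 2 (a minimal illustration, not a practical implementation; real versions switch to the classical algorithm below a cutoff size):

```python
def strassen(A, B):
    """Strassen's multiplication of n x n matrices, n a power of 2.
    Seven recursive products => T(n) = 7T(n/2) + O(n^2) = O(n^2.81)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    quad = lambda M, r, c: [row[c:c + h] for row in M[r:r + h]]
    add = lambda X, Y: [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    sub = lambda X, Y: [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    # The seven half-size products: this is where a = 7 comes from.
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    # Reassemble the quadrants with O(n^2) additions.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    return [r1 + r2 for r1, r2 in zip(C11, C12)] + \
           [r1 + r2 for r1, r2 in zip(C21, C22)]

assert strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```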

Polygon Triangulation

Given a polygon in the plane, add diagonals so that each face is a triangle. None of the diagonals are allowed to cross.

Triangulation is an important first step in many geometric algorithms.

The simplest algorithm might be to try each pair of points and check if they see each other. If so, add the diagonal and recur on both halves, for a total of O(n^3).

However, Chazelle gave an algorithm which runs in T(n) = 2T(n/2) + O(√n) time. Since n^{1/2} = O(n^{1 − ε}), by case 1 of the Master Theorem, Chazelle's algorithm is linear, i.e. T(n) = O(n).

Sorting

The classic divide and conquer recurrence is Mergesort's T(n) = 2T(n/2) + O(n), which divides the data into equal-sized halves and spends linear time merging the halves after they are sorted.

Since n = Θ(n^{log_2 2}) = Θ(n) but not n = O(n^{1 − ε}), case 2 of the Master Theorem applies and T(n) = Θ(n log n).

In case 2, the divide and merge steps balance out perfectly, as we usually hope for from a divide-and-conquer algorithm.
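That recurrence maps directly onto the code. A sketch of mergesort with the two T(n/2) calls and the O(n) merge visible:

```python
def merge_sort(a):
    """T(n) = 2T(n/2) + O(n) => O(n log n) by case 2 of the Master Theorem."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # T(n/2)
    right = merge_sort(a[mid:])      # T(n/2)
    # Linear-time merge of the two sorted halves: the O(n) term.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 4, 6, 1, 3]) == [1, 2, 3, 4, 5, 6]
```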


Approaches to Algorithms Design

Incremental

The job is partly done: do a little more, and repeat until done. A good example of this approach is insertion sort.
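In code, the incremental idea is to keep a sorted prefix and extend it by one element per pass. A sketch:

```python
def insertion_sort(a):
    """Incremental: a[:i] is always sorted; each pass does 'a little more'
    by inserting a[i] into place, until the whole array is done."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

assert insertion_sort([5, 2, 4, 6, 1, 3]) == [1, 2, 3, 4, 5, 6]
```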

Divide-and-Conquer

A recursive technique

Divide problem into sub-problems of the same kind.

For subproblems that are really small (trivial), solve them directly; else solve them recursively. (conquer)

Combine subproblem solutions to solve the whole thing (combine)

A good example of this approach is Mergesort.

