

Question

Consider the following implementation of a list. We use a pair of arrays where one array is longer than the other, and both arrays are kept sorted. The length of the shorter array is the square root of the length of the longer array; although, it may have empty entries if it is not yet full. To add an element, you use insertion sort to insert the element into the shorter array. If the short array fills up, re-allocate the shorter and longer arrays and merge the two old arrays into the new longer array. The new short array is empty. Then, to find an element, use binary search on both the long and the short arrays.
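The structure described above can be sketched in Python. This is a hypothetical illustration (the class name `SqrtList` and its method names are my own, not part of the question): a sorted long array, a sorted short array of capacity ⌊√m⌋, insertion-sort inserts into the short array, and a merge when it overflows.

```python
import bisect
import math

class SqrtList:
    """Sketch of the two-array list: a sorted long array of length m and a
    sorted short array holding at most ~sqrt(m) elements."""

    def __init__(self):
        self.long = []    # sorted, length m
        self.short = []   # sorted, capacity max(1, floor(sqrt(m)))

    def _capacity(self):
        # the short array may hold at most sqrt(len(long)) elements (minimum 1)
        return max(1, math.isqrt(len(self.long)))

    def add(self, x):
        # insertion-sort step: shift elements to place x into the short array
        bisect.insort(self.short, x)          # O(sqrt(m)) shifting cost
        if len(self.short) > self._capacity():
            # short array is full: merge both sorted arrays, O(m + sqrt(m))
            merged, i, j = [], 0, 0
            while i < len(self.long) and j < len(self.short):
                if self.long[i] <= self.short[j]:
                    merged.append(self.long[i]); i += 1
                else:
                    merged.append(self.short[j]); j += 1
            merged.extend(self.long[i:])
            merged.extend(self.short[j:])
            self.long = merged
            self.short = []                   # the new short array is empty

    def __contains__(self, x):
        # binary search both arrays: O(log m) + O(log sqrt(m)) = O(log n)
        def found(a):
            k = bisect.bisect_left(a, x)
            return k < len(a) and a[k] == x
        return found(self.long) or found(self.short)
```

For example, after `xs = SqrtList()` and adding 5, 1, 9, 3, 7 in that order, `7 in xs` is true and `4 in xs` is false.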

(a) [50 points] Given this data structure, prove that the data structure yields O(lg n) time for searching but O(√n) amortized time for adding an element. Be sure to be thorough, complete, and rigorous in your proof.

Explanation / Answer

Answer:

Because both arrays are sorted, binary search takes logarithmic time in each. Suppose there are m elements in the longer array and at most √m elements in the shorter. Then the lookup time in the longer array is O(log m), and in the shorter it is O(log √m) = O(½ log m) = O(log m). So the full lookup is O(log m), and because m is O(n), lookup is O(log n).

When we insert an element into the shorter array, we pay O(√m) cost moving elements to make space for the new one. However, there is also the cost of doing periodic merges. The intuition is that a merge takes O(m) work, but one is only needed every √m insertions. So those √m insertions incur O(m) work in total, for an amortized cost of O(m/√m) = O(√m) = O(√n).

This simple analysis was enough to get credit, but we can make it more precise by pretending that the computer has a meter inside it that has to be fed money to keep it going. When an operation is called, enough money has to be supplied to finish the operation. If an operation completes correctly with O(√n) money, then we can say that the operation takes O(√n) amortized time. For a proper amortized analysis, the key idea is to charge extra on some operations and store the extra money inside the data structure to pay for later operations. In this case, we want to save enough money to pay for the periodic merges while still paying only O(√n) money on any given insertion.

Suppose we want to insert n elements into the data structure, doing some number of merges along the way. Suppose that at the i-th merge, the long array has length m_i and the short array has length √m_i. Then a (non-merging) insertion takes O(√m_i) time, and merging takes m_i + √m_i time. We can charge 3√n money for each insertion, using the first √n to pay for the insertion itself (since n ≥ m_i, so √n ≥ √m_i) and saving the remaining 2√n for later. When it comes time to do a merge, we will have saved up (√m_i)·(2√n) money, which is at least 2m_i, greater than the m_i + √m_i needed to pay for the merge.
Since each insertion is charged O(√n) money, O(√n) is the amortized cost per insertion.
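As an empirical sanity check on the O(√n) amortized bound, one can count the dominant costs directly: each insertion pays for its insertion-sort shifts, and a merge costing m + √m is paid once every ~√m insertions. The counter below (`total_cost` is my own illustrative name, not from the question) tracks these costs; the total for n insertions should grow like n·√n, so dividing by n√n should stay bounded by a small constant.

```python
import math

def total_cost(n):
    """Count insertion-shift and merge costs for n insertions into the
    two-array list, using the same overflow rule as the structure itself."""
    long_len, short_len = 0, 0
    cost = 0
    for _ in range(n):
        cost += short_len + 1                  # insertion-sort shifts: O(sqrt(m))
        short_len += 1
        if short_len > max(1, math.isqrt(long_len)):
            cost += long_len + short_len       # merge work: m + sqrt(m)
            long_len += short_len
            short_len = 0
    return cost

for n in (1_000, 4_000, 16_000):
    # normalized amortized cost; should hover around a small constant
    print(n, total_cost(n) / (n * math.sqrt(n)))
```

Quadrupling n roughly doubles √n, so the printed ratio staying flat across the three runs is exactly the O(√n)-amortized behavior the proof establishes.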
