Monday, April 26, 2010

Finding the maximum sum in two sorted arrays

Given two sorted positive integer arrays A(n) and B(n) (say both are sorted in decreasing order), we define a set S = {(a,b) | a \in A and b \in B}. Obviously there are n^2 elements in S. The value of such a pair is defined as Val(a,b) = a + b. Now we want to find the n pairs in S with the largest values, in O(n) time.
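As a baseline for checking any faster solution, here is a minimal brute-force sketch in Python (the function name and the choice to return the sums rather than the pairs are mine). It enumerates all n^2 pair sums and keeps the n largest, so it runs in O(n^2 log n):

```python
from itertools import product

def n_largest_pair_sums_bruteforce(A, B):
    """Reference implementation: enumerate all n^2 pair sums (a + b)
    and return the n largest in decreasing order.
    Only suitable for verifying faster solutions on small inputs."""
    n = len(A)
    sums = sorted((a + b for a, b in product(A, B)), reverse=True)
    return sums[:n]
```

For example, with A = [9, 5, 2] and B = [8, 4, 1] the three largest sums are 17, 13 and 13 (from the pairs (9,8), (9,4) and (5,8)).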


  1. Any responses to this? I am not able to find a solution for it.

  2. Do you have any solution for that? I am still stuck at O(n log n) using a heap.
    Essentially, you have n sorted lists, each indexed by the first component of the sum, and you want to merge them to find the largest (or smallest, whichever) n elements. So I don't see how you can break O(n log n) unless you drop the
    comparison-model assumption.

    I tried both generating elements one at a time, and implementing randomized quickselect
    by trying to rewrite the partition step "blindly", but still
    no luck.

  3. But the lists are already sorted, don't forget that. Is that you, Alessio?

  4. I see... however, my question still stands. The first element (the largest) of the output array is the largest among the heads of the lists.
    You extract it and repeat. If you do that with a heap you get an O(n log n) algorithm, and I don't think it can be beaten so easily.
    Where am I seeing this wrong?
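    The extract-and-repeat process described above can be sketched with Python's `heapq` (the function name and the dedup-with-a-set detail are my choices; `heapq` is a min-heap, so sums are negated to simulate a max-heap). Starting from the pair of heads (0,0), each pop pushes at most two successor pairs, so the heap never exceeds O(n) entries and the total cost of extracting n sums is O(n log n):

```python
import heapq

def n_largest_pair_sums(A, B):
    """Return the n largest values a + b over all pairs (a, b) with
    a in A and b in B, where A and B are sorted in decreasing order.
    Runs in O(n log n) using a max-heap (simulated by negating sums)."""
    n = len(A)
    # The overall largest sum is A[0] + B[0], the pair of list heads.
    heap = [(-(A[0] + B[0]), 0, 0)]
    seen = {(0, 0)}          # avoid pushing the same index pair twice
    result = []
    while len(result) < n:
        neg_sum, i, j = heapq.heappop(heap)
        result.append(-neg_sum)
        # The only candidates that can follow (i, j) are its two successors.
        for ni, nj in ((i + 1, j), (i, j + 1)):
            if ni < n and nj < n and (ni, nj) not in seen:
                seen.add((ni, nj))
                heapq.heappush(heap, (-(A[ni] + B[nj]), ni, nj))
    return result
```

    This matches the bound the commenter is stuck at; the open question in the thread is whether the structure of the problem allows the claimed O(n).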

  5. dynamic programming, with some additional observation -- giuseppe