
We are given table_1: 6 3 2 5 2, table_2: 7 4 1 6 8, table_3: 15 14 5 4 9. The number of elements in table_1 is equal to the number of elements in table_2. How can I find out FAST whether there exist elements in table_1 and table_2 such that, for a particular element from table_3, table_3[i] = table_1[k] + table_2[p]?

Example: 9 = table_3[4] = table_1[1] + table_2[3];

I tried a sorting and binary_search approach to solve this problem :/ but it is still too slow.
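For reference, here is a minimal sketch of that sort-then-binary-search idea (illustrative names, not my exact code):

```cpp
#include <algorithm>
#include <vector>

// Sketch of the sort + binary-search check described above. For one query
// value it costs O(n log n); repeating it for every element of table_3
// gives O(n^2 log n) overall.
bool hasPairSum(const std::vector<int>& t1, std::vector<int> t2, int target) {
    std::sort(t2.begin(), t2.end());                      // sort the second table
    for (int a : t1)                                      // n binary searches
        if (std::binary_search(t2.begin(), t2.end(), target - a))
            return true;
    return false;
}
```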

Walter White
  • What do you need as result? The indices of table1 and table2, the index of table3, the sum value, the count of possible matches, all of them...? – deviantfan Oct 18 '15 at 16:34
  • Possible duplicate of [Find a pair of elements from an array whose sum equals a given number](http://stackoverflow.com/questions/4720271/find-a-pair-of-elements-from-an-array-whose-sum-equals-a-given-number) – rkachach Oct 18 '15 at 16:38
  • This is a duplicate of http://stackoverflow.com/questions/4720271/find-a-pair-of-elements-from-an-array-whose-sum-equals-a-given-number. The only difference is that in this question you have just one number. In your case you have an array of numbers (table_3) – rkachach Oct 18 '15 at 16:39
  • As a result I need an answer: YES or NO. For table_3[0] and table_3[2] the answer would be NO. – Walter White Oct 18 '15 at 16:40
  • So, an answer for every pair? That's n^2 answers, so it can't be faster than O(n^2). – deviantfan Oct 18 '15 at 16:41
  • It's not a duplicate of the other question really. It's similar, but you can't treat it as a special case of the other because it involves 2 different lists and many algorithms that work for the other question won't work for this one. – Michael Oct 18 '15 at 16:42
  • O(n^2) is very slow. I found an O(n*logn) solution but it is still too slow :/ – Walter White Oct 18 '15 at 16:43
  • @WalterWhite There can't be an O(n*logn) solution if you want a yes/no for every pair. – deviantfan Oct 18 '15 at 16:43
  • Do tables 1 and 2 change a lot? Is memory a factor? – Michael Oct 18 '15 at 16:44
  • Memory is not a factor. However, tables 1 and 2 can have as many as 1000 elements. – Walter White Oct 18 '15 at 16:46
  • Even 1000 elements means you could create all the pairs, only a million elements, with the sums calculated, and hashed. Lookup would be very fast then (see the sketch after these comments). – Michael Oct 18 '15 at 16:47
  • deviantfan, O(n*logn) is too slow. I tried it already with binary search. – Walter White Oct 18 '15 at 16:50
  • @WalterWhite ... Either you didn't describe your problem correctly, or you don't understand it yourself. – deviantfan Oct 18 '15 at 16:56
  • How do you understand the problem? – Walter White Oct 18 '15 at 17:01
  • I think the problem is clear, just some details missing, like how much tables 1 and 2 change vs how many queries are made, and how big they can be (op says 1000 in comments). – Michael Oct 18 '15 at 17:08
  • @deviantfan : isn't it one yes/NO for each `particular element from table_3`? – greybeard Oct 18 '15 at 17:10
  • Yes. For tab_3[0]=NO, tab_3[1]=YES,tab_3[2]=NO,tab_3[3]=YES,tab_3[4]=YES. – Walter White Oct 18 '15 at 17:12
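A minimal sketch of the pairwise-sum hashing idea suggested in the comments (hypothetical code; it assumes roughly 1000 elements per table, so about a million precomputed sums):

```cpp
#include <iostream>
#include <unordered_set>
#include <vector>

int main() {
    std::vector<int> t1 = {6, 3, 2, 5, 2};
    std::vector<int> t2 = {7, 4, 1, 6, 8};
    std::vector<int> t3 = {15, 14, 5, 4, 9};

    // Precompute every table_1[k] + table_2[p] sum once:
    // O(n^2) time and space (~10^6 entries for n = 1000).
    std::unordered_set<int> sums;
    for (int a : t1)
        for (int b : t2)
            sums.insert(a + b);

    // Each table_3 query is then an O(1) average-case hash lookup.
    for (int x : t3)
        std::cout << x << ": " << (sums.count(x) ? "YES" : "NO") << '\n';
}
```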

1 Answer


You say that O(nlogn) is too slow. Do you mean that your particular implementation of some O(nlogn) solution was too slow, or that any O(nlogn) growth is going to be too slow? If the latter is the case then you are, as my son would put it, boned.

Let's look at this from a data structures and algorithm analysis standpoint. If you allow table_1 and table_2 to remain unordered then you will need to make n^2 pairwise comparisons. Therefore, table_1 and table_2 must be submitted to some sort of ordering process.

There is no chance you will be able to order the two tables in less than linear time simply because each element will need to be subjected to at least one access. There are some sorting methods whose best time complexity is linear, but I assume you need a general solution, which means you have no guarantees on the state of the data before you start. I think sorting is a loser's bet anyway since you'll pay the O(nlogn) sort cost only to find you need, on average, n/2 compares to check the pairwise summations.

Now, you could create minheaps in linear time, but table_3 is O(n) in size, as are table_1 and table_2. You still lose since you will still need, on average, n * n/2 comparisons to check the values of table_3 against the pairs formed from table_1 and table_2.

What are the time constraints on this solution? 1 ms, 1 sec? Is there some structure or pre-knowledge about the data that can allow for a solution that takes advantage of the data? If not, I do not see a solution.

P. Hinker
  • The time constraints are 0.100s-0.155s. – Walter White Oct 19 '15 at 08:56
  • What is the range of the numbers in the tables? Can there be duplicates in the tables? For three tables of 1000 integers where the integers are in the range of 1-1000 the odds of t1[i] + t2[j] = t3[k] for some i, j, k are extremely high (99%+). – P. Hinker Oct 19 '15 at 13:35
  • What hardware platform is this to run on? Even the most simplistic, brute force approach will only take roughly 500 ms on a standard laptop/desktop. If the numbers in the table are as described above (i.e. in a range from 1-1000) then a brute force pairwise algorithm that stops on a given t3[i] when it's been found runs in less than 10 ms. I wonder if there are more requirements than you've listed. – P. Hinker Oct 19 '15 at 13:57
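A minimal sketch of the brute-force early-exit check mentioned in the last comment (illustrative names, not from the answer itself):

```cpp
#include <vector>

// For one table_3 value, scan all table_1/table_2 pairs and stop as soon as
// one sum matches. Worst case O(n^2) per query, but the early exit is why it
// tends to be fast in practice when values are drawn from a small range.
bool bruteForceHasPair(const std::vector<int>& t1,
                       const std::vector<int>& t2, int target) {
    for (int a : t1)
        for (int b : t2)
            if (a + b == target)
                return true;                          // stop on the first match
    return false;
}
```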