Given an $N\times N$ array whose elements are decreasing along every row and every column, what is the fastest way to find the position $(i,j)$ of a given element if it exists in the array, or to report that it does not?
Comments:
- We discourage posts that simply state a problem and expect the community to provide an answer. Assuming you have tried to solve it, you are expected to tell us your partial progress, your thoughts, and where you got stuck; that will help draw better answers to your post faster. Otherwise, the question might be closed or downvoted. Have you read how to ask a good homework question? Have you tried the search engine at the top of this page? – 喜欢算法和数学, Feb 25, 2019 at 0:46
- Has been asked, and answered, before. Spend some time locating the question. – Yuval Filmus, Feb 25, 2019 at 3:01
- See here for the answer: cs.stackexchange.com/questions/98575/… – Yuval Filmus, Feb 25, 2019 at 4:14
- Duplicate of a SO post. – xskxzr, Feb 25, 2019 at 4:18
1 Answer
(Copied from a post on StackOverflow)
Here's a simple approach (a short code sketch follows the steps):
1. Start at the bottom-left corner.
2. If the current value equals the target, we're done.
3. If the target is greater than that value, it can't be anywhere in that row (everything to the right is smaller still), so move up one.
4. Otherwise the target is smaller than that value, so it can't be anywhere in that column (everything above is larger still): move right one.
5. If we walk off the top or right edge of the array, the target is not present; otherwise go to step 2.
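Here is a minimal sketch of that walk in Python, assuming the decreasing order from the question (the function and variable names are ours):

```python
def saddleback_search(a, target):
    """Search a matrix whose rows and columns are both sorted in
    decreasing order; return (i, j) if target is present, else None."""
    if not a or not a[0]:
        return None
    i, j = len(a) - 1, 0          # start at the bottom-left corner
    while i >= 0 and j < len(a[0]):
        if a[i][j] == target:
            return (i, j)
        elif target > a[i][j]:
            i -= 1                # rest of row i is even smaller: move up
        else:
            j += 1                # rest of column j is even larger: move right
    return None                   # walked off the array: not present
```

For example, searching [[9, 7, 5], [8, 4, 2], [6, 3, 1]] for 4 starts at 6, steps right to 3, then up to 4, and returns (1, 1).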
For an NxM array, this runs in O(N+M). I think it would be difficult to do better. :)
Edit: Lots of good discussion. I was talking about the general case above; clearly, if N or M is small, you could use a binary search approach to do this in something approaching logarithmic time.
Here are some details, for those who are curious:
History
This simple algorithm is called a Saddleback Search. It has been around for a while, and it is optimal when N == M. Some references:
- David Gries, The Science of Programming. Springer-Verlag, 1989.
- Edsger W. Dijkstra, The Saddleback Search. Note EWD-934, 1985.
However, when N < M, intuition suggests that binary search should be able to do better than O(N+M): for example, when N == 1, a pure binary search will run in logarithmic rather than linear time.
Worst-case bound
In a 2006 paper, Richard Bird examined this intuition that binary search could improve on the Saddleback algorithm:
- Richard S. Bird, Improving Saddleback Search: A Lesson in Algorithm Design. In Mathematics of Program Construction, volume 4014, pp. 82–89, 2006.
Using a rather unusual conversational technique, Bird shows us that for N <= M, this problem has a lower bound of Ω(N * log(M/N)). This bound makes sense, as it gives us linear performance when N == M and logarithmic performance when N == 1.
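As a quick endpoint check (our own arithmetic, writing the bound as Ω(N log(2M/N)), a common way to keep the log factor from vanishing when N == M):

$$
N \log\frac{2M}{N} =
\begin{cases}
N \log 2 = \Theta(N), & N = M \quad \text{(linear, like Saddleback)} \\
\log(2M) = \Theta(\log M), & N = 1 \quad \text{(a single binary search)}
\end{cases}
$$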
Algorithms for rectangular arrays
One approach that uses a row-by-row binary search looks like this (a code sketch follows the steps):
1. Start with a rectangular array where N < M. Let's say N is the number of rows and M is the number of columns.
2. Do a binary search on the middle row for value. If we find it, we're done.
3. Otherwise we've found an adjacent pair of numbers g and s in that row, where g > value > s.
4. The rectangle of numbers above and to the left of g is all greater than value, so we can eliminate it.
5. The rectangle below and to the right of s is all less than value, so we can eliminate it.
6. Go to step (2) for each of the two remaining rectangles.
In terms of worst-case complexity, this algorithm does log(M) work to eliminate half the possible solutions, and then recursively calls itself twice on two smaller problems. We do have to repeat a smaller version of that log(M) work for every row, but if the number of rows is small compared to the number of columns, then being able to eliminate all of those columns in logarithmic time starts to become worthwhile.
This gives the algorithm a complexity of T(N, M) = log(M) + 2 * T(N/2, M/2), which Bird shows to be O(N * log(M/N)).
Another approach, posted by Craig Gidney, describes an algorithm similar to the one above: it examines a row at a time using a step size of M/N. His analysis shows that this results in O(N * log(M/N)) performance as well; a rough sketch of the idea appears below.
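One possible reading of that stepping strategy in code (our own reconstruction under the same decreasing-order assumption, not Gidney's actual implementation):

```python
def stepped_search(a, value):
    """Saddleback-style walk that skips ~M/N columns at a time and
    binary-searches inside a block when it overshoots.
    Returns (i, j) if value is present, else None."""
    if not a or not a[0]:
        return None
    n, m = len(a), len(a[0])
    step = max(m // n, 1)
    i, j = n - 1, 0                       # bottom-left corner
    while i >= 0 and j < m:
        if a[i][j] == value:
            return (i, j)
        if a[i][j] < value:
            i -= 1                        # whole remaining row is too small
            continue
        # a[i][j] > value: try to discard a block of columns at once.
        j2 = min(j + step, m - 1)
        if a[i][j2] > value:
            j = j2 + 1                    # the whole block is still too large
            continue
        # The crossing point lies in (j, j2]: binary-search for the
        # first column whose entry is <= value.
        lo, hi = j + 1, j2
        while lo < hi:
            mid = (lo + hi) // 2
            if a[i][mid] <= value:
                hi = mid
            else:
                lo = mid + 1
        if a[i][lo] == value:
            return (i, lo)
        i, j = i - 1, lo                  # row i and the columns left of lo are ruled out
    return None
```

Each row elimination costs O(log(M/N)) for the block binary search, and the column jumps total at most N, which is where the O(N * log(M/N)) bound comes from.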
Performance Comparison
Big-O analysis is all well and good, but how well do these approaches work in practice? The chart below examines four algorithms for increasingly "square" arrays:
[Figure: algorithm performance vs. squareness]
(The "naive" algorithm simply searches every element of the array. The "recursive" algorithm is described above. The "hybrid" algorithm is an implementation of Gidney's algorithm. For each array size, performance was measured by timing each algorithm over fixed set of 1,000,000 randomly-generated arrays.)
Some notable points:
- As expected, the "binary search" algorithms offer the best performance on rectangular arrays, and the Saddleback algorithm works best on square arrays.
- The Saddleback algorithm performs worse than the "naive" algorithm for 1-d arrays, presumably because it does multiple comparisons on each item.
- The performance hit that the "binary search" algorithms take on square arrays is presumably due to the overhead of running repeated binary searches.
Summary
Clever use of binary search can provide O(N * log(M/N)) performance for both rectangular and square arrays. The O(N + M) "saddleback" algorithm is much simpler, but suffers from performance degradation as arrays become increasingly rectangular.