kaldiuo's blog

By kaldiuo, history, 7 years ago, In English

Suppose we have a divide-and-conquer algorithm f(n) such that f(n) = f(a) + f(b) + O(min(a, b)), where a + b = n. What is the worst-case time complexity of f(n), explicitly?


7 years ago, # |

The worst-case running time is O(n log n). I'll describe a situation with exactly this time complexity and prove the bound:

Consider a line of length N. Initially it is a single connected segment, colored with one color. You are given queries; in each query you receive some position P. For each query, you look at the current segment containing position P, break that segment at P, and recolor one of the two resulting parts with a new color: always the shorter part. Queries keep coming until only single "points" remain, which cannot be broken further. This is exactly the recurrence you gave: each query takes a segment of some length X, splits it at an arbitrary point into parts of lengths a and b with a + b = X, and spends O(min(a, b)) time recoloring the shorter part. A simulation of this process is sketched below.
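To make the process concrete, here is a minimal C++ sketch (my own illustration, not code from the post) that splits segments at random points and charges for coloring the shorter side. The segment representation, the choice n = 2^16, and the variable names are all assumptions for the demo; for the bound, the order in which segments get split doesn't matter.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Minimal simulation of the break-and-color process (illustrative, not from
// the post). Repeatedly split a segment at a random interior point and
// "color" every position on the shorter side, counting the total work.
int main() {
    const int n = 1 << 16;
    mt19937 rng(12345);

    long long colored = 0;                  // total positions colored
    vector<pair<int, int>> segs = {{0, n}}; // half-open segments [l, r)

    while (!segs.empty()) {
        auto [l, r] = segs.back();
        segs.pop_back();
        if (r - l < 2) continue;            // a single point can't be broken
        int p = l + 1 + (int)(rng() % (r - l - 1)); // split point in (l, r)
        colored += min(p - l, r - p);       // pay for the shorter part
        segs.push_back({l, p});
        segs.push_back({p, r});
    }
    printf("total coloring work: %lld, n*log2(n) = %.0f\n",
           colored, n * log2((double)n));
}
```

Running this, the total coloring work stays within n*log2(n), matching the bound proved next.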

Proof that the running time is O(N log N): for each position, count the number of times it gets colored (coloring is the only work we actually spend time on). A position is recolored exactly when, after its segment is broken, it ends up on the shorter part. That shorter part is at most half the length of the segment it came from, so the segment containing any fixed position at least halves with every recoloring. Therefore each of the N positions is colored at most log2(N) times, giving O(N log N) total work, which proves the bound.
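To sanity-check the halving argument, the same simulation can instead track how many times each individual position is recolored; by the invariant above, no counter should exceed log2(n). Again a sketch with my own names, not code from the post:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Per-position version of the simulation (illustrative). Instead of summing
// the work, count how often each position is colored; the halving argument
// says every counter stays at most log2(n).
int main() {
    const int n = 1 << 16;                  // n = 2^16, so log2(n) = 16
    mt19937 rng(42);
    vector<int> colorings(n, 0);
    vector<pair<int, int>> segs = {{0, n}}; // half-open segments [l, r)

    while (!segs.empty()) {
        auto [l, r] = segs.back();
        segs.pop_back();
        if (r - l < 2) continue;
        int p = l + 1 + (int)(rng() % (r - l - 1));
        if (p - l <= r - p)                 // color the shorter part
            for (int i = l; i < p; i++) colorings[i]++;
        else
            for (int i = p; i < r; i++) colorings[i]++;
        segs.push_back({l, p});
        segs.push_back({p, r});
    }
    printf("max colorings of a single position: %d (log2(n) = 16)\n",
           *max_element(colorings.begin(), colorings.end()));
}
```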

If you wonder about the specific case in which this worst case is achieved, it's when the split is always even, i.e. a = b, and the recurrence becomes:

T(n) = O(n) + 2T(n / 2), a well-known recurrence that solves to O(n log n): the recursion tree has log2(n) levels and each level contributes O(n) work in total.
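For completeness, a tiny sketch (my own, assuming the base case T(1) = 0 and taking the O(n) term to be exactly n) that unrolls this recurrence and compares it with n * log2(n); for powers of two the two agree exactly, since each level of the recursion tree contributes n:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Unroll T(n) = n + 2*T(n/2) with T(1) = 0 and compare against n*log2(n).
long long T(long long n) { return n <= 1 ? 0 : n + 2 * T(n / 2); }

int main() {
    for (long long n = 2; n <= (1 << 20); n <<= 4)
        printf("n = %7lld  T(n) = %9lld  n*log2(n) = %9.0f\n",
               n, T(n), n * log2((double)n));
}
```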