By radoslav11, history, 7 years ago, In English

Hello,

A friend of mine gave me and a couple of friends a problem which he couldn't solve. In the end, it turned out he had misread it, and the actual problem was quite easy. But now I'm curious whether there is a polynomial solution to the original problem. Here is the problem statement:

We have two sequences a and b. We can perform an operation set(l, r, v) on the first sequence: applying it makes all elements of a in the range [l; r] equal to v. The cost of each operation depends only on the length of the interval it is applied to. In other words, we have an array c such that the cost of operation set(l, r, v) is c[r - l + 1]. The question is:

What is the minimum total cost to convert sequence a into sequence b?

Note that there are no monotonicity constraints on the costs. For example, c[1] might be greater than c[5] and less than c[7].
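To make the operation concrete, here is a minimal sketch of what set(l, r, v) does and what it costs. The helper name apply_set is my own; c is indexed by interval length, as in the statement:

```cpp
#include <cassert>
#include <vector>
using namespace std;

// Hypothetical helper: overwrites a[l..r] with v (0-based, inclusive)
// and returns the cost of the operation, c[r - l + 1].
int apply_set(vector<int>& a, const vector<int>& c, int l, int r, int v) {
    for (int i = l; i <= r; i++) a[i] = v;
    return c[r - l + 1]; // cost depends only on the interval length
}
```

For example, with c = {0, 5, 3, 9} (c[0] unused), applying set(0, 1, 7) to a = {1, 2, 3} yields a = {7, 7, 3} at cost c[2] = 3, even though the shorter operation set(0, 0, 7) would have cost c[1] = 5.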

We can get rid of sequence a with the following DP:

DP[l][r] — answer for subarray [l;r].

We try fixing a position l ≤ mid ≤ r such that a[mid] = b[mid] (so position mid can be left untouched).

We set DP[l][r] = min over all such mid of (DP[l][mid - 1] + DP[mid + 1][r]).

Now we are left with one more case: every element of the subarray is covered by at least one set() operation, in which case array a no longer matters. If we have another array P[l][r], equal to the minimum cost to create the corresponding subarray of b using only set() operations, then we simply perform DP[l][r] = min(DP[l][r], P[l][r]) and the problem is solved. So computing P is the remaining difficulty.
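Assuming P is already computed, the interval DP described above can be sketched like this (the function name solve and 0-based indexing are my own choices; P[l][r] is taken as a given input):

```cpp
#include <algorithm>
#include <vector>
using namespace std;

// Minimal sketch of the interval DP, assuming P[l][r] (min cost to build
// b[l..r] using only set() operations) has somehow been precomputed.
long long solve(const vector<int>& a, const vector<int>& b,
                const vector<vector<long long>>& P) {
    int n = a.size();
    // DP[l][r] = min cost to turn a[l..r] into b[l..r]; filled by length.
    vector<vector<long long>> DP(n, vector<long long>(n, 0));
    for (int len = 1; len <= n; len++)
        for (int l = 0; l + len - 1 < n; l++) {
            int r = l + len - 1;
            long long best = P[l][r]; // cover the whole subarray with set()s
            for (int mid = l; mid <= r; mid++)
                if (a[mid] == b[mid]) { // position mid is left untouched
                    long long left  = (mid > l) ? DP[l][mid - 1] : 0;
                    long long right = (mid < r) ? DP[mid + 1][r] : 0;
                    best = min(best, left + right);
                }
            DP[l][r] = best;
        }
    return DP[0][n - 1];
}
```

This runs in O(n^3) on top of whatever it costs to compute P, so the open question is exactly whether P can be computed in polynomial time.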

So, does anyone have an idea for a solution to this problem, or can anyone prove that it has no polynomial-time solution?
