cancaneed's blog

By cancaneed, history, 20 months ago, In English

There are $$$n$$$ points on a line; the position of the $$$i$$$-th point is $$$p[i]$$$. There are $$$q$$$ queries; in the $$$i$$$-th one you are given a segment length $$$D[i]$$$ and must answer: at least how many segments of that length are needed to cover all the points?

$$$ 1 \le n \le 10^5 $$$, $$$ 1 \le p[i] \le 10^9 $$$, $$$ 1 \le q \le 10^5 $$$
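For example (assuming a segment of length $$$D$$$ may be placed anywhere and covers a closed interval of that length): with points at $$$1, 3, 7$$$ and $$$D = 2$$$, the answer is $$$2$$$, e.g. $$$[1, 3]$$$ covers the first two points and $$$[5, 7]$$$ covers the last one.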


»
20 months ago, # |

I got a weird heuristic solution, but I don't know how to hack it or prove it.

So the first observation is that if $$$q = 1$$$, there is a greedy solution: sort the $$$p$$$ array, and for every segment put its left endpoint on the leftmost point that is not covered yet.

Let's start with $$$D[i]$$$ smaller than every gap between consecutive points (say $$$D[i] = 0$$$); the answer is clearly $$$n$$$.

Then increase $$$D[i]$$$. For the answer to change, we only need to increase $$$D[i]$$$ to the smallest difference among all the current points.

Then do a brute-force reconstruction at each point whose difference equals the minimum. If the reconstructed point overlaps with the original point, break and reconstruct at the next point whose difference equals the minimum.

I guess it's $$$O(n \log^2 n + q \log n)$$$.

  • »
    »
    20 months ago, # ^ |

    If you want to maintain optimality of the construction, then whenever $$$D[i]$$$ changes, all segments after the first changed point must be modified too, right? How are you keeping track of all of this?

    • »
      »
      »
      20 months ago, # ^ |

      It only increases to the minimum difference value each time, so nothing changes before the first modified point.

      The lengths of the segments may change, but since none of the lengthened segments reaches the next point, the answer doesn't change by the greedy above, so there is nothing to maintain.

      • »
        »
        »
        »
        20 months ago, # ^ |

        But since the lengths of the segments change, the starting points of the segments further to the right obviously change too. How do you maintain that?

»
20 months ago, # |

I’m not sure if this fits, but…

try to use this
  • »
    »
    20 months ago, # ^ |

    Still no idea. This is an old problem I didn't solve, and I've already forgotten the source.

    • »
      »
      »
      20 months ago, # ^ |

      Here was my idea:

      1. Sort the queries by $$$D[i]$$$ and the points by $$$p[i]$$$, both in increasing order.
      2. For each query, if $$$D[i]$$$ is less than $$$\sqrt{10^9}$$$, then solve it in $$$O(n)$$$ by the simple greedy.
      3. For all other queries, keep a list of at most $$$\sqrt{10^9}$$$ pointers tracking the leftmost endpoints of the ranges. As you go from one query to the next, advance each of the pointers to the right to transition to the new solution. Each pointer only goes to the right, therefore it advances at most $$$n$$$ times.

      Total complexity is $$$O((n + q) \sqrt{10^9})$$$ which might or might not fit in the TL, depending on how tight it is.
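      In code, I imagine the above looking roughly like this (just a sketch, not tested on the original problem; the I/O format, the duplicate-length cache and all identifiers are my own assumptions):

      ```cpp
      #include <bits/stdc++.h>
      using namespace std;

      int main() {
          int n, q;
          cin >> n >> q;
          vector<long long> p(n);
          for (auto &x : p) cin >> x;
          sort(p.begin(), p.end());                    // points in increasing order

          vector<pair<long long, int>> qs(q);          // (D, original index)
          for (int i = 0; i < q; i++) { cin >> qs[i].first; qs[i].second = i; }
          sort(qs.begin(), qs.end());                  // process queries by increasing D

          const long long B = 31623;                   // ~ sqrt(1e9)
          vector<long long> ans(q);
          vector<int> start(1, 0);                     // start[k] = index of the first point of the (k+1)-th segment
          long long lastD = -1, lastAns = -1;

          for (auto [D, id] : qs) {
              if (D == lastD) { ans[id] = lastAns; continue; }  // cache repeated lengths
              long long cnt;
              if (D < B) {
                  // small length: plain O(n) greedy scan
                  cnt = 0;
                  for (int i = 0; i < n; ) {
                      long long reach = p[i] + D;      // this segment covers [p[i], p[i] + D]
                      cnt++;
                      while (i < n && p[i] <= reach) i++;
                  }
              } else {
                  // large length: at most ~sqrt(1e9) segments, so walk the chain of
                  // start pointers; each pointer only ever moves to the right
                  cnt = 1;
                  int prev = 0;                        // the first segment always starts at p[0]
                  for (int k = 1; ; k++) {
                      if ((int)start.size() <= k) start.push_back(start[k - 1]);
                      int j = max(start[k], prev + 1);
                      while (j < n && p[j] <= p[prev] + D) j++;  // first point not covered
                      start[k] = j;
                      if (j == n) break;               // everything covered by cnt segments
                      cnt++;
                      prev = j;
                  }
              }
              ans[id] = lastAns = cnt;
              lastD = D;
          }
          for (int i = 0; i < q; i++) cout << ans[i] << '\n';
      }
      ```

      With the duplicate-length cache, the small-length branch runs at most about $$$\sqrt{10^9}$$$ full scans, which is what the $$$O((n + q)\sqrt{10^9})$$$ estimate relies on.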

»
20 months ago, # |

If $$$q = 1$$$, you can just sort the $$$p$$$ array and greedily put the left endpoint of each segment on the leftmost point you haven't covered yet. The complexity is $$$O(ans \cdot \log n)$$$ and the implementation is very simple: you just spam upper_bound.
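A minimal sketch of that single-length greedy (the function name is mine; it assumes $$$p$$$ is already sorted and that a segment placed at $$$x$$$ covers the closed interval $$$[x, x + l]$$$):

```cpp
#include <bits/stdc++.h>
using namespace std;

// One query: each segment starts exactly at the leftmost point not covered yet,
// and upper_bound jumps past everything that segment covers, so the loop runs
// ans times and the whole thing costs O(ans * log n).
long long countSegments(const vector<long long>& p, long long l) {
    long long cnt = 0;
    auto it = p.begin();
    while (it != p.end()) {
        cnt++;
        it = upper_bound(it, p.end(), *it + l);  // first point to the right of this segment
    }
    return cnt;
}
```

Calling this once per query would be $$$O(q \cdot ans \cdot \log n)$$$ in the worst case, hence the $$$T$$$ array below.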

Now, if $$$1 \leq q \leq 10^{5}$$$, we won't be answering queries straight away. Define $$$T[i]$$$, $$$1 \leq i \leq n$$$, as the minimum $$$l$$$ such that all points can be covered with no more than $$$i$$$ segments of length $$$l$$$. We can find each $$$T[i]$$$ by binary searching on $$$l$$$, and then answer each query in $$$O(\log n)$$$ using the array $$$T$$$.

The complexity of this algorithm is $$$O(n \cdot \log^{2} n \cdot \log (\max(D)) + q \cdot \log n)$$$, because for each $$$i$$$ it takes $$$O(i \cdot \log n \cdot \log (\max(D)))$$$ to find $$$T[i]$$$, so the total number of operations is $$$(1 + 2 + 3 + \dots + n) \cdot \log n \cdot \log (\max(D)) = n \cdot \log^{2} n \cdot \log (\max(D))$$$. To answer a query, you just binary search on $$$T$$$, so that part costs $$$O(q \cdot \log n)$$$.

Note that it is important to set the lower bound of the binary search for each $$$1 \leq i < n$$$ to $$$T[i + 1]$$$, since the solution relies on the fact that checking a given $$$l$$$ costs $$$O(ans \cdot \log n)$$$, and we don't want $$$ans$$$ to wander above $$$i$$$.

The complexity is really big, but I think you can apply some heuristic tricks to make the $$$\log$$$ constants much smaller so that it barely fits within the constraints. Edit: sorry, I had a brainfart, my solution was wrong.

»
20 months ago, # |

Here's an algorithm with $$$O(n\sqrt{q}\log n)$$$ running time, which is unlikely to pass but is at least faster than brute force.

We process the queries in increasing length order. Say the current length is $$$\ell$$$. Maintain a dynamic forest, where the parent of a point $$$p$$$ is its successor when executing the greedy algorithm (i.e., the first point after $$$p$$$ that has distance $$$>\ell$$$). Using this dynamic forest, we can simulate each step of greedy in $$$O(\log n)$$$ time.

The issue is that this dynamic forest can change a lot. To handle this, we only update the parent of a node $$$p$$$ if there are at most $$$b$$$ points between $$$p$$$ and its parent, where $$$b$$$ is a parameter to be set later. Otherwise we say $$$p$$$ is "corrupted" and do not explicitly maintain its parent. This way, the parent of a node changes only $$$O(b)$$$ times, and this part takes $$$O(nb\log n)$$$ time overall.

On the other hand, there's a greedy algorithm that runs in $$$O(\mathrm{ans}\log n)$$$ time (repeatedly find the successor). Similarly, we repeatedly find the next corrupted node $$$q$$$ that the greedy algorithm will encounter, which takes $$$O(\log n)$$$ time using the dynamic forest, and then find the parent of $$$q$$$ by a 1D successor search in $$$O(\log n)$$$ time. There are only $$$O(\frac{n}{b})$$$ corrupted nodes in the optimal solution, so each query takes $$$O(\frac{n}{b}\log n)$$$ time. Balancing the terms by setting $$$b=\sqrt{q}$$$, the total running time is $$$O(n\sqrt{q}\log n)$$$.
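To make the balancing explicit: the maintenance cost $$$O(nb\log n)$$$ and the total query cost $$$O(q \cdot \frac{n}{b}\log n)$$$ coincide when $$$nb = \frac{qn}{b}$$$, i.e. $$$b = \sqrt{q}$$$, at which point both terms are $$$O(n\sqrt{q}\log n)$$$.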