Hi everyone!
Recently, at the MIPT Fall training camp, the contest set by Alexander Milanin included a problem from Petr Mitrichev Contest 7. We were given a graph and a set of queries of the form "suppose we removed k ≤ 4 edges from the graph; is the graph still connected?" Here I want to discuss the solution of a more general problem, where edges are added and removed offline, without additional constraints. The first algorithm with this complexity was proposed by David Eppstein in 1992, by reduction to the fully dynamic minimum spanning tree problem, but here we will focus on a simpler algorithm proposed in 2012 by Sergei Burunduk1 Kopeliovich.
Let's assume that there are three types of queries: add an edge (+), remove an edge (-), and report some information about the graph (?); in this case, let it be the number of connected components. Suppose we received k queries. Consider k + 1 points in time: the initial one and one point after each query. For convenience, we transform the add and remove queries into queries of the form "the i-th edge is present in the graph from time l to time r" (!).
Thus, suppose we have a graph G = ⟨V, E⟩ and the set of queries. Let E₀ be the set of edges that are always present (they were there originally and there were no requests for their removal). Let's contract each connected component formed by such edges into a single vertex and build a new graph on these vertices. Also, delete all vertices that are not mentioned in the list of queries (so that we work with a graph of O(k) vertices). Rewrite the queries so that if in the initial graph a query referred to a pair of vertices (a, b), it now refers to the pair (comp(a), comp(b)). Executing the ? queries on the new graph gives exactly the same result as on the initial one. The algorithm is then as follows: divide the time interval currently being processed into two halves and recursively solve first the left half and then the right half, thus obtaining answers for the entire set of queries. The base case is a single point in time, which is answered trivially: at this point the graph fed to the input has no edges, so the answer to any query is the number of vertices of the (compressed) graph. At each step, when processing the time subsegment [l; r), we keep only the vertices mentioned on this subsegment; then the subsegment [l; r) is processed in O(r - l) time, which sums to O(k log k) over all subsegments.
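To make the compression step concrete, here is a minimal Python sketch (the function name `compress` and its interface are illustrative, not taken from any reference implementation): always-present edges are contracted with a disjoint-set union, and query endpoints are relabeled to dense component ids.

```python
def compress(n, always_edges, query_pairs):
    """Contract the edges alive on the whole interval and relabel each
    query pair (a, b) to (comp(a), comp(b)) with dense component ids."""
    parent = list(range(n))

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for a, b in always_edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    comp_id = {}  # dense ids only for components actually mentioned in queries

    def comp(v):
        r = find(v)
        if r not in comp_id:
            comp_id[r] = len(comp_id)
        return comp_id[r]

    return [(comp(a), comp(b)) for a, b in query_pairs]
```

After this step the recursion works on a graph whose size is proportional to the number of vertices mentioned on the current subsegment, which is what makes the O(r - l) per-subsegment bound possible.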
Sergei also proposed a similar algorithm for maintaining biconnected components (and bridges) in a dynamically changing graph offline. You can read about it in his diploma, which is attached below.
To summarize, I would like, as is traditional, to offer several problems on the topic. A training contest was created especially for this. Good Luck & Have Fun!
P.S. More details about the structure and other algorithms for solving the problem (as well as proofs of some trivial facts that have been omitted here) can be found in Burunduk1's diploma.
Nice problem. I thought about a somewhat different solution for the offline problem, which seems a bit easier to me, from a conceptual point of view. Let's construct a segment tree over the k time moments. Then, each operation of type "edge i is available from time L to time R" is inserted in the O(log(k)) segment tree nodes which cover the interval [L,R]. This step takes O(k*log(k)) time and memory.
Then we traverse the segment tree using DFS starting from the root. During the traversal we maintain the connected components of the graph using disjoint sets. When we enter a segment tree node, we perform a Union operation for each edge stored in that node. We also push the successful Union operations onto a stack so that, when we exit the node, we can undo all the Union operations performed on entering it. At a leaf node which corresponds to a "?" operation, the answer is the number of sets in the disjoint-set structure (which we maintain as a separate counter, updated whenever a Union succeeds and whenever a Union is undone).
If we use only union by rank or union by size for the disjoint sets the overall time complexity is O(k*log^2(k)). If we also use path compression then the time complexity should be lower, but I don't know how to compute how low it would go (i.e. I'm not sure if it would become O(k*log(k)*alpha(k)) or something else, where alpha is the inverse Ackermann function).
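For reference, here is a minimal Python sketch of this approach (the function name `offline_connectivity` and its interface are my own, not from any reference solution). The DSU uses union by rank without path compression, so every successful Union can be undone in O(1).

```python
def offline_connectivity(n, edges, queries, k):
    """edges: tuples (u, v, l, r) meaning edge (u, v) is alive during [l, r).
    queries: time moments (0 .. k-1) at which to report #components."""
    size = 1
    while size < k:
        size *= 2
    seg = [[] for _ in range(2 * size)]  # edges stored per segment tree node

    def add(node, nl, nr, l, r, e):
        if r <= nl or nr <= l:
            return
        if l <= nl and nr <= r:
            seg[node].append(e)
            return
        mid = (nl + nr) // 2
        add(2 * node, nl, mid, l, r, e)
        add(2 * node + 1, mid, nr, l, r, e)

    for u, v, l, r in edges:
        add(1, 0, size, l, r, (u, v))

    parent = list(range(n))
    rank_ = [0] * n
    comps = n
    history = []  # stack of successful unions, for rollback

    def find(v):
        # no path compression: keeps every union undoable in O(1)
        while parent[v] != v:
            v = parent[v]
        return v

    def union(a, b):
        nonlocal comps
        ra, rb = find(a), find(b)
        if ra == rb:
            return
        if rank_[ra] < rank_[rb]:
            ra, rb = rb, ra
        parent[rb] = ra
        bumped = rank_[ra] == rank_[rb]
        if bumped:
            rank_[ra] += 1
        history.append((ra, rb, bumped))
        comps -= 1

    ans = {}
    qset = set(queries)

    def dfs(node, nl, nr):
        nonlocal comps
        checkpoint = len(history)
        for u, v in seg[node]:
            union(u, v)
        if nr - nl == 1:
            if nl in qset:
                ans[nl] = comps
        else:
            mid = (nl + nr) // 2
            dfs(2 * node, nl, mid)
            dfs(2 * node + 1, mid, nr)
        while len(history) > checkpoint:  # undo this node's unions
            ra, rb, bumped = history.pop()
            parent[rb] = rb
            if bumped:
                rank_[ra] -= 1
            comps += 1

    dfs(1, 0, size)
    return [ans[q] for q in queries]
```

Since path compression is dropped, `find` is O(log n) worst case, which matches the O(k*log^2(k)) bound above.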
Hi
That seems easier to me too. But can you elaborate on how to undo all the operations done while traversing the segment tree?
Here is what I am doing while traversing the tree: I add all the edges stored at the segment tree node using a disjoint-set union structure (which changes the representative array). To be able to undo these operations, I copy the representative array into another array. Considering that the segment tree can have 4·(3·10^5) nodes and the graph can have 3·10^5 vertices, this will surely TLE.
Can you help me to reduce the time complexity here?
Thanks :D
Correct me if I'm wrong, but I think "segment tree" is an incorrect translation and the actual term is "interval tree" (at least according to this thread)
That is a "segment tree" (线段树 in Chinese); no problem.
I don't think you can easily optimize a DSU data structure with path compression to support undo operations (without writing all the path compressions to memory). Moreover, I highly doubt the "true" complexity would be O(α(n)), as undo operations tend to break all algorithms that rely on amortization. I really wonder whether the author's proposed algorithm truly is O(k log(k)).
There is a provable O(k log k) solution to the offline dynamic connectivity problem, but with a larger constant factor. The key idea is to use a link-cut tree to maintain the maximum spanning tree where each edge's weight is its deletion time. With a piece of link-cut tree code, it's fairly easy to implement.
code: link
You can also use link-cut trees for dynamic bridges (problem D) in . It is tricky to implement, but faster in practice than the divide&conquer + DFS solution.
Is there an English version of the diploma available anywhere?
There is a mistake in the statement of problem B: 1 ≤ M ≤ 10^5, but M in the sample test is 0. Maybe 0 ≤ M ≤ 10^5? :)
Can we use disjoint sets with sqrt decomposition for this problem? A solution to a similar problem (disjoint set union with division into queries) is given on this site: http://e-maxx.ru/algo/sqrt_decomposition, but I didn't understand it completely.
Can anybody give link to a problem "just implement DCP-offline"?
100551A - Connect and Disconnect
Is there any way to get the actual test case content for 100551A — Connect and Disconnect? I can't figure out why I keep getting a runtime error, even though the sample case works just fine; it seems to be some memory issue, and it's quite frustrating.
Did you remember to read/write to input/output files? This problem requires you to do that.
...that's true, thanks a lot Ryan, completely forgot about that. Kudos for the fast response!
You're welcome :)
Is it legal to cast a pointer to an int (in the log2 implementation)? I don't think it is right, but this version of the DSU passes everywhere I submit it.
1303F is another example.
It seems like it's very hard to get dynamic connectivity to pass for that problem.
https://codeforces.net/blog/entry/73812?#comment-580151
https://codeforces.net/blog/entry/73812?#comment-580000
Here's a slightly more code-heavy solution that I haven't seen anyone explain yet:
Suppose we sweep through the time steps and maintain a maximum spanning tree of the currently alive edges, where an edge's weight is its deletion time. When we hit a new edge e=(u,v) that we want to process, we find the edge that gets deleted earliest on the tree path from u to v. If that edge gets deleted before e, we remove it from our MST and replace it with e. Otherwise, all edges on that path outlive e, so we simply do not add e. When we reach a time at which we have to delete an edge, if it is in the MST, we delete it from the MST; otherwise, we skip the query.
Clearly, all that matters here is whether two nodes are connected in the MST: if they are, then they are connected in the graph. If they aren't connected in the MST, then they aren't connected in the graph either; if there were some edge between the two components that got deleted later than the current time, then by the maximality of the spanning tree it would be in the MST right now.
All that is left is finding the earliest-deleted edge along a path, in a data structure that supports adding and deleting edges. We can do this with a link-cut tree fairly easily by conceptually splitting each edge into two edges and one node of degree 2. For instance, edge e=(u,v) would look like [u - e - v], where e is the node in the LCT that represents its corresponding edge.
Can someone please explain this in more detail for me? It seems like a divide-and-conquer over time but the explanation of how the problem is actually divided is missing.
Edit: I figured it out. The step where the always-connected components are contracted is done in every recursive call, not just once at the beginning.
The last problem, Disconnected Graph, might have another solution in $$$O(K \cdot 2^c)$$$.
After building a DFS tree of the graph, we can assign a random value to every non-tree edge, and define the value of each tree edge as the XOR of the values of all the non-tree edges that cover it. Then, when answering a query with a small set of $$$c$$$ edges, we can check each of the $$$2^c$$$ subsets and see whether the XOR of its edges is $$$0$$$. If some nonempty subset XORs to $$$0$$$, then we can almost surely claim that after deleting the queried set the graph will be disconnected.
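Here is a minimal Python sketch of this hashing idea (function names are illustrative; it builds a BFS spanning tree rather than a DFS tree, since the cover-XOR computation works for any spanning tree, and uses 64-bit random values so a false "disconnected" verdict is vanishingly unlikely):

```python
import random
from collections import deque
from itertools import combinations

def edge_hashes(n, edges, seed=None):
    """Assign a random 64-bit value to every non-tree edge and give each
    tree edge the XOR of the values of all non-tree edges whose cycle
    covers it."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    tree = [False] * len(edges)
    par = [(-1, -1)] * n  # (parent vertex, index of edge to parent)
    order = []
    seen = [False] * n
    for root in range(n):          # BFS spanning forest
        if seen[root]:
            continue
        seen[root] = True
        dq = deque([root])
        while dq:
            u = dq.popleft()
            order.append(u)
            for v, i in adj[u]:
                if not seen[v]:
                    seen[v] = True
                    tree[i] = True
                    par[v] = (u, i)
                    dq.append(v)
    h = [0] * len(edges)
    acc = [0] * n
    for i, (u, v) in enumerate(edges):
        if not tree[i]:
            w = rng.getrandbits(64)
            h[i] = w
            acc[u] ^= w            # mark both endpoints of the cycle
            acc[v] ^= w
    for u in reversed(order):      # parents appear before children in BFS order
        p, i = par[u]
        if i >= 0:
            h[i] = acc[u]          # subtree XOR = hash of edge (u, parent)
            acc[p] ^= acc[u]
    return h

def disconnects(h, subset):
    """True (with high probability) iff deleting exactly this edge set
    disconnects the graph: some nonempty sub-subset must XOR to 0."""
    for r in range(1, len(subset) + 1):
        for comb in combinations(subset, r):
            x = 0
            for i in comb:
                x ^= h[i]
            if x == 0:
                return True
    return False
```

With $$$c \le 4$$$ as in the original problem, each query checks at most $$$2^c - 1 = 15$$$ nonempty subsets; a bridge gets hash $$$0$$$ and is caught by its singleton subset.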