Hello CodeForces!
Instead of posting a not-so-meaningful blog that helps farm my contribution, today I would like the CodeForces community to benefit at the same time. Therefore, I am writing about one of my favourite techniques on trees, the Kruskal Reconstruction Tree (KRT). Although this technique is not used very often in CodeForces rounds, it has become more and more popular in other competitions such as the IOI. I also found that the resources about this technique written in English are very limited, which is another reason I wanted to write a blog on the KRT. If you find this blog useful, please DOWNVOTE it. It will give me immense support to cover other topics like this in the future.
Prerequisites:

- Finding the MST using Kruskal's algorithm
- Disjoint Set Union (DSU)
- Finding the lowest common ancestor (LCA) with $$$O(n \log n)$$$ precomputation time and $$$O(\log n)$$$ time per query
Given a graph with $$$N$$$ nodes and $$$M$$$ weighted, bidirectional edges, we want to find the maximum possible weight of the lightest edge on a path between two nodes $$$u$$$ and $$$v$$$, over all such paths (or the opposite, that is, the minimum possible weight of the heaviest edge). We aim to do this in $$$O(n \log n)$$$ precomputation time and $$$O(\log n)$$$ time for answering each query. For example, if we have this graph:
then the maximum possible weight of the lightest edge between $$$2$$$ and $$$6$$$ is $$$5$$$, because the path $$$6-1-5-2$$$ achieves $$$\min(7,6,5)=5$$$. Similarly, the maximum possible weight of the lightest edge between $$$2$$$ and $$$8$$$ is $$$5$$$, because the path $$$2-5-4-3-8$$$ achieves $$$\min(5,8,6,6)=5$$$.
To find the maximum possible weight of the lightest edge, we do something similar to finding the minimum spanning tree with Kruskal's algorithm. That is, we sort the edges in decreasing order of their weights and process them one by one: if the two endpoints of an edge are already connected, we skip the edge; otherwise, we connect the two components using this edge.
Therefore, it can clearly be seen that among the $$$M$$$ edges, only the $$$N-1$$$ edges that appear in the maximum spanning tree are actually useful to us.
Now here comes the fun part: we transform the spanning tree with $$$N$$$ nodes into a binary tree with $$$2N-1$$$ nodes (that is, we add $$$N-1$$$ auxiliary nodes). In particular, every leaf of the new binary tree is a node from the original tree, and every non-leaf node of the new binary tree is a newly added auxiliary node. Each auxiliary node is assigned a value, and edges no longer carry weights in the new tree.
So how do we do the transformation? It is actually very similar to running Kruskal's algorithm for the MST. Here is pseudocode for the transformation process.
initially the new graph has n nodes and no edges
sort the edges from the original graph in descending order of their weights
for each edge e:
    let u and v be the nodes connected by the edge e
    if u and v are not connected in the new graph:
        create a new auxiliary node
        assign the value of the new auxiliary node to be the weight of edge e
        assign the left child of the new auxiliary node to be the root of the tree which contains u
        assign the right child of the new auxiliary node to be the root of the tree which contains v
        make the new auxiliary node the root of the merged tree (i.e. union the two components under it)
Clearly, $$$N-1$$$ new nodes will be created, and (assuming the original graph is connected) all nodes will be connected in the end. So, the new tree for the graph given above is as follows:
Here, the nodes $$$A_1$$$ to $$$A_8$$$ are the $$$8$$$ auxiliary nodes: $$$A_1$$$ is the first auxiliary node constructed, then $$$A_2$$$, then $$$A_3$$$, and so on, with $$$A_8$$$ being the last one. The values of $$$A_1,A_2,A_3,A_4,A_5,A_6,A_7,A_8$$$ are $$$9,8,7,6,6,6,5,2$$$ respectively. It is also worth noting that the value of $$$A_i$$$ is at least the value of $$$A_j$$$ for all $$$1 \le i < j \le N-1$$$; in particular, every auxiliary node's value is at most the values of the auxiliary nodes in its subtree.
Note that the newly constructed tree may not be unique when at least two edges have the same weight.
As you might have already guessed, the required value between two nodes $$$u$$$ and $$$v$$$ is nothing but the value of the node $$$\mathrm{LCA}(u,v)$$$ in the new tree. If this is not obvious to you, think from the perspective of Kruskal's algorithm for the MST: the LCA is exactly the auxiliary node created by the edge that first merged the components containing $$$u$$$ and $$$v$$$, and that edge is the bottleneck of the optimal path.
Among all the tree algorithms out there, the KRT is, in my opinion, one of the easier ones to implement. I will provide C++ source code here for your reference.
#include <bits/stdc++.h>
using namespace std;

// insert DSU template here
//   dsu_parent(x)  : returns the representative of x's component
//   dsu_union(a, b): merges the components of a and b so that b's representative
//                    becomes the representative of the merged component (no union by size/rank!)
// insert LCA (binary lifting) template here
//   lca_parent[v][0] is the direct parent of v in the KRT; LCA(u, v) returns the lowest common ancestor

int N, M;
vector<pair<pair<int, int>, int>> edge; // ((u, v), weight)
vector<int> value;                      // value[x] = weight stored on auxiliary node x

bool cmp(const pair<pair<int, int>, int>& a, const pair<pair<int, int>, int>& b) {
    return a.second > b.second; // descending order of weights
}

void construct() {
    sort(edge.begin(), edge.end(), cmp);
    int new_node_label = N + 1; // auxiliary nodes get labels N+1 .. 2N-1
    // initialize DSU of 2N-1 nodes
    for (auto e : edge) {
        if (dsu_parent(e.first.first) == dsu_parent(e.first.second)) {
            continue; // already in the same component, this edge is useless
        }
        value[new_node_label] = e.second;
        // the current DSU representatives are exactly the roots of the two subtrees being merged
        lca_parent[dsu_parent(e.first.first)][0] = new_node_label;
        lca_parent[dsu_parent(e.first.second)][0] = new_node_label;
        dsu_union(e.first.first, new_node_label);  // don't mess up the order!
        dsu_union(e.first.second, new_node_label); // e.first.first and e.first.second are the new children, new_node_label is the new parent
        new_node_label++;
    }
    // precompute the lca_parent[][] table
}

int query(int u, int v) {
    return value[LCA(u, v)];
}

void input() {
    cin >> N >> M; // number of nodes and edges
    value.assign(2 * N, 0);
    for (int i = 0; i < M; i++) {
        int u, v, weight;
        cin >> u >> v >> weight;
        edge.push_back(make_pair(make_pair(u, v), weight));
    }
}
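For completeness, here is one possible way to fill in the two placeholder templates above; these snippets would go where the two "insert ... template here" comments are. This is only a minimal sketch under the assumption that the original graph is connected, and the names (dsu, dep, dsu_parent, dsu_union, lca_parent, LCA, LOG, init_krt_structures, lca_precompute) are my own choices, not a standard API:

const int LOG = 20;                 // enough for 2N-1 up to about 10^6 nodes
vector<int> dsu;                    // dsu[x] = parent of x inside the DSU forest
vector<array<int, LOG>> lca_parent; // lca_parent[v][k] = 2^k-th ancestor of v in the KRT
vector<int> dep;                    // dep[v] = depth of v in the KRT

int dsu_parent(int x) {
    return dsu[x] == x ? x : dsu[x] = dsu_parent(dsu[x]); // find with path compression
}

void dsu_union(int a, int b) {
    // attach a's representative under b's representative, so b's representative survives;
    // this (and not union by size/rank) is exactly the order construct() relies on
    dsu[dsu_parent(a)] = dsu_parent(b);
}

void init_krt_structures() { // corresponds to "initialize DSU of 2N-1 nodes" in construct()
    int total = 2 * N - 1;
    dsu.resize(total + 1);
    iota(dsu.begin(), dsu.end(), 0);
    lca_parent.assign(total + 1, array<int, LOG>{});
    dep.assign(total + 1, 0);
    for (int v = 1; v <= total; v++) lca_parent[v][0] = v; // only the root keeps this self-loop
}

void lca_precompute() { // corresponds to "precompute the lca_parent[][] table" in construct()
    int total = 2 * N - 1;
    // every parent has a larger label than its child, so one backward pass fixes all depths
    for (int v = total - 1; v >= 1; v--)
        if (lca_parent[v][0] != v)
            dep[v] = dep[lca_parent[v][0]] + 1;
    for (int k = 1; k < LOG; k++)
        for (int v = 1; v <= total; v++)
            lca_parent[v][k] = lca_parent[lca_parent[v][k - 1]][k - 1];
}

int LCA(int u, int v) {
    if (dep[u] < dep[v]) swap(u, v);
    for (int k = LOG - 1; k >= 0; k--)   // lift u to the depth of v
        if (dep[u] - (1 << k) >= dep[v])
            u = lca_parent[u][k];
    if (u == v) return u;
    for (int k = LOG - 1; k >= 0; k--)   // lift both while their ancestors differ
        if (lca_parent[u][k] != lca_parent[v][k]) {
            u = lca_parent[u][k];
            v = lca_parent[v][k];
        }
    return lca_parent[u][0];
}

With these names, a typical driver would call input(), then construct() (calling init_krt_structures() at the beginning and lca_precompute() at the end), and finally answer queries with query(u, v); for the example graph above, query(2, 6) would return $$$5$$$.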
Finally, we will look into the task "Werewolf", which appeared as IOI 2018 Day 1 Problem 3. This problem was the third easiest task of that year's competition, and 17 contestants got full marks on it. Although the official editorial describes a solution that does not require the Kruskal Reconstruction Tree, this problem can be solved effectively with the help of the KRT.
Abridged Problem Statement: You are given a connected graph of $$$N$$$ nodes and $$$M$$$ unweighted, bidirectional edges. Your task is to start travelling from node $$$S$$$ in human form and end your trip at node $$$E$$$ in wolf form. During the trip, you must pick exactly one node on the path and transform from human form to wolf form at that node. Humans can only visit nodes numbered $$$\ge L$$$, and wolves can only visit nodes numbered $$$\le R$$$. Determine whether this is possible. You have to answer $$$Q$$$ queries, each given in the form $$$S_i,E_i,L_i,R_i$$$.
A very common and clever trick for this type of problem (changing from one form to another along a path) is to construct two separate graphs, one for the human form and one for the wolf form. For each edge in the original graph that connects two nodes $$$u$$$ and $$$v$$$, assign the edge weight $$$\min(u,v)$$$ in the human graph and $$$\max(u,v)$$$ in the wolf graph. Then, sort the edges of the human graph in descending order of their weights, and sort the edges of the wolf graph in ascending order of their weights. This is useful because the weight correctly represents the smallest (respectively largest) node label you are forced to touch when walking through that edge, which motivates building a KRT for each of the two graphs, as sketched below.
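As a rough illustration (the function and variable names here are my own and purely illustrative), the two weighted edge lists can be prepared like this, reusing the ((u, v), weight) format of the template above:

// Sketch: build the two weighted edge lists for Werewolf from the unweighted input edges.
void split_graphs(const vector<pair<int, int>>& edges_in,
                  vector<pair<pair<int, int>, int>>& human,
                  vector<pair<pair<int, int>, int>>& wolf) {
    for (auto& [u, v] : edges_in) {
        human.push_back({{u, v}, min(u, v)}); // smallest label a human is forced to touch on this edge
        wolf.push_back({{u, v}, max(u, v)});  // largest label a wolf is forced to touch on this edge
    }
}

One KRT is then built per list with the construct() routine from earlier: the human KRT with its edges sorted in descending order of weight, and the wolf KRT with its edges sorted in ascending order of weight, each with its own value[] and lca_parent[][] tables.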
However, this problem requires a bit more information to be stored on each of the auxiliary nodes. Apart from storing the weight of the edge that connects its two children, let's also store, for each auxiliary node, the set of leaf nodes that are its descendants. (Note: these sets cannot be stored naively at every vertex; try to think of a more memory-efficient way to record this information while performing a DFS over the new tree.) For the last step, we simply find the furthest ancestor of $$$S$$$ in the human KRT whose value is $$$\ge L$$$ and the furthest ancestor of $$$E$$$ in the wolf KRT whose value is $$$\le R$$$, and the answer to the query is YES if and only if the two corresponding sets of leaf nodes have at least one element in common. This can be computed efficiently with either a binary indexed tree or a wavelet tree, in $$$O(n \log n)$$$ precomputation and $$$O(\log n)$$$ per query.
The full implementation is left as an exercise for the reader; a small sketch of the ancestor-jumping step is given below.
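This is only a sketch of one building block, assuming the lca_parent[][] and value[] tables of the corresponding KRT (human or wolf) are in scope; topmost_valid_ancestor is my own name, and the predicate ok would be something like value[x] >= L for the human KRT and value[x] <= R for the wolf KRT:

// Sketch: binary-lifting jump to the furthest ancestor whose value still satisfies the limit.
// Going up the KRT the values only become more restrictive, so greedily taking every valid
// 2^k jump lands exactly on the topmost allowed ancestor.
template <class Pred>
int topmost_valid_ancestor(int v, Pred ok) {
    for (int k = LOG - 1; k >= 0; k--) {
        int up = lca_parent[v][k];
        if (ok(up)) v = up;
    }
    return v;
}

Calling this once from $$$S$$$ on the human KRT (with the $$$\ge L$$$ check) and once from $$$E$$$ on the wolf KRT (with the $$$\le R$$$ check) gives the two subtrees from the paragraph above; the query then reduces to checking whether those two subtrees share a leaf.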
Although the Kruskal Reconstruction Tree is not a technique that appears super frequently in contests, it is nevertheless quite an elegant algorithm that does not require overly complex prerequisites or ideas. I find the KRT to be one of the most interesting tree algorithms, and through this blog, I hope you have been able to learn the basics of the KRT. Once again, thank you for reading this blog, and if you have read this far, please do me a favour and DOWNVOTE this blog so that I can reach rank one on the contribution leaderboard. Your effort is very much appreciated.
I wish you every success in the upcoming contests!