How can I find the minimum number of edges that need to be added to a tree so that it becomes 2-edge-connected, i.e., so that if we remove any single edge from the resulting graph, it remains connected? Please give me some idea of how to do that! Thanks in advance :)
See the solutions to the problem called network here. Basically, one way is to find a node x such that if x and all its edges are removed, no more than half of the leaf nodes of the original tree end up in a single connected component. This can be done with an algorithm similar to finding the centroid of a tree. Then the edges to add can be found greedily by connecting two leaves from the two components with the most remaining leaves and repeating.
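For concreteness, here is a rough sketch of that approach as I read it (my own code, not mkirsche's): find a vertex x such that no component of the tree minus x keeps more than half of the original leaves, then repeatedly connect one unmatched leaf from each of the two components that still have the most unmatched leaves. It assumes a tree on n >= 3 vertices given as an edge list on 1..n; the handling of a possible leftover leaf at the end is just one reasonable choice.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<vector<int>> adj(n + 1);
    for (int i = 0; i < n - 1; i++) {
        int u, v;
        cin >> u >> v;
        adj[u].push_back(v);
        adj[v].push_back(u);
    }

    // leafCnt[u] = number of original leaves in the subtree of u, rooted at 1.
    int totalLeaves = 0;
    vector<int> leafCnt(n + 1, 0);
    function<void(int, int)> countLeaves = [&](int u, int p) {
        if ((int)adj[u].size() == 1) { leafCnt[u] = 1; totalLeaves++; }
        for (int v : adj[u]) if (v != p) { countLeaves(v, u); leafCnt[u] += leafCnt[v]; }
    };
    countLeaves(1, 0);

    // Find x: after deleting x, no component keeps more than half of the leaves.
    // Same idea as finding the centroid, but weighted by leaf counts.
    int x = 1;
    function<void(int, int)> findX = [&](int u, int p) {
        int worst = totalLeaves - leafCnt[u];           // the component "above" u
        for (int v : adj[u]) if (v != p) worst = max(worst, leafCnt[v]);
        if (2 * worst <= totalLeaves) x = u;
        for (int v : adj[u]) if (v != p) findX(v, u);
    };
    findX(1, 0);

    // Collect the original leaves of every component of the tree minus x.
    vector<vector<int>> comp;
    function<void(int, int)> collect = [&](int u, int p) {
        if ((int)adj[u].size() == 1) comp.back().push_back(u);
        for (int v : adj[u]) if (v != p && v != x) collect(v, u);
    };
    for (int v : adj[x]) { comp.push_back({}); collect(v, x); }

    // Greedily connect one leaf from each of the two components that still
    // have the most unmatched leaves.
    vector<int> used(comp.size(), 0);                   // matched leaves per component
    priority_queue<pair<int, int>> pq;                  // (unmatched leaves, component id)
    for (int i = 0; i < (int)comp.size(); i++) pq.push({(int)comp[i].size(), i});
    while (pq.size() >= 2) {
        auto [ca, a] = pq.top(); pq.pop();
        auto [cb, b] = pq.top(); pq.pop();
        cout << comp[a][used[a]++] << ' ' << comp[b][used[b]++] << '\n';
        if (ca - 1 > 0) pq.push({ca - 1, a});
        if (cb - 1 > 0) pq.push({cb - 1, b});
    }
    // With an odd number of leaves one leaf remains; connect it to a leaf of a
    // different component (or to x itself if no other component exists).
    if (!pq.empty()) {
        int c = pq.top().second, partner = x;
        for (int i = 0; i < (int)comp.size(); i++)
            if (i != c && !comp[i].empty()) partner = comp[i][0];
        cout << comp[c][used[c]] << ' ' << partner << '\n';
    }
}
```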
Isn't that whole centroid-finding step unnecessary overkill?
All you need to do is add the new edges in such a way that each old edge will lie on some cycle.
To do that, just root the tree at any non-leaf vertex and run a plain DFS. Each time you encounter a leaf, append its number to leaves[]. At the end, let s = leaves.size()/2. For each valid i, connect leaves[i] and leaves[i+s].
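For illustration, here is a minimal self-contained sketch of that pairing (my own code, not misof's): it assumes a tree on n >= 3 vertices given as an edge list on 1..n, and prints the ceil(L/2) edges to add, where L is the number of leaves.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;                                      // number of vertices, assumed >= 3
    cin >> n;
    vector<vector<int>> adj(n + 1);
    for (int i = 0; i < n - 1; i++) {
        int u, v;
        cin >> u >> v;
        adj[u].push_back(v);
        adj[v].push_back(u);
    }

    // Root the tree at any non-leaf vertex (one exists because n >= 3).
    int root = 1;
    while ((int)adj[root].size() == 1) root++;

    // Plain DFS; leaves are appended in visiting order, so the leaves of any
    // subtree occupy a contiguous block of the array.
    vector<int> leaves;
    function<void(int, int)> dfs = [&](int u, int p) {
        if ((int)adj[u].size() == 1) leaves.push_back(u);
        for (int v : adj[u]) if (v != p) dfs(v, u);
    };
    dfs(root, 0);

    // Connect leaves[i] with leaves[i + s]; this adds ceil(L/2) new edges and
    // puts every old edge on a cycle.
    int s = (int)leaves.size() / 2;
    for (int i = 0; i + s < (int)leaves.size(); i++)
        cout << leaves[i] << ' ' << leaves[i + s] << '\n';
}
```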
Why does this work? Consider any edge (u, parent(u)) in the original tree. The leaves in vertex u's subtree form a contiguous interval in the array leaves, and since the tree is rooted at a non-leaf vertex, this interval is non-empty and does not contain all the leaves. Then we can easily see that at least one of the new edges we added has exactly one endpoint in this interval. Each such new edge forms a cycle that contains the old edge (u, parent(u)).

Hi, I have some trouble with that problem. I use mkirsche's idea of finding the centroid and after that match leaves from different subtrees.
For the matching I use misof's method of connecting leaves[i] and leaves[i+s].
However, I'm getting WA. Now I know that the centroid step is overkill, but it should still be correct. I would really appreciate it if you could help me find my error.
I posted my code and method yesterday in CF POST
Thanks
I've reduced my code to the following CODE
I assign a different color to each neighbor of the centroid (root) and propagate that color through its subtree.
I build a vector of pairs <color of i, i> in the DFS, so all the leaves of the same color are adjacent. Matching leaves[i] and leaves[i+s] with s = leaves.size()/2, I get AC.
However, when I sort the vector of leaves (line 79 of my code), I get WA. As I see it, sorting the vector of pairs doesn't change the order of the colors, only the order between nodes of the same color, so the matching should still work (it still matches two leaves of different colors).
I can't understand why my solution is wrong; I hope you can help me. P.S.: Maybe you don't remember me, but I am one of the students who attended your training camp in Peru last year.
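For reference, here is a small sketch of the coloring-and-matching construction described in this comment (my own reconstruction, not the code linked above). The names adj and centroid are assumed inputs: the tree's adjacency list and the already-found centroid. It colors each leaf by the centroid's neighbor whose subtree contains it, sorts the (color, node) pairs, and matches leaves[i] with leaves[i+s].

```cpp
#include <bits/stdc++.h>
using namespace std;

// Returns the list of edges to add, given the tree and its (leaf) centroid.
vector<pair<int, int>> colorAndMatch(const vector<vector<int>>& adj, int centroid) {
    vector<pair<int, int>> leaves;               // (color of leaf, leaf)

    // Give each neighbor of the centroid its own color and propagate it
    // through that neighbor's subtree, collecting original leaves.
    function<void(int, int, int)> dfs = [&](int u, int p, int col) {
        if ((int)adj[u].size() == 1) leaves.push_back({col, u});
        for (int v : adj[u]) if (v != p) dfs(v, u, col);
    };
    int color = 0;
    for (int v : adj[centroid]) dfs(v, centroid, color++);

    // Sorting by (color, node) keeps each color in one contiguous block;
    // it only reorders nodes within the same block.
    sort(leaves.begin(), leaves.end());

    // Match leaf i with leaf i+s. Because the centroid guarantees that no
    // color block is longer than s = leaves.size()/2, the two endpoints of
    // every added edge have different colors.
    int s = (int)leaves.size() / 2;
    vector<pair<int, int>> added;
    for (int i = 0; i + s < (int)leaves.size(); i++)
        added.push_back({leaves[i].second, leaves[i + s].second});
    return added;
}
```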