Given a graph with n (≤ 100) vertices and m (≤ n(n−1)/2) edges, you are asked to find a spanning tree that minimizes the difference between the largest and the smallest edge weight in the tree.
The problem can be found here (UVA 1395). I can't figure out an efficient way to solve it. Any help is appreciated.
Update: Got AC using fake123_loves_me's approach. This is my code if anyone is interested.
You can do it like this: sort the edges by cost, then do a two-pointer sweep with a link-cut tree. While the vertices remain connected, cut the smallest edge in the current window (and make sure to add the last one you erased back once connectivity breaks). O(M log N).
I'm not familiar with link-cut tree. Could you please explain your approach a bit more?
You can solve it with $$$O(M^2 \cdot \alpha(N))$$$ time complexity.
At the beginning, let's sort the edges by cost. Now iterate over them, fixing each edge in turn as the minimum-cost edge of the spanning tree. For each fixed edge, try to build a spanning tree using only the edges from the current one to the last; Kruskal's algorithm does this in $$$O(M \cdot \alpha(N))$$$. Whenever a spanning tree is successfully built, compute the difference between its maximum and minimum edge costs and compare it with the answer so far.
That's brilliant! Thanks a lot.
A more efficient solution does exist: treat it as a dynamic connectivity problem in $$$O(M \cdot \alpha(N) \cdot \log(M))$$$.
Key idea: if you want to increase the weight of the minimum edge, you don't need to rebuild the whole spanning tree from scratch.
Interesting. I'll look into it. Thanks.