I tried solving the problem the following way:
1) Using DFS, I build the Euler tour of the tree, the height (depth) of each node, and the first and last occurrence of every node in the Euler tour.
2) I build a segment tree that returns the minimum-depth node in an interval of the Euler tour, along with its index.
3) I build another segment tree that returns the number of different colors within an interval of the Euler tour array (I used BitSets in Java here).
4) For each query, I first calculate the LCA of the given nodes, call it X, using the first segment tree.
5) Then, using the first and last positions of X in the Euler tour, I calculate the number of different colors in that interval using the second segment tree.
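Roughly, the LCA part (steps 1, 2 and 4) looks like the sketch below. This is not my exact submission; MAXN, the node labels and rooting the tree at node 1 are just assumptions for illustration.

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 100005;                 // assumed limit on n
vector<int> adj[MAXN];
int depthOf[MAXN], firstPos[MAXN], lastPos[MAXN];   // lastPos is used later for the color step
vector<int> euler;                       // node visited at each step of the Euler tour

void dfs(int u, int parent, int d) {
    depthOf[u] = d;
    firstPos[u] = euler.size();
    euler.push_back(u);
    for (int v : adj[u]) {
        if (v == parent) continue;
        dfs(v, u, d + 1);
        euler.push_back(u);              // come back to u after each child
    }
    lastPos[u] = euler.size() - 1;
}

// Segment tree over the Euler tour storing the node of minimum depth in each range.
int segTree[4 * 2 * MAXN];

void build(int node, int l, int r) {
    if (l == r) { segTree[node] = euler[l]; return; }
    int mid = (l + r) / 2;
    build(2 * node, l, mid);
    build(2 * node + 1, mid + 1, r);
    int a = segTree[2 * node], b = segTree[2 * node + 1];
    segTree[node] = (depthOf[a] < depthOf[b]) ? a : b;
}

int query(int node, int l, int r, int ql, int qr) {
    if (qr < l || r < ql) return -1;
    if (ql <= l && r <= qr) return segTree[node];
    int mid = (l + r) / 2;
    int a = query(2 * node, l, mid, ql, qr);
    int b = query(2 * node + 1, mid + 1, r, ql, qr);
    if (a == -1) return b;
    if (b == -1) return a;
    return (depthOf[a] < depthOf[b]) ? a : b;
}

int lca(int u, int v) {
    int l = min(firstPos[u], firstPos[v]);
    int r = max(firstPos[u], firstPos[v]);
    return query(1, 0, (int)euler.size() - 1, l, r);
}

int main() {
    // tiny usage example: edges 1-2, 2-3, 2-4
    int edges[][2] = {{1, 2}, {2, 3}, {2, 4}};
    for (auto &e : edges) { adj[e[0]].push_back(e[1]); adj[e[1]].push_back(e[0]); }
    dfs(1, 0, 0);
    build(1, 0, (int)euler.size() - 1);
    cout << lca(3, 4) << "\n";           // prints 2
    return 0;
}
```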
But unfortunately I have been getting TLE on test case 15. I tried optimising it a lot, but it's not working. Any help would be appreciated!
UPDATE: Instead of using a segment tree to find the number of different colors, I now compute, for each node, the number of different colors in its subtree with a BitSet during the DFS itself. So I only compute the LCA in O(log n) time and answer the number of different colors in a subtree in O(1). But guess what? Now I'm getting NZEC on the same test case 15, and I don't understand why :(. Any help would be appreciated :)
UPDATE: Got accepted in C++ with the same logic!
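For reference, the updated idea in C++ boils down to something like the sketch below (not my actual submission). MAXN and MAXC are assumed limits, and keeping one bitset per node only fits in memory when the number of distinct colors is modest.

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 100005;                  // assumed limit on n
const int MAXC = 1024;                    // assumed small number of distinct colors
                                          // memory is roughly MAXN * MAXC / 8 bytes

vector<int> adj[MAXN];
int color[MAXN];
int distinctInSubtree[MAXN];              // answer for "how many colors in the subtree of u"
bitset<MAXC> seen[MAXN];                  // color set of each subtree

void dfs(int u, int parent) {
    seen[u].set(color[u]);
    for (int v : adj[u]) {
        if (v == parent) continue;
        dfs(v, u);
        seen[u] |= seen[v];               // merge the child's color set into u's
    }
    distinctInSubtree[u] = seen[u].count();
}

int main() {
    // tiny usage example: node 1 with children 2 and 3, colors 1, 2, 1
    adj[1] = {2, 3}; adj[2] = {1}; adj[3] = {1};
    color[1] = 1; color[2] = 2; color[3] = 1;
    dfs(1, 0);
    cout << distinctInSubtree[1] << "\n"; // prints 2
    return 0;
}
```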
Maybe you can make your solution slightly faster by replacing the minimum segment tree with a sparse table. Also, it would be easier to help you if you shared your source code.
Won't the build time of the sparse table hurt the complexity?
A sparse table is built in O(n log n) time, but with it you can find the LCA in O(1) per query, so the overall complexity stays the same; it's just faster because queries are answered more quickly and the constant factor is smaller.
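Something like this minimal sketch, assuming the Euler tour, depths and first occurrences from your step 1 are already computed (names are illustrative); it can replace the segment-tree build and query for the LCA part.

```cpp
#include <bits/stdc++.h>
using namespace std;

vector<int> euler;                  // nodes in visit order, length m = 2n - 1
vector<int> depthOf, firstPos;      // depth of each node, first index in euler
vector<vector<int>> sparse;         // sparse[k][i] = min-depth node in euler[i .. i + 2^k - 1]
vector<int> lg;

void buildSparseTable() {
    int m = euler.size();
    lg.assign(m + 1, 0);
    for (int i = 2; i <= m; i++) lg[i] = lg[i / 2] + 1;
    int K = lg[m] + 1;
    sparse.assign(K, vector<int>(m));
    for (int i = 0; i < m; i++) sparse[0][i] = euler[i];
    for (int k = 1; k < K; k++)
        for (int i = 0; i + (1 << k) <= m; i++) {
            int a = sparse[k - 1][i], b = sparse[k - 1][i + (1 << (k - 1))];
            sparse[k][i] = (depthOf[a] < depthOf[b]) ? a : b;
        }
}

// O(1) LCA: min-depth node between the first occurrences of u and v in the Euler tour.
int lca(int u, int v) {
    int l = min(firstPos[u], firstPos[v]);
    int r = max(firstPos[u], firstPos[v]);
    int k = lg[r - l + 1];
    int a = sparse[k][l], b = sparse[k][r - (1 << k) + 1];
    return (depthOf[a] < depthOf[b]) ? a : b;
}

int main() {
    // tiny hand-made example: tree with edges 1-2 and 1-3, Euler tour 1 2 1 3 1
    euler    = {1, 2, 1, 3, 1};
    depthOf  = {0, 0, 1, 1};        // index 0 unused
    firstPos = {0, 0, 1, 3};        // index 0 unused
    buildSparseTable();
    cout << lca(2, 3) << "\n";      // prints 1
    return 0;
}
```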
OK I'll try! Thx for your time!