I tried solving the problem the following way:

1) Using DFS, I built the Euler tour of the tree, the height (depth) of each node, and the first and last occurrence of each node's index in the Euler tour.
2) Built a segment tree over the tour that returns the minimum-depth node in an interval, along with its index.
3) Built another segment tree that returns the number of distinct colors within an interval of the Euler tour array (I used BitSets in Java here).
4) For each query, I first compute the LCA of the given nodes, call it X, using the first segment tree.
5) Then, using the first and last positions of X in the Euler tour, I count the distinct colors in that interval using the second segment tree.

A sketch of steps 1, 2 and 4 is right after this list.
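Roughly, my Euler tour + min-depth segment tree for LCA looks like this (a simplified sketch, not my actual submission; all identifiers like `EulerTourLCA`, `adj`, `first` are renamed placeholders):

```java
import java.util.*;

// Sketch of steps 1, 2 and 4: Euler tour + min-depth segment tree for LCA.
class EulerTourLCA {
    List<Integer>[] adj;       // adjacency list of the tree
    int[] euler, depth, first; // tour, depth per node, first index in tour
    int[] seg;                 // segment tree storing tour indices
    int tourLen = 0;

    @SuppressWarnings("unchecked")
    EulerTourLCA(int n, int[][] edges, int root) {
        adj = new List[n];
        for (int i = 0; i < n; i++) adj[i] = new ArrayList<>();
        for (int[] e : edges) { adj[e[0]].add(e[1]); adj[e[1]].add(e[0]); }
        euler = new int[2 * n - 1];
        depth = new int[n];
        first = new int[n];
        Arrays.fill(first, -1);
        dfs(root, -1, 0);
        seg = new int[4 * tourLen];
        build(1, 0, tourLen - 1);
    }

    void dfs(int u, int parent, int d) {
        depth[u] = d;
        first[u] = tourLen;
        euler[tourLen++] = u;
        for (int v : adj[u]) if (v != parent) {
            dfs(v, u, d + 1);
            euler[tourLen++] = u;  // record u again when returning from a child
        }
    }

    // each segment-tree node stores the tour index of the minimum-depth node
    void build(int node, int lo, int hi) {
        if (lo == hi) { seg[node] = lo; return; }
        int mid = (lo + hi) / 2;
        build(2 * node, lo, mid);
        build(2 * node + 1, mid + 1, hi);
        seg[node] = better(seg[2 * node], seg[2 * node + 1]);
    }

    int better(int i, int j) { return depth[euler[i]] <= depth[euler[j]] ? i : j; }

    int query(int node, int lo, int hi, int l, int r) {
        if (r < lo || hi < l) return -1;                 // disjoint interval
        if (l <= lo && hi <= r) return seg[node];
        int mid = (lo + hi) / 2;
        int a = query(2 * node, lo, mid, l, r);
        int b = query(2 * node + 1, mid + 1, hi, l, r);
        if (a == -1) return b;
        if (b == -1) return a;
        return better(a, b);
    }

    // LCA = minimum-depth node between the first occurrences of u and v
    int lca(int u, int v) {
        int l = Math.min(first[u], first[v]), r = Math.max(first[u], first[v]);
        return euler[query(1, 0, tourLen - 1, l, r)];
    }
}
```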
But unfortunately I have been getting TLE on test case 15. I've tried optimising it a lot, but it's not working. Any help would be appreciated!
UPDATE: Instead of using a segment tree to count the distinct colors, I now compute, for each node, the number of distinct colors in its subtree with a BitSet during the DFS itself. So now I only compute the LCA in O(log n) and answer the distinct-color count for a subtree in O(1). But guess what? Now I'm getting NZEC on the same test case 15, and I don't understand why :(. A sketch of the updated approach is below. Any help would be appreciated :)
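The updated counting step looks roughly like this (again a simplified sketch with placeholder names, not my exact code). I've written it iteratively here, since deep recursion in Java can throw StackOverflowError, which some judges report as NZEC:

```java
import java.util.*;

// Sketch of the updated approach: one BitSet of colors per node, merged
// bottom-up so the distinct-color count of every subtree is precomputed.
class SubtreeColors {
    static int[] countDistinct(List<Integer>[] adj, int[] color, int root) {
        int n = adj.length;
        BitSet[] colors = new BitSet[n];
        int[] distinct = new int[n];
        // iterative post-order DFS: each frame is {node, parent, state},
        // state 0 = first visit, state 1 = all children processed
        Deque<int[]> stack = new ArrayDeque<>();
        stack.push(new int[]{root, -1, 0});
        while (!stack.isEmpty()) {
            int[] f = stack.peek();
            int u = f[0], p = f[1];
            if (f[2] == 0) {              // first visit: own color, push children
                f[2] = 1;
                colors[u] = new BitSet();
                colors[u].set(color[u]);
                for (int v : adj[u]) if (v != p) stack.push(new int[]{v, u, 0});
            } else {                      // children done: finalize this subtree
                stack.pop();
                distinct[u] = colors[u].cardinality();
                if (p != -1) colors[p].or(colors[u]);  // fold into parent's set
            }
        }
        return distinct;
    }
}
```

Each query then becomes `distinct[lca(u, v)]` after the O(log n) LCA lookup.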