Hello Codeforces, today I want to introduce a technique that can solve some problems related to distinct values queries on a subtree. The only way I know to solve array distinct values queries with updates is Mo's algorithm. The approach I will introduce can solve the subtree version of this problem online, with updates, in $$$O(N \log^2 N)$$$. So let's get started.
Prerequisites:
1. Heavy-light decomposition (it's enough to know how it works and how to implement it).
2. Lowest common ancestor.
3. Euler tours (not necessary for this specific problem, but I will use them to prove a fact; a short reference sketch follows this list).
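Since the proof later relies on DFS entry times (time in) and on LCAs, here is a small self-contained reference sketch of both. It is my own illustration, not taken from the solution code linked below.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Euler tour times + binary-lifting LCA. The key fact used later:
// v is an ancestor of u  <=>  tin[v] <= tin[u] and tout[u] <= tout[v].
const int MAXN = 200005, LOG = 18;
vector<int> g[MAXN];
int tin[MAXN], tout[MAXN], dep[MAXN], up[LOG][MAXN], timer_cnt = 0;

void dfs(int v, int p) {
    tin[v] = timer_cnt++;
    up[0][v] = p;
    for (int k = 1; k < LOG; k++) up[k][v] = up[k - 1][up[k - 1][v]];
    for (int to : g[v])
        if (to != p) { dep[to] = dep[v] + 1; dfs(to, v); }
    tout[v] = timer_cnt++;
}

bool isAncestor(int v, int u) { return tin[v] <= tin[u] && tout[u] <= tout[v]; }

int lca(int u, int v) {
    if (isAncestor(u, v)) return u;
    if (isAncestor(v, u)) return v;
    for (int k = LOG - 1; k >= 0; k--)
        if (!isAncestor(up[k][u], v)) u = up[k][u];
    return up[0][u];  // the parent of u is now the LCA
}

// Usage: read the edges into g, then call dfs(1, 1) for a tree rooted at vertex 1.
```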
Problem Statement:
You are given a tree rooted at vertex 1. You have to perform two types of queries:
1 $$$i$$$ $$$val$$$: update the value of node $$$i$$$ to $$$val$$$.
2 $$$u$$$: output the sum of distinct values in the subtree of node $$$u$$$.
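To make the second query type concrete: if, for example, the subtree of $$$u$$$ currently holds the values $$$2, 3, 2, 5$$$, then the answer to 2 $$$u$$$ is $$$2 + 3 + 5 = 10$$$, because each distinct value is counted once no matter how many nodes hold it.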
Solution:
Whenever you encounter such a strange problem, try to make a shift in perspective. What is the first thing you thought of when you saw this problem? Euler tour? Persistent segment trees? Isn't it similar to the array distinct values queries? But wait, I just said it's hard to solve this problem, didn't I?
Can we change the problem into a path queries problem?
Wait, what are the nodes that are affected by an update?
It turns out that the affected nodes all lie in a range on the ancestry path of the updated node (the path from it up to the root). Doesn't it look like a standard HLD problem? Not yet. Here comes the idea.

Consider our new problem: we want to update the answers in a range on the ancestry path of a node. So, where does this range end? If we figure this out, the problem becomes a standard path update / point query problem that can be easily solved using HLD.

Now, here is the fun part. Isn't it obvious that the endpoint is the first node on the ancestry path that doesn't have an occurrence of $$$val$$$ in its subtree? How can we find such a node? Well, this is the same as finding the first node on the path that does have an occurrence, then going down by one edge. It turns out that this node is the first node that contains the closest occurrence of $$$val$$$ in its subtree. But what does it mean for an occurrence to be the closest? Well, that node is exactly the LCA of the updated node and the closest occurrence, since the LCA is the first node on the ancestry path from the updated node to the root whose subtree contains that occurrence. So the problem boils down to finding the maximum-depth LCA among all possible LCAs, but how can we do this fast? Well, if we maintain a map of sets that stores, for each value, the times in (DFS entry times) of the nodes currently holding that value, then the node we are searching for is the deeper of LCA(updated node, the occurrence with the smallest time in bigger than the updated node's time in) and LCA(updated node, the occurrence with the largest time in smaller than the updated node's time in). But why is this true in general?

Proof of the above fact: Let $$$i$$$ be the updated node. Consider the case of the occurrence with the smallest time in bigger than $$$in[i]$$$ (call this node $$$u$$$); the second case goes through identical reasoning. Take any other occurrence $$$v$$$ with a bigger time in than $$$u$$$, and let $$$x$$$ be the first node on the ancestry path of $$$i$$$ whose subtree contains $$$v$$$. Since $$$in[i] < in[u] < in[v]$$$ and a subtree corresponds to a contiguous range of entry times, if both $$$i$$$ and $$$v$$$ lie in the subtree of $$$x$$$, then $$$u$$$ lies in the subtree of $$$x$$$ as well. Thus taking $$$u$$$ is at least as good as taking $$$v$$$.

A recap of what we did: reflect on how we transformed a subtree queries problem into a path queries problem, which turns out to be much easier when we look at it that way. Then we used a fact that makes the brute force approach feasible, by narrowing the candidates down to the only two that can be optimal.

Implementation: There are many implementation details that I skipped above, but if you are curious here is my code: code (a rough sketch of the update routine is also appended at the end of this post).

I couldn't find any link to a problem that uses this technique, so I made one, but I am not sure about the quality of the tests against TLE solutions. link:

You can also optimize heavy light decomposition, see this.

Thank you for reading my blog.

P.S. This is my first educational blog, I hope you loved it, and sorry for the long text :) .
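Sketch of the update routine:

To make the above concrete, here is a minimal sketch of how the pieces fit together. It is not my actual submission, and all names are my own assumptions: I take for granted the standard preprocessing that provides tin (DFS entry time), dep (depth), nodeAtTin (the node with a given entry time), lca, jump (k-th ancestor via binary lifting), and an HLD structure exposing pathAdd (add a value to every node on a vertical path) and pointQuery. Removing a node's old value is handled symmetrically to inserting the new one.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Assumed helpers (filled in by standard LCA / HLD preprocessing; names are hypothetical):
int tin[200005], dep[200005], nodeAtTin[200005], curVal[200005];
int lca(int u, int v);                         // lowest common ancestor
int jump(int v, int k);                        // k-th ancestor of v (binary lifting)
void pathAdd(int v, int a, long long delta);   // add delta on the path v..a, where a is an ancestor of v
long long pointQuery(int v);                   // current accumulated answer at v
const int ROOT = 1;

map<int, set<int>> pos;  // value -> entry times of the nodes currently holding it

// Deepest ancestor of u whose subtree contains an occurrence of val other than u itself,
// or -1 if there is none. By the proof above, only the two tin-neighbours of u matter.
int deepestCovered(int u, int val) {
    auto &s = pos[val];
    int best = -1;
    auto it = s.lower_bound(tin[u]);
    if (it != s.end()) {                               // closest occurrence to the right in DFS order
        int w = lca(u, nodeAtTin[*it]);
        if (best == -1 || dep[w] > dep[best]) best = w;
    }
    if (it != s.begin()) {                             // closest occurrence to the left in DFS order
        int w = lca(u, nodeAtTin[*prev(it)]);
        if (best == -1 || dep[w] > dep[best]) best = w;
    }
    return best;
}

// Add delta = +val (or -val) to the answers of exactly those ancestors of u whose
// subtree gains (or loses) its only occurrence of val.
void applyValue(int u, int val, long long delta) {
    int a = deepestCovered(u, val);
    if (a == u) return;  // another occurrence already lies inside u's subtree: nothing changes
    int top = (a == -1) ? ROOT : jump(u, dep[u] - dep[a] - 1);  // one edge below a, or the root
    pathAdd(u, top, delta);
}

// Query type 1: change the value of node u to newVal.
void updateValue(int u, int newVal) {
    int oldVal = curVal[u];
    if (oldVal == newVal) return;
    pos[oldVal].erase(tin[u]);       // u no longer holds oldVal ...
    applyValue(u, oldVal, -oldVal);  // ... so a range of ancestors loses it as a distinct value
    applyValue(u, newVal, +newVal);  // a range of ancestors gains newVal as a new distinct value ...
    pos[newVal].insert(tin[u]);      // ... and only now does u start holding it
    curVal[u] = newVal;
}

// Query type 2: the sum of distinct values in the subtree of u is just the point value at u.
long long answer(int u) { return pointQuery(u); }
```

Each update costs a constant number of set operations, LCA computations and two HLD path updates, so it runs in $$$O(\log^2 N)$$$; the initial values can be handled by running the insertion half of this routine once for every node.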