Are there any good rules of thumb for when a square root decomposition will be too slow? I suspect that with an interval of length 10^5 and 3 × 10^5 queries a square root decomposition is too slow, but this is only based on my attempts to solve http://codeforces.net/problemset/problem/765/F with a square root decomposition, where the complexity I calculated for my algorithm exceeded the time limit. In general, given an interval of length N and Q queries, what sort of values of N and Q do you want in order to be confident that a square root decomposition will pass the time constraint? Also, are there standard limits on Codeforces deliberately set to rule out square-root-decomposition approaches?
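For concreteness, here is a minimal square root decomposition sketch (point update, range sum) — not a solution to the problem linked above, just the general pattern whose per-query cost is O(√N):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal square-root decomposition: point update, range-sum query.
// The array is split into blocks of length ~sqrt(N); each query
// touches at most two partial blocks plus O(sqrt(N)) full blocks.
struct SqrtDecomp {
    int n, block;                 // array size and block length ~ sqrt(n)
    std::vector<long long> a;     // raw values
    std::vector<long long> bsum;  // per-block sums

    explicit SqrtDecomp(const std::vector<long long>& v)
        : n((int)v.size()),
          block(std::max(1, (int)std::sqrt((double)v.size()))),
          a(v),
          bsum((v.size() + block - 1) / block, 0) {
        for (int i = 0; i < n; ++i) bsum[i / block] += a[i];
    }

    void update(int i, long long x) {  // set a[i] = x
        bsum[i / block] += x - a[i];
        a[i] = x;
    }

    long long query(int l, int r) {    // sum of a[l..r], inclusive
        long long s = 0;
        while (l <= r && l % block != 0) s += a[l++];  // left partial block
        while (l + block - 1 <= r) {                   // full blocks
            s += bsum[l / block];
            l += block;
        }
        while (l <= r) s += a[l++];                    // right partial block
        return s;
    }
};
```

With N ≈ 10^5 each query costs roughly √N ≈ 316 element/block touches, which is the number the time estimate below is built on.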
Is it really ever possible to be confident in square root decomposition?
Probably, as long as the complexity yields fewer than about 200M operations, you are fine.
Thanks, I will use 200M operations as a rule of thumb for future problems.