I was trying out an implementation of incremental CHT (convex hull trick) with sqrt decomposition, and I noticed a very weird thing. When I made my block size 350, the code TLEd at >2000 ms. However, when I made it 351, it passed in 390 ms.
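For reference, the scheme I mean is roughly the following. This is a minimal sketch rather than my actual submission: new lines go into a buffer, the lower envelope is rebuilt once the buffer exceeds the block size B, and a query binary-searches the envelope and then scans the buffer linearly.

```cpp
#include <bits/stdc++.h>
using namespace std;
using ll = long long;

struct Line {
    ll m, c;
    ll eval(ll x) const { return m * x + c; }  // assumes m*x + c fits in long long
};

struct SqrtCHT {
    int B;                   // block size -- the parameter in question
    vector<Line> hull, buf;  // built lower envelope + pending (unmerged) lines

    explicit SqrtCHT(int blockSize) : B(blockSize) {}

    // true if the middle line b is never minimal between a and c
    static bool bad(const Line& a, const Line& b, const Line& c) {
        return (__int128)(c.c - a.c) * (a.m - b.m) <= (__int128)(b.c - a.c) * (a.m - c.m);
    }

    // merge the buffer into the hull and rebuild the lower envelope
    void rebuild() {
        hull.insert(hull.end(), buf.begin(), buf.end());
        buf.clear();
        sort(hull.begin(), hull.end(), [](const Line& a, const Line& b) {
            return a.m != b.m ? a.m > b.m : a.c < b.c;
        });
        vector<Line> h;
        for (const Line& l : hull) {
            if (!h.empty() && h.back().m == l.m) continue;  // keep best of equal slopes
            while (h.size() >= 2 && bad(h[h.size() - 2], h.back(), l)) h.pop_back();
            h.push_back(l);
        }
        hull = move(h);
    }

    void add(Line l) {
        buf.push_back(l);
        if ((int)buf.size() > B) rebuild();  // amortized: one rebuild per B inserts
    }

    ll query(ll x) {  // minimum of m*x + c over all inserted lines
        ll best = LLONG_MAX;
        if (!hull.empty()) {
            // values along the envelope are unimodal in x, so binary search works
            int lo = 0, hi = (int)hull.size() - 1;
            while (lo < hi) {
                int mid = (lo + hi) / 2;
                if (hull[mid].eval(x) > hull[mid + 1].eval(x)) lo = mid + 1;
                else hi = mid;
            }
            best = hull[lo].eval(x);
        }
        for (const Line& l : buf) best = min(best, l.eval(x));  // brute-force the buffer
        return best;
    }
};
```

The trade-off is that a larger B means fewer expensive rebuilds but a longer linear scan per query, which is why the block size matters so much here.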
Here are two submissions that differ in only one respect: the block size.
block size = 350: TLE
This seems very weird to me, since block sizes below 350 also give AC, albeit with more time (block size = 200 gets AC in 1669 ms). It is the specific value of 350 that times out.
What could be a possible reason for this? I am just curious. (As far as I can see, my code does not have any undefined behaviour.)
I didn't read the code, but if I were the problem author, I would probably add countertests against some popular block-size values, 350 being among them.
The problem isn't meant to be solved by sqrt decomposition; the optimal solution is O(n) CHT, since the slopes and query values are already sorted in the problem. I was just trying out my implementation on it, so I doubt the author made any special tests to fail a particular block size.
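To be concrete, the linear-time variant I mean looks roughly like this. It's a sketch under the stated assumptions: slopes are inserted in strictly decreasing order and query points arrive in non-decreasing order, so a single pointer that only moves forward answers all queries in O(n) total.

```cpp
#include <bits/stdc++.h>
using namespace std;
using ll = long long;

struct MonotoneCHT {
    vector<ll> M, C;  // envelope lines y = M[i]*x + C[i], slopes strictly decreasing
    size_t ptr = 0;   // advances monotonically across all queries

    // true if the middle line (m2, c2) is never minimal
    static bool bad(ll m1, ll c1, ll m2, ll c2, ll m3, ll c3) {
        return (__int128)(c3 - c1) * (m1 - m2) <= (__int128)(c2 - c1) * (m1 - m3);
    }

    void add(ll m, ll c) {  // slopes must arrive in strictly decreasing order
        while (M.size() >= 2 &&
               bad(M[M.size() - 2], C[C.size() - 2], M.back(), C.back(), m, c)) {
            M.pop_back(); C.pop_back();
        }
        M.push_back(m); C.push_back(c);
        if (ptr >= M.size()) ptr = M.size() - 1;  // pops may have invalidated ptr
    }

    ll query(ll x) {  // x must arrive in non-decreasing order
        while (ptr + 1 < M.size() && M[ptr + 1] * x + C[ptr + 1] <= M[ptr] * x + C[ptr]) ++ptr;
        return M[ptr] * x + C[ptr];
    }
};
```

Each line is pushed and popped at most once and the pointer only moves forward, so the total work over all adds and queries is linear.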
There is no problem with block-size tests; there is just an infinite loop in your code.
I just saw that there was a possibility your code could loop forever, and I checked it.
53469476
Ahh, thanks a lot. Stupid bug.