Hi all,
I've recently come across this http://main.edu.pl/en/archive/pa/2010/fra. The problem statement is as follows: given up to n = 5000 disjoint intervals of the form [ai, bi], let S be the set of all integers in these intervals (a1, a1 + 1, ..., b1, a2, ..., b2, ..., an, ..., bn). Now, for up to q = 500000 query strings, count the total number of times each string s appears in the decimal representations of the numbers in S.
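For concreteness, a naive brute force (assuming overlapping occurrences each count separately, which seems to be the intent) would look something like this; the names are my own and this is obviously far too slow for the real limits:

```cpp
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Naive reference: count occurrences of pattern s (overlaps included) in the
// decimal representation of every integer in the given intervals.
long long bruteForce(const std::vector<std::pair<long long, long long>>& intervals,
                     const std::string& s) {
    long long total = 0;
    for (auto [a, b] : intervals) {
        for (long long v = a; v <= b; ++v) {
            std::string d = std::to_string(v);
            // Slide s over d; each overlapping match counts once.
            for (std::size_t i = 0; i + s.size() <= d.size(); ++i)
                if (d.compare(i, s.size(), s) == 0) ++total;
        }
    }
    return total;
}

int main() {
    // Interval [40, 45], pattern "4": 40, 41, 42, 43, 45 have one occurrence
    // each and 44 has two, so this prints 7.
    std::cout << bruteForce({{40, 45}}, "4") << "\n";
}
```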
The only case I can solve is n = 1 or so, and I can barely see a way to approach the general problem, especially since we can't iterate through all the intervals for each query.
If somebody could help, that would be great!
Best,
minimario
I wrote a trie-based solution to this problem, but I couldn't fit it within the memory limit (MLE on 4/10 cases). In case it helps anyone, you can look at my code.
I changed my approach and finally got AC. I believe my solution is O(m log m + maxlen*(n log n + m)).
If we are given a range [a,b], we can view its answer as (the answer for range [0,b]) - (the answer for range [0,a-1]). This reduces the problem to O(n) ranges of the form [0,someValue], each either adding to or subtracting from the answer, plus the m patterns.
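In code, that decomposition might look like this (a sketch; PrefixQuery and decompose are made-up names, not from my actual submission):

```cpp
#include <utility>
#include <vector>

// One signed prefix query: count occurrences among the numbers 0..bound,
// then add or subtract that count according to sign.
struct PrefixQuery {
    long long bound;
    int sign;  // +1 or -1
};

// [a, b] contributes +answer([0, b]) and -answer([0, a-1]).
std::vector<PrefixQuery> decompose(
        const std::vector<std::pair<long long, long long>>& intervals) {
    std::vector<PrefixQuery> qs;
    for (auto [a, b] : intervals) {
        qs.push_back({b, +1});
        if (a > 0) qs.push_back({a - 1, -1});  // if a == 0, [0, a-1] is empty
    }
    return qs;
}
```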
Let's look, for each possible value of X, at all occurrences that start X-1 digits from the end of a number. For example, in the number 14211, the occurrence of the pattern "42" has X = 4. For a given pattern P of length L at a fixed X, the number of occurrences is the number of values in the given ranges whose residue modulo 10^X lies in the range [P*10^(X-L), (P+1)*10^(X-L)-1]. We can count these values by treating the beginnings and ends of such ranges as events and sorting the events; we create these events for every given range and every pattern at each possible value of X.
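For intuition, the counting primitive for a single prefix range [0,n], pattern P of length L, and fixed X reduces to simple block arithmetic. This is my own sketch with made-up names (the event sorting above is what batches it over all ranges and patterns), and it assumes P has no leading zero, so residues in [lo, hi] always correspond to real digit occurrences:

```cpp
#include <algorithm>

// Count integers v in [0, n] with (v mod m) in [lo, hi], where 0 <= lo <= hi < m.
// Every full block of m consecutive values contributes hi-lo+1 hits; the final
// partial block covers residues 0..rem-1 and contributes its overlap with [lo, hi].
long long countModRange(long long n, long long m, long long lo, long long hi) {
    long long full = (n + 1) / m;              // complete blocks in [0, n]
    long long rem  = (n + 1) % m;              // residues covered by the partial block
    long long res  = full * (hi - lo + 1);
    res += std::max(0LL, std::min(hi, rem - 1) - lo + 1);
    return res;
}
```

For the 14211/"42" example above, countModRange(n, 10000, 4200, 4299) counts how many of 0..n contain "42" starting 3 digits from the end.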
While this is the key motivation behind my solution, there are specifics I didn't fully explain; you can see my code here. Note: it took a decent amount of constant-factor optimization to pass case 6b.