I have an array a of values +1 and -1 in random order, and a separate array s which is initialised with 0 in every position. I run through a from left to right. At every index i, I add the value a[i] to each of s[0], s[1], ..., s[i]. Then, still at index i and after that update, I want to return the rightmost index in the range [0, i] whose value in s is 0. If no such index exists, I'll return some invalid value like -1.
The adding operation can be done using a Fenwick tree, but I have no clue how to find the rightmost zero index in logarithmic time. I'm trying to do the entire process in O(n) or O(n log n) time overall. Any idea would be much appreciated, thanks!
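
For reference, the adding structure I have in mind is a Fenwick tree over the difference array of s (range add, point query). This is only a sketch, with a 0-based interface I chose for this post:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Range-add / point-query Fenwick tree: internally a BIT over the
// difference array of s, so a prefix sum recovers one value of s.
struct Fenwick {
    int n;
    vector<long long> t;
    explicit Fenwick(int n) : n(n), t(n + 1, 0) {}

    void add(int i, long long v) {              // internal point update, 1-based
        for (; i <= n; i += i & -i) t[i] += v;
    }
    void rangeAdd(int l, int r, long long v) {  // add v to s[l..r], 0-based
        add(l + 1, v);
        if (r + 2 <= n) add(r + 2, -v);
    }
    long long pointQuery(int i) const {         // current value of s[i], 0-based
        long long res = 0;
        for (int j = i + 1; j > 0; j -= j & -j) res += t[j];
        return res;
    }
};
```

With this, each prefix update rangeAdd(0, i, a[i]) costs O(log n), but checking every position to find the rightmost zero would still be linear per step, which is exactly the part I'm stuck on.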