Suppose we are given two increasing arrays $$$a$$$ and $$$b$$$ of length $$$n$$$. We need to find $$$\max((a[j] - a[i-1]) - (b[j] - b[i-1]))$$$ over all pairs with $$$j \ge i$$$.

Can we solve it in better than $$$O(n^2)$$$ time? If so, please give me a hint.
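For reference, the quadratic baseline is just a double loop over the pairs. A minimal sketch, assuming 0-indexed arrays so the valid pairs are $$$1 \le i \le j < n$$$ (the function name `bruteForce` is mine, not from the post):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Brute-force O(n^2) reference: try every pair (i, j) with j >= i
// and evaluate (a[j] - a[i-1]) - (b[j] - b[i-1]) directly.
long long bruteForce(const vector<long long>& a, const vector<long long>& b) {
    int n = a.size();
    long long best = LLONG_MIN;
    for (int i = 1; i < n; i++)
        for (int j = i; j < n; j++)
            best = max(best, (a[j] - a[i - 1]) - (b[j] - b[i - 1]));
    return best;
}
```

This is the complexity the question wants to beat.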
You are basically asking for $$$\max((a[j] - b[j]) - (a[i] - b[i]))$$$ where $$$i < j$$$ holds (the sums telescope: the expression equals $$$(a[j] - b[j]) - (a[i-1] - b[i-1])$$$, and renaming $$$i - 1$$$ to $$$i$$$ turns $$$j \ge i$$$ into $$$i < j$$$). Let $$$c_i = a_i - b_i$$$. Then it becomes $$$\max(c[j] - pref[j - 1])$$$, where $$$pref[i]$$$ is the minimum element of $$$c[0 \dots i]$$$. This can be solved in linear time.