Help with problem
You are given two integer arrays a and b, both of length n <= 10^5. For each x with 0 <= x <= n, print the sum of a[i] * b[x-i] over all 0 <= i <= x (terms where an index falls outside the arrays are skipped).
Obviously this can be done in quadratic time, but can we do any better?
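For reference, a minimal sketch of the quadratic baseline the question mentions (the function name `conv_prefix` and the convention of skipping out-of-range terms are my assumptions, since the statement does not define b[x-i] when x - i >= n):

```python
def conv_prefix(a, b):
    # Naive O(n^2) computation: c[x] = sum of a[i] * b[x - i] over 0 <= i <= x,
    # skipping index pairs that fall outside either array (assumed convention).
    n = len(a)
    c = [0] * (n + 1)  # one value per x in 0..n, as in the statement
    for x in range(n + 1):
        for i in range(x + 1):
            if i < n and x - i < n:
                c[x] += a[i] * b[x - i]
    return c

print(conv_prefix([1, 2], [3, 4]))  # [3, 10, 8]
```

Since c[x] is exactly the coefficient of z^x in the product of the polynomials with coefficients a and b, the usual sub-quadratic route would be a fast polynomial multiplication (e.g. FFT/NTT), but the sketch above is just the baseline.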