Question -> https://codeforces.net/problemset/problem/264/A
My Code -> https://pastebin.com/DpxZ8wEp
Precision problem: the "long double" data type only supports limited accuracy.
When $$$x, y$$$ are small enough, there is no guarantee that $$$L<(L+R)/2<R$$$ holds; the computed midpoint may only satisfy $$$L\leq(L+R)/2\leq R$$$, so a bisection step can stop making progress. For instance, if every character in your input is 'l', the positions become $$$1/2, 1/4, \dots, 1/2^n$$$. Can the computer distinguish $$$1/2^{999990}$$$ from $$$1/2^{999991}$$$ properly? (In fact it already treats $$$1/2^{16446}$$$ as $$$0$$$.)