Question -> https://codeforces.net/problemset/problem/264/A
My Code -> https://pastebin.com/DpxZ8wEp
Precision problem: the `long double` data type supports only limited precision.
When $$$x$$$ and $$$y$$$ are small enough, there is no guarantee that the inequality $$$L<(L+R)/2<R$$$ holds; the computer may give you something like $$$L=(L+R)/2\leq R$$$ instead. For instance, if all characters in your input are 'l', your positions are $$$1/2, 1/4, \ldots, 1/2^n$$$. Can the computer distinguish $$$1/2^{999990}$$$ from $$$1/2^{999991}$$$ properly? (In fact it treats $$$1/2^{16446}$$$ as $$$0$$$.)
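To see the underflow concretely, here is a minimal C++ sketch (assuming an x86 80-bit `long double`; on platforms where `long double` is only 64 bits the zero threshold is reached much earlier):

```cpp
#include <cstdio>

int main() {
    // Halve a long double until it underflows to exactly 0.
    long double x = 1.0L;
    int n = 0;
    while (x > 0.0L) {
        x /= 2.0L;
        ++n;
    }
    // With x86 80-bit extended precision this reports n = 16446,
    // i.e. 1/2^16446 is stored as exactly 0.
    printf("1/2^%d underflows to 0\n", n);

    // The same effect breaks the midpoint invariant L < (L+R)/2 < R:
    // once R is near the underflow threshold, (L+R)/2 collapses to L.
    long double L = 0.0L, R = 1.0L;
    int halvings = 0;
    while (L < (L + R) / 2.0L && (L + R) / 2.0L < R) {
        R = (L + R) / 2.0L;
        ++halvings;
    }
    printf("midpoint degenerates after %d halvings\n", halvings);
    return 0;
}
```

So with $$$n$$$ up to $$$10^6$$$ positions of the form $$$1/2^n$$$, the midpoints stop separating long before the end of the input, and many stones end up at the same coordinate.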