Hi guys! I was wondering if this problem: https://www.codechef.com/problems/BITMASK4 can be solved using DFS within the time constraints. Any help would be appreciated. Thank you!
How do you want to use DFS here? I see one way: build a graph out of the lines by connecting those that are parallel, then search for the largest connected component, but that has quadratic time complexity since you have to check every pair.
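For reference, a rough sketch of that quadratic idea could look like the snippet below. It assumes each line is given as A*x + B*y = C (the exact statement format may differ), checks parallelism with the cross product A1*B2 == A2*B1, and then runs an iterative DFS to find the largest component; it's only meant to illustrate the approach, and for large N it would likely be too slow.

```cpp
#include <bits/stdc++.h>
using namespace std;

// O(N^2) pairwise parallelism check + DFS over the "is parallel to" graph.
// Returns the size of the largest connected component, i.e. the biggest
// group of mutually parallel lines. Each line is (A, B, C) for A*x + B*y = C.
int largestParallelGroup(const vector<array<long long, 3>>& lines) {
    int n = (int)lines.size();
    vector<vector<int>> adj(n);
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            // parallel iff the normal vectors (A, B) are proportional
            if (lines[i][0] * lines[j][1] == lines[j][0] * lines[i][1]) {
                adj[i].push_back(j);
                adj[j].push_back(i);
            }

    vector<bool> seen(n, false);
    int best = 0;
    for (int s = 0; s < n; s++) {
        if (seen[s]) continue;
        // iterative DFS counting the size of this component
        int size = 0;
        stack<int> st;
        st.push(s);
        seen[s] = true;
        while (!st.empty()) {
            int v = st.top(); st.pop();
            size++;
            for (int u : adj[v])
                if (!seen[u]) { seen[u] = true; st.push(u); }
        }
        best = max(best, size);
    }
    return best;
}
```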
I see two correct solutions here, and neither uses DFS or graphs in any way.
What you wrote is exactly what I had in mind, and I wanted to know if there was a way to optimise the time complexity. Also, would you mind briefly explaining the two solutions you see that don't use DFS?
Welp, I'm sorry: I've realised the second one wasn't a solution at all, so I'll explain the easier one.
Two lines (A1, B1, C1) and (A2, B2, C2) are parallel if and only if A1/B1 == A2/B2 (or B1 == 0 && B2 == 0), so we can build a map from A/B ratios to the number of lines with that ratio (plus a separate counter for lines with B == 0), go over the lines, and increase the corresponding count each time. Then just take the maximum. If you are nervous about the ratios being non-integers, you can tell your map to treat two numbers as equal if they differ by less than some EPS (around 1e-7, I suppose).
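In case it helps, here's a rough sketch of that counting idea. As a small variation on the EPS comparison, it keys the map on the reduced (A, B) pair instead of a floating-point ratio, which also covers the B == 0 lines without a separate counter. The input format (T test cases, each with N lines given as A B C) is an assumption, so adjust the I/O to the actual statement.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int t;
    scanf("%d", &t);
    while (t--) {
        int n;
        scanf("%d", &n);
        // reduced, sign-normalised (A, B) -> number of lines with that direction
        map<pair<long long, long long>, int> cnt;
        int best = 0;
        for (int i = 0; i < n; i++) {
            long long a, b, c;
            scanf("%lld %lld %lld", &a, &b, &c);  // C doesn't affect parallelism
            long long g = __gcd(llabs(a), llabs(b));
            if (g != 0) { a /= g; b /= g; }
            // fix the sign so (A, B) and (-A, -B) map to the same key
            if (a < 0 || (a == 0 && b < 0)) { a = -a; b = -b; }
            best = max(best, ++cnt[{a, b}]);
        }
        printf("%d\n", best);
    }
    return 0;
}
```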
If you use an ordered map, your algorithm's time complexity is going to be O(T * N * log(N)) (where T is the number of test cases), but it won't exceed the TL anyway if you use C++ or similar.
If you use an unordered map, the logarithm disappears from the complexity, but if you use C++, remember that std::map on basic types is often faster than std::unordered_map for sizes up to around 1e6 because of the cost of the hash function, so I recommend sticking with the ordered one.
Perfect. Thanks a lot!