Orn0's blog

By Orn0, 2 hours ago, In English

Hello Codeforcers,

Recently I stumbled upon this AtCoder problem. The naive solution in $$$\mathcal{O}(n^2m)$$$ gets TLE ($$$n \leq 2000$$$, $$$m \leq 2000$$$), and the intended approach is to use arrays of bitsets.

I don't understand why it's so much more efficient. The editorial uses bitsets of size $$$1000$$$, so I feel like the overall complexity would be $$$\mathcal{O}(1000nm)$$$, which is pretty similar to the naive complexity. Should I understand that performing XOR on a bitset of size $$$n$$$ isn't actually $$$\mathcal{O}(n)$$$? (Is it using precomputed tables with masks of size 64?)
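To show what I mean, the pattern from the editorial looks roughly like this (my own reconstruction, not the actual code; the width $$$1000$$$ and the exact transition are placeholders):

```cpp
#include <bitset>
#include <vector>

int main() {
    const int n = 2000, m = 2000;
    std::vector<std::bitset<1000>> dp(m);
    dp[0][0] = 1;
    for (int i = 0; i < n; ++i)
        for (int j = 1; j < m; ++j)
            // one whole-bitset operation per (i, j) pair: this is the
            // "1000" factor in O(1000nm) that I'd expect to be too slow
            dp[j] ^= dp[j - 1] << 1;
}
```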

I had a hard time understanding the solution, so any similar problems or resources to learn from would be appreciated.


»
2 hours ago

When you think of bitsets, think of them as a list of 64-bit integers, without you having to worry about the indexing and iteration. XOR'ing two $$$N$$$-bit bitsets is simply a XOR on $$$N/64$$$ integers, hence it is $$$\mathcal{O}(N/64)$$$, sometimes denoted $$$\mathcal{O}(N/w)$$$ where $$$w = 32$$$ or $$$64$$$ depending on the architecture.
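To make it concrete, here is roughly what an $$$N$$$-bit XOR amounts to at the word level (a sketch, not the actual standard-library implementation):

```cpp
#include <cstddef>
#include <cstdint>

// XOR b into a, 64 bits at a time: N bits take only N/64 iterations.
void xor_into(std::uint64_t* a, const std::uint64_t* b, std::size_t words) {
    for (std::size_t i = 0; i < words; ++i)
        a[i] ^= b[i];
}
```

For $$$N = 1000$$$ that's $$$\lceil 1000/64 \rceil = 16$$$ word operations instead of $$$1000$$$ single-bit ones. No precomputed tables are involved, just plain word-wide XOR.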