You are given an array of N numbers. The game goes like this: you select any two elements P and Q from the array. The cost of selecting these two elements is (P | Q) - (P & Q), where "|" denotes the bitwise OR of two numbers and "&" denotes the bitwise AND of two numbers. Out of these two numbers, you throw one away and put the other back into the array.
This way the size of the array will get reduced by one element. The game stops when there is only one element left in the array. The total cost of reducing the array to a single element is the sum of the costs incurred in all the steps.
Now, your task is to reduce the array to a single element while minimizing the total cost incurred.
Constraints:
1 <= N <= 10^3
1 <= A[i] <= 10^3
Make a complete graph where the cost of the edge between $$$u$$$ and $$$v$$$ is the cost as you wrote in the statement. The answer is the cost of the minimum spanning tree of this graph.
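For the stated constraints ($$$N \le 10^3$$$) that complete graph is small enough for an $$$O(N^2)$$$ Prim. Here is a minimal sketch of this idea (my own code, not something from this thread), using the fact that the edge cost (P | Q) - (P & Q) is just P ^ Q:

// Sketch only: Prim's algorithm on the implicit complete graph where the edge
// between a[i] and a[j] costs a[i] ^ a[j] = (a[i] | a[j]) - (a[i] & a[j]).
// O(N^2), which is fine for N <= 1000.
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<int> a(n);
    for (int &x : a) cin >> x;

    vector<int> dist(n, INT_MAX);   // cheapest known edge from the tree to each vertex
    vector<bool> inTree(n, false);
    long long total = 0;
    dist[0] = 0;                    // grow the tree starting from vertex 0

    for (int step = 0; step < n; ++step) {
        int u = -1;
        for (int v = 0; v < n; ++v)
            if (!inTree[v] && (u == -1 || dist[v] < dist[u])) u = v;
        inTree[u] = true;
        total += dist[u];
        for (int v = 0; v < n; ++v)
            if (!inTree[v]) dist[v] = min(dist[v], a[u] ^ a[v]);
    }
    cout << total << '\n';
}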
Thanks, that's a nice solution. Can there also be a DP solution to it?
I don't know, wouldn't count on it.
Thanks!! But what to do when N <= 10^5?
Then it is precisely 888G - Xor-MST, for which you can find an editorial.
I like the tag 'hevi problem'(´∀` )
I'm not sure why they gave the cost in that way; it is just the XOR of $$$P$$$ and $$$Q$$$: $$$P \mid Q$$$ contains each power of 2 that appears in at least one of the two numbers, and $$$P \,\&\, Q$$$ contains those that appear in both. When subtracting, you are left with only those that appear in exactly one of the two, i.e. $$$P \oplus Q$$$.
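As a quick sanity check (my own numbers, just to illustrate): take $$$P = 6$$$ and $$$Q = 3$$$. Then $$$P \mid Q = 7$$$ and $$$P \,\&\, Q = 2$$$, so the cost is $$$7 - 2 = 5$$$, which is exactly $$$6 \oplus 3$$$.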
This is a bit overkill for the given constraints, but if $$$M$$$ is the maximum value in $$$A$$$, the problem can be solved in $$$O(N \log^2 M)$$$ using divide-and-conquer.

If $$$A$$$ has fewer than two unique elements, the answer is 0. Otherwise, find the highest bit in which any two elements of $$$A$$$ differ. Then divide $$$A$$$ into two arrays $$$B$$$ and $$$C$$$, where $$$B$$$ contains all elements of $$$A$$$ with a 0 in that bit and $$$C$$$ contains all elements of $$$A$$$ with a 1. Solve the problem recursively on those two arrays. Then construct a bitwise trie out of $$$B$$$ and, for each element $$$c$$$ of $$$C$$$, find the element $$$b$$$ of $$$B$$$ minimizing $$$b \oplus c$$$. This can be done by searching for $$$c$$$ in the trie, going down the opposite branch at each node where $$$c$$$'s branch does not exist. The answer to the problem is the sum of the two subproblems' answers, plus the minimum value of $$$b \oplus c$$$ over the lower bits, plus a number with only the current bit set (these two terms together are the cost of the single edge joining the two halves). Each level of the divide-and-conquer takes $$$O(N \log M)$$$, and there are $$$O(\log M)$$$ levels, so the overall complexity is $$$O(N \log^2 M)$$$.
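A rough sketch of this divide-and-conquer (my own code, not the commenter's; the names `solve` and `Trie` are mine, and I assume values fit in 20 bits). Sorting $$$A$$$ first makes each split contiguous, and since this trie stores full values rather than only the lower bits, the minimum XOR it returns already includes the contribution of the current bit:

#include <bits/stdc++.h>
using namespace std;

const int MAXBIT = 20;   // values up to 10^5 need 17 bits, so 20 is plenty
vector<int> a;           // sorted input values

// Binary trie answering "minimum of x XOR (any inserted value)".
struct Trie {
    vector<array<int, 2>> nxt;
    Trie() { nxt.push_back({-1, -1}); }
    void insert(int x) {
        int cur = 0;
        for (int b = MAXBIT; b >= 0; --b) {
            int d = (x >> b) & 1;
            if (nxt[cur][d] == -1) {
                nxt[cur][d] = (int)nxt.size();
                nxt.push_back({-1, -1});
            }
            cur = nxt[cur][d];
        }
    }
    int minXor(int x) const {
        int cur = 0, res = 0;
        for (int b = MAXBIT; b >= 0; --b) {
            int d = (x >> b) & 1;
            if (nxt[cur][d] != -1) {
                cur = nxt[cur][d];        // same bit exists: this bit of the XOR is 0
            } else {
                cur = nxt[cur][d ^ 1];    // forced to differ: this bit of the XOR is 1
                res |= 1 << b;
            }
        }
        return res;
    }
};

// a[l..r) is sorted and all its elements agree on the bits above `bit`.
long long solve(int l, int r, int bit) {
    if (r - l <= 1 || bit < 0) return 0;
    // First index whose current bit is 1: everything before it is B, the rest is C.
    int mid = (int)(partition_point(a.begin() + l, a.begin() + r,
                    [bit](int v) { return ((v >> bit) & 1) == 0; }) - a.begin());
    long long best = solve(l, mid, bit - 1) + solve(mid, r, bit - 1);
    if (mid == l || mid == r) return best;   // one side empty: no connecting edge needed
    Trie trie;                               // trie over B = a[l..mid)
    for (int i = l; i < mid; ++i) trie.insert(a[i]);
    int connect = INT_MAX;                   // cheapest single edge between B and C
    for (int i = mid; i < r; ++i) connect = min(connect, trie.minXor(a[i]));
    return best + connect;
}

int main() {
    int n;
    cin >> n;
    a.resize(n);
    for (int &x : a) cin >> x;
    sort(a.begin(), a.end());
    cout << solve(0, n, MAXBIT) << '\n';
}

This runs in the $$$O(N \log^2 M)$$$ mentioned above and is essentially the standard approach to 888G.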
As other comments pointed out, the problem comes down to finding the cost of the minimum spanning tree of the complete graph on the elements of $$$A$$$, where edge weights are the bitwise XOR of their endpoints. Note that the minimum spanning tree can be constructed by considering edges in order of nondecreasing weight. Since the cost of each edge is the XOR of its endpoints, edges connecting two vertices that share the first $$$k$$$ bits will be considered before edges connecting two vertices that differ anywhere in those $$$k$$$ bits. Thus, we can simply find the MST of each of the two subsets determined by the first bit in which any elements differ, and then connect these two subtrees optimally with a single edge.