
By _Muhammad, 6 years ago

In this problem I am trying to find the maximum X1 + X2. Let X be the XOR of all the given numbers. I treated each number as a bit vector and then found the basis. Let's look at an example:

5 = 1 0 1

6 = 1 1 0

7 = 1 1 1

After Gaussian elimination, the basis vectors are:

1 0 0

0 1 0

0 0 1
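
In code, my basis construction is roughly the following sketch (BITS, the array name and the bit width are only for illustration, this is not my exact code):

```cpp
const int BITS = 60;          // assumed bit width; adjust to the actual constraints
long long basis[BITS];        // basis[b] holds the basis vector whose highest set bit is b (0 if none)

// Standard XOR (GF(2)) Gaussian elimination: try to insert x into the basis.
void insertVector(long long x) {
    for (int b = BITS - 1; b >= 0; --b) {
        if (!((x >> b) & 1)) continue;            // bit b of x is 0, look at a lower bit
        if (!basis[b]) { basis[b] = x; return; }  // bit b becomes a new pivot
        x ^= basis[b];                            // eliminate bit b and keep reducing x
    }
    // x reduced to 0: it is already representable by the current basis
}
```

This sketch keeps the basis in row-echelon form; one more elimination pass would give the fully reduced form shown above, but the greedy step below only needs the echelon form.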

Then I iterated from the most significant bit: if the i-th bit of X is 0, I tried to put a 1 in that bit of X1; if that was not possible, I skipped the bit and went on to the next one. But this process is not giving me the maximum X1 + X2. Here is the Implementation; a simplified sketch of the same idea is below. Can anyone tell me what is wrong here? Is my logic wrong, or is the implementation not correct?
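
To make the greedy step concrete (again only a sketch, not my exact submission; basis[] and BITS are as in the sketch above):

```cpp
// X is the XOR of all the numbers. For every bit where X has a 0 (from high to low),
// try to make that bit 1 in X1 using the basis; if not possible, skip the bit.
long long buildX1(long long X) {
    long long X1 = 0;
    for (int b = BITS - 1; b >= 0; --b) {
        if ((X >> b) & 1) continue;      // bit b of X is 1: skip it
        if ((X1 >> b) & 1) continue;     // bit b of X1 is already 1
        if (basis[b]) X1 ^= basis[b];    // put a 1 in bit b of X1, if the basis allows it
        // otherwise it is not possible, so the bit stays 0 and we move on
    }
    return X1;                           // the answer is then reported as X1 + X2, with X2 = X ^ X1
}
```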

Please help me, I am stuck here.
