How do I solve the following problem?
For any three bit strings A, B, and C (each consisting only of 0s and 1s), f(A,B,C) is defined as:
f(A,B,C) = (number of 1s in ((A XOR B) AND C)) / (number of 1s in C)
We are given N bit strings initially, and then Q queries. Each query consists of two bit strings B and C. For each query, output a single bit string A from the initial set of N strings such that f(A,B,C) is minimized.
All bit strings in the input have length 2048 (constant).
Input:
The first line contains two integers N and Q.
Then N lines follow, each containing a single bit string of length 2048.
Then Q lines follow, each containing two space-separated bit strings B and C, each of length 2048.
Output:
For each query, output one line containing the chosen string A and the value f(A,B,C), space separated.
Constraints:
1<=N<=10000
1<=Q<=10000
|A|=|B|=|C|=2048 (fixed for all strings)
Example:
Input:
2 1
1001
1011
1111 1111
Output:
1011 0.25
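(The example uses length-4 strings for readability; the actual input uses length 2048. For reference, f(1001,1111,1111) = 2/4 = 0.5 and f(1011,1111,1111) = 1/4 = 0.25, so 1011 is the better answer.)

To pin down exactly what f computes, here is a minimal brute-force sketch in C++ (my own code, not from any setter's solution): it evaluates f for every stored string on each query using std::bitset and picks the minimum. At the upper limits (N = Q = 10^4) this is roughly N·Q·2048/64 ≈ 3×10^9 word operations, so it may well be too slow, but it makes the definition concrete.

```cpp
#include <bitset>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::ios::sync_with_stdio(false);
    std::cin.tie(nullptr);

    int n, q;
    std::cin >> n >> q;

    std::vector<std::string> raw(n);
    std::vector<std::bitset<2048>> strs(n);
    for (int i = 0; i < n; ++i) {
        std::cin >> raw[i];
        strs[i] = std::bitset<2048>(raw[i]);
    }

    while (q--) {
        std::string bs, cs;
        std::cin >> bs >> cs;
        std::bitset<2048> b(bs), c(cs);
        std::size_t denom = c.count();  // assumed nonzero, since f divides by it

        int best = 0;
        std::size_t bestOnes = ((strs[0] ^ b) & c).count();
        for (int i = 1; i < n; ++i) {
            // count positions where A and B differ and C has a 1
            std::size_t ones = ((strs[i] ^ b) & c).count();
            if (ones < bestOnes) {
                bestOnes = ones;
                best = i;
            }
        }
        std::cout << raw[best] << ' '
                  << static_cast<double>(bestOnes) / denom << '\n';
    }
    return 0;
}
```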
Eagerly waiting for your replies. Thanks in advance.
I think this can be done using a trie. Build a trie from the initial N strings and search for a string A that matches B as closely as possible (rough sketch below). Once you have the string A, the rest is easy.
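To make the suggestion concrete, here is a rough sketch of that greedy descent (my own illustration; Trie, insert, and greedyLookup are names I made up). Note that this version ignores C entirely, and with N = 10^4 strings of length 2048 the trie can grow to tens of millions of nodes. More importantly, as the reply below explains, the greedy branch choice is not guaranteed to find the optimal A.

```cpp
#include <string>
#include <vector>

// Binary trie over equal-length bit strings; each leaf stores the
// index of one inserted string.
struct Trie {
    std::vector<int> child0, child1, leaf;

    Trie() { newNode(); }  // node 0 is the root

    int newNode() {
        child0.push_back(-1);
        child1.push_back(-1);
        leaf.push_back(-1);
        return (int)child0.size() - 1;
    }

    void insert(const std::string& s, int idx) {
        int v = 0;
        for (char ch : s) {
            std::vector<int>& nxt = (ch == '0') ? child0 : child1;
            if (nxt[v] == -1) {
                int u = newNode();  // may reallocate, so assign afterwards
                nxt[v] = u;
            }
            v = nxt[v];
        }
        leaf[v] = idx;
    }

    // Greedy walk: take the branch matching B's bit when it exists,
    // otherwise the only available branch.
    int greedyLookup(const std::string& b) const {
        int v = 0;
        for (char ch : b) {
            int match = (ch == '0') ? child0[v] : child1[v];
            int other = (ch == '0') ? child1[v] : child0[v];
            v = (match != -1) ? match : other;
        }
        return leaf[v];  // index of the string found (not necessarily optimal)
    }
};
```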
I don't think this works: a greedy descent can be suboptimal, because allowing a mismatch at an early node (even the root) may lead to fewer total mismatches further down. For example, with stored strings {100, 011} and B = 111, greedy matches the first bit and ends at 100 (two mismatches), while taking a mismatch at the root reaches 011 (only one mismatch).
Yes, you are right. Sorry about that. :)